Computer networking is all around us. Since the days of dial-up internet, internet-connected devices have grown rapidly on a global scale. Broadly speaking, there are three types of computer network across the globe: LAN, MAN and WAN. The above diagram does a good job of representing the geographical scope of these networks, but it is worth elaborating a bit further. LANs are usually used in office locations and connected by physical cables and local wifi. MANs are typically interconnected LANs joined by a private network link, with a connection to the internet; their use cases are mostly reserved for governmental or state institutions, along with huge corporates - I'd imagine Alphabet Inc in Silicon Valley might use one, for example. A WAN connects devices across a large geographic area, spanning multiple cities or countries, and is used to link LANs together. You could argue that the biggest WAN in the world is the internet itself. Some of the fundamen...
I'm beyond lucky to have been a successful applicant onto the Amazon re/Start programme. This week on our intensive bootcamp we've been focusing on manoeuvring around the command line interface on Linux. Having been computer-active before YouTube (the delightful days of dial-up internet!), this wasn't all too unfamiliar to me, for three specific reasons. Internet Relay Chat, or as the acronym goes - IRC. Before the endless content on YouTube, if you wanted a copy of a music video you either had to rely on the artist putting a movie file on a CD single, or on a fan club setting up a server/channel on IRC. To get a music video on IRC, the whizz-kids of the era set up a system near-identical to a Linux CLI. So there I was at 13, deploying my dir and get commands to get my own personal slice of MTV heaven! Raspberry Pi Inspired by the super cool RetroFlag SNES case for the Raspberry Pi and all the amazing bespoke retro arcades I saw people posting online, I went ahead ...
Security is the practice of protecting valuable assets. When we talk about information security, cyber security is actually just a subset of this discipline. Cybersecurity is focused on protecting digital devices and assets - networks, systems and digital information. It concentrates on stopping unauthorised access and malicious actions like theft, destruction or alteration, or simply the disruption of a service. One of the most popular models for thinking about information security is the CIA triad: Confidentiality, Integrity and Availability. Threats can come in many different forms. The tables below provide a variety of examples and, most interestingly, also explore which part of the triad is impacted. So, working with the security lifecycle model, we can determine what actions we can take at each stage to deliver a robust security infrastructure within our organisation. Prevention Identify assets - what devices are you using? Are they contemporary? Can you flash newer firmware on them,...
Operating a computer with BASH is the reality end-users faced prior to Windows' graphical user interface. In the end, however, what we're actually doing is much the same. By deploying actions via a CLI, we gain agency over what we can schedule and automate in the environment. Ensuring we're starting off in the right directory is key; for that we use pwd. This outputs our current directory, which tells us where we're currently sitting in the system. A basic rule of command syntax is that it is delivered in the format below: command -> option -> argument e.g. sudo usermod -a -G Sales arosalez sudo usermod -> -a -G -> Sales arosalez The most basic commands revolve around providing quick user or system data. I won't elaborate on most of these, as they are either ubiquitous or pretty self-evident: ls, cd, whoami, id, hostname, uptime, date, cal, clear, echo, history, touch, cat. Don't forget to use the tab autocomplete function...
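As a quick, runnable sketch of the command -> option -> argument pattern, using only the ubiquitous commands listed above (the usermod example needs root privileges and a real user, so it's left out here):

```shell
# Where am I? pwd prints the current working directory.
pwd

# command -> option -> argument: ls with the -l option and /tmp as the argument
ls -l /tmp

# Quick user and system data
whoami
hostname

# echo simply prints its arguments back
echo "command line basics"
```

Running this top to bottom in any Linux shell gives an instant feel for how options modify a command and arguments tell it what to act on.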
Due to having an industry connection, I had the opportunity to do something really fun this weekend. The application running on AWS is a CRUD API - however, Datadog, a cloud analytics and monitoring agent, has been installed to provide feedback on what's actually happening. I found this beyond awesome; the scope of this product is really impressive. On the left side you can see a command line running on my Linux VM. On the right you can see live monitoring of the AWS machine, tracking the inputs received by the API application. As I send commands into the CRUD API application, Datadog continues monitoring and updating our dashboard. If you imagine the deployment of a much busier service, the possibilities of this kind of real-time monitoring aren't to be underestimated. Above you can see the CPU usage graphs, but unlike the ones you find in the AWS console, Datadog manages to also show us the split of the CPU usage. Even in th...
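For illustration, the kind of requests I was firing at the API looked roughly like this - note the hostname and routes here are hypothetical stand-ins, not the actual lab endpoint:

```shell
# Hypothetical CRUD calls -- each one surfaces as activity on the Datadog dashboard
curl -X POST   http://example-api.local/items -d '{"name": "widget"}' \
     -H "Content-Type: application/json"          # Create
curl -X GET    http://example-api.local/items/1   # Read
curl -X PUT    http://example-api.local/items/1 -d '{"name": "gadget"}' \
     -H "Content-Type: application/json"          # Update
curl -X DELETE http://example-api.local/items/1   # Delete
```

Each of the four HTTP verbs maps onto one of the Create, Read, Update and Delete operations, which is exactly the traffic the monitoring dashboard was picking up.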
Thanks to completing an IT GNVQ during my secondary school days, databases aren't all that new to me; I already had experience querying a relational database in Microsoft Access. But for the purpose of this post, I'll revisit some core principles. The first thing to cover is the types of database, which I'll use an infographic to explain. What we'll be looking at in this post is the relational model. This works by having a series of tables linked by primary and foreign keys. Each primary key has to be completely unique. When we update our database, we term this a transaction, meaning one or more changes are performed on the database. To commit a change we need to ensure transactions follow the four standard principles of atomicity, consistency, isolation, and durability. Before we go into querying our relational database, let's get into the NULL value. When searching we can use IS NULL and IS NOT NULL, but beyond this, there are several useful things to know about this v...
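To make the key and NULL ideas concrete, here's a minimal SQL sketch - the table and column names are invented for illustration:

```sql
-- A primary key must be unique; a foreign key in another table references it
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);

CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    shipped_on  DATE              -- stays NULL until the order ships
);

-- NULL means "unknown", so we test for it with IS NULL / IS NOT NULL, never with =
SELECT order_id FROM orders WHERE shipped_on IS NULL;      -- not yet shipped
SELECT order_id FROM orders WHERE shipped_on IS NOT NULL;  -- already shipped
```

The foreign key in orders links each order back to exactly one row in customers, which is the "series of tables linked by keys" idea in miniature.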
Having signed up to AWS Skill Builder, I thought it was something I'd be looking at post-programme. However, when the above builder labs email landed, I thought: what better way to spend my Sunday afternoon? The fundamentals lab was working with the AWS service CodeCommit. Having gone through most of gitimmersion, this lab felt very straightforward. It involved creating a local and a "remote" (i.e. CodeCommit) git repository and synchronising them together. The intermediate lab proved a little more tricky. First of all was comprehending that we were going to build a CI/CD pipeline - i.e. manifesting the principles of DevOps. Below I've posted a full-scale CI/CD pipeline to demonstrate the complexity. However, in this lab we're only using CodeCommit, CodeBuild, CodeDeploy, AWS ECR and ECS. This lab involved the most complex technology stack I've come across thus far, so I really appreciated doing something a little more challenging. However, it did involve so...
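The local-to-remote synchronisation in the fundamentals lab boils down to ordinary git plumbing. Here's a self-contained sketch that uses a local bare repository as a stand-in for the CodeCommit remote - the paths, identity and commit message are all invented for the demo:

```shell
set -e
tmp=$(mktemp -d)

# A bare repository plays the role of the CodeCommit "remote"
git init --bare "$tmp/remote.git"

# Create the local repository, commit a file, and wire up the remote
git init "$tmp/local"
cd "$tmp/local"
git config user.email "demo@example.com"
git config user.name  "Demo User"
echo "hello re/Start" > README.md
git add README.md
git commit -m "initial commit"
git remote add origin "$tmp/remote.git"

# Push whatever the current branch is (works whether the default is main or master)
git push origin HEAD

# The remote now knows about our branch
git ls-remote origin
```

With a real CodeCommit repo the only difference is that origin points at an HTTPS or SSH CodeCommit URL instead of a local path.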
Today we had a recent AWS re/Start graduate join us for a short presentation and Q&A about their experience, both of the programme itself and of their post-programme period before starting a role as a Cloud Engineer. It was great to hear from someone who, less than 12 months ago, had undergone the exact same programme we're a third of the way through now. We got an insight into their day-to-day (lots of meetings!), how long it took until they were allowed to pick up tickets independently (two months) and how much their employer supported their development (lots!). By chance, they were scheduled to sit the AWS Solutions Architect certification tomorrow! I found it incredibly motivating and centring to hear their story. I made sure to ask a question myself, which revolved around finding potential roles and then knowing the essential skills we might need to enhance our employability. The advice provided was: Make sure to continue studying Networking. This ...
To aid in moving more businesses into the cloud, AWS have developed the Cloud Adoption Framework (CAF). There are also competitors' models, but in the broadest sense they tend to follow the same principles and categories. I'll start by showing the broad categories, then the Azure model and the AWS model. Now that we've familiarised ourselves with the broad categories, we can see the two major cloud providers' high-level frameworks presented. The second infographic of the AWS CAF begins to show more detail, exposing us to the genuine complexity behind the considerations required during the CAF process. If a business is adopting the cloud, it also makes sense to have a structure for thinking about how the future architecture will be designed and whether it matches cloud best practice - in other words, AWS's Well-Architected Framework (WAF). Thankfully, as in many of my posts, there are infographics that excellently and succinctly summarise these ideas for us. It is very likely th...
For our very last lab about networking, we were given the following customer environment to build. Because the AWS Management Console GUI is constantly updated, our task instructions were wholly out of date. However, this actually made it a far more involved and fun process - necessitating lots of troubleshooting when things hadn't spun up as expected. I felt like I learnt a lot more this way, and for the benefit of my future self I thought I would make some step-by-step notes (not necessarily in reference to the diagram above) to serve as prompts for any future cloud VPCs I'll be spinning up. Our challenge was to build the VPC in the diagram above. Create an elastic IP; we will associate this later with our NAT gateway. Launch your VPC with a private IP range, without forgetting to specify how many availability zones (AZs) you want. Create and label your subnets, again specifying the AZ, as none are public at the ...
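The same early steps can also be driven from the AWS CLI rather than the console. This is only a sketch - it assumes configured AWS credentials, the CIDR ranges, region and Name tag are illustrative, and the angle-bracket IDs come from the output of the earlier commands:

```shell
# Create the VPC with a private (RFC 1918) CIDR range
aws ec2 create-vpc --cidr-block 10.0.0.0/16 \
    --tag-specifications 'ResourceType=vpc,Tags=[{Key=Name,Value=lab-vpc}]'

# Allocate an Elastic IP to associate with the NAT gateway later
aws ec2 allocate-address --domain vpc

# Create a subnet, specifying the availability zone (repeat per AZ)
aws ec2 create-subnet --vpc-id <vpc-id> --cidr-block 10.0.1.0/24 \
    --availability-zone eu-west-2a

# The NAT gateway lives in a public subnet and uses the Elastic IP allocation
aws ec2 create-nat-gateway --subnet-id <public-subnet-id> \
    --allocation-id <eip-allocation-id>
```

One advantage of scripting it this way is that, unlike the constantly changing console GUI, the CLI commands stay stable, so notes like these don't go out of date as quickly.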