
Jordan Tingling

Cloud Resume Challenge - My Journey

Who Am I?

I'm a cloud enthusiast (AWS primarily) and a student at the University of London (BSc Computer Science). I have experience in front-end and back-end engineering; however, this challenge was a step in a new direction that let me get my hands really cloudy with AWS.

My journey through learning AWS was surely an impulsive one. It began with A Cloud Guru (ACG), which I pay massive respect to for starting my journey into the cloud. I stumbled upon the Cloud Resume Challenge (through YouTube/LinkedIn/Reddit) by Forrest Brazeal (Cloud Architect, AWS Serverless Hero). You can find more information about the challenge here.

I am definitely late to the party (the last date set by Forrest Brazeal for this challenge was 31-July-2020) due to my preparation for AWS certifications; however, I still wanted to give it a try. I would also like to thank Alex Eversmeyer, as I referred to his blog and GitHub repos whenever I needed assistance.

Let's Get To It!

AWS Certification

Being new to the world of AWS and cloud computing, I achieved the AWS Certified Cloud Practitioner certification in September 2021. I then went on to achieve the AWS Certified Solutions Architect Associate (November 2021) and the AWS Certified Developer Associate (December 2021); currently, I am preparing for the AWS Certified Solutions Architect Professional. Massive shoutout to Adrian Cantrill for his in-depth, hands-on courses and labs. ACG gave me my introduction to the cloud, but Adrian pushed me even further into understanding AWS technologies at a deeper level.

Front-End (HTML/CSS/JS)

I tried to keep the front end as minimalistic as possible. I have front-end experience; however, I wanted to focus on the technologies that make up the bulk of this challenge rather than tinkering with CSS all day long. The HTML page links to a JS file containing the JavaScript snippet that updates and fetches the visitor count from the "back end" (DynamoDB, Lambda, API Gateway).
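As a sketch, that snippet can be as small as the following (the API URL and element id here are placeholders, not my actual values):

```javascript
// Placeholder endpoint -- the real URL comes from the API Gateway stage.
const API_URL = "https://example.execute-api.us-east-1.amazonaws.com/Prod/count";

// Pull the numeric count out of the API's JSON body.
function extractCount(body) {
  return Number(body.count);
}

// Runs on every page load: the GET itself increments the counter
// server-side, and the returned value is rendered into the page.
async function updateVisitorCount() {
  const response = await fetch(API_URL);
  const body = await response.json();
  document.getElementById("visitor-count").textContent = extractCount(body);
}
```

The element with id `visitor-count` is whatever placeholder the HTML reserves for the number; the fetch happens once per visit/refresh.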

Static S3 Website

Based on the knowledge I acquired while preparing for the AWS certifications, hosting a static website on S3 and using CloudFront for content distribution was easy (CloudFront even provides you with an automatic default SSL certificate). I purchased a domain name through AWS Route 53, set up my hosted zone, and configured it to use the CloudFront distribution. I also made use of AWS Certificate Manager (ACM) to procure my own SSL certificate for the site.

Here is a brief overview of the architecture. Thanks to Luis Nuñez for the diagram.

*(Architecture diagram)*

Back-End

*(Back-end architecture diagram)*

The back-end infrastructure and logic update and retrieve the "visitor count" from a database table (a DynamoDB table) so that the front end can display the count on the website. The back end involves AWS resources such as API Gateway, Lambda, and DynamoDB.

  • DynamoDB: DynamoDB is AWS's fast, flexible NoSQL database service for any scale. I created a simple DynamoDB table with one item to store and update the visitor count. A snippet of Python code comes in handy here: the numeric value increments each time the DynamoDB table is called. This operation is implemented in the Lambda function (more details below).

  • Lambda: AWS Lambda is a compute service that lets you run code without provisioning or managing servers. I created a Python-based Lambda function, which queries the DynamoDB table and updates the visitor-count item. An API call made to API Gateway triggers the function to increment the numeric value (adding to the visitor count).

  • API Gateway: Amazon API Gateway provides the option to create and manage APIs to back-end systems running on Amazon EC2, AWS Lambda, or any publicly addressable web service. In my case, API Gateway exposes a REST API endpoint, which is called by a JavaScript snippet embedded in the front-end HTML page on every page visit/refresh, to update and fetch the visitor count from the DynamoDB table through the Lambda function. Enabling CORS (Cross-Origin Resource Sharing) on the API Gateway resource is mandatory for the front end to receive the response.
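Putting the three pieces together, a minimal sketch of such a handler could look like the following (the table, key, and attribute names here are illustrative, not necessarily the ones I used):

```python
import json

# Illustrative table name -- the real one comes from the SAM template.
TABLE_NAME = "visitor-count-table"


def build_response(count):
    """Shape the API Gateway proxy response. The CORS header is required
    because the page is served from a different origin (CloudFront)."""
    return {
        "statusCode": 200,
        "headers": {
            "Access-Control-Allow-Origin": "*",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"count": count}),
    }


def lambda_handler(event, context):
    # boto3 ships with the Lambda Python runtime; it is imported here so
    # the response helper above stays testable without it.
    import boto3

    table = boto3.resource("dynamodb").Table(TABLE_NAME)
    # An ADD update expression increments atomically (creating the
    # attribute at 0 on first call), so one request both updates and
    # returns the new count.
    result = table.update_item(
        Key={"id": "visitors"},
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    new_count = int(result["Attributes"]["visit_count"])
    return build_response(new_count)
```

Using `ADD` with `ReturnValues="UPDATED_NEW"` avoids a separate read-then-write, which would race under concurrent visits.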

Summary

The concept is quite simple: each time the website is requested, an API call is made to API Gateway, which in turn maps to a Lambda function. This Lambda function does nothing more than increment a field in a DynamoDB table. The new counter value is returned and displayed on the web page.

Back-End (Infrastructure as Code)

Even though the backend is relatively simple and can be assembled in a few minutes in the AWS Console, it's still not the most convenient way, especially if you need to update or delete something. Fortunately, AWS has our back and offers the ability to provision our cloud infrastructure in a number of ways. In this case, we use AWS Serverless Application Model (SAM), an Infrastructure as Code (IaC) solution that allows us to define and build our Serverless application using a single YAML template. Among other things, this allows us to keep and update our infrastructure together with our codebase and, most importantly, to create an application infrastructure that can be reproduced over and over again.
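For a sense of scale, a trimmed-down SAM template for this kind of stack might look like the following (resource names, paths, and the runtime version are illustrative):

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  VisitorCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Handler: app.lambda_handler
      Runtime: python3.9
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref VisitorCountTable
      Events:
        CountApi:
          Type: Api
          Properties:
            Path: /count
            Method: get

  VisitorCountTable:
    Type: AWS::Serverless::SimpleTable
    Properties:
      PrimaryKey:
        Name: id
        Type: String
```

One `sam build && sam deploy` then provisions the function, table, API, and the IAM permissions between them, reproducibly.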

Here is a brief overview of the architecture. Thanks to AWS Community Builders for the diagram.

My architecture is slightly different in this case; however, this is a general overview of a similar architecture.

*(Architecture diagram)*

Front-End - CI/CD

Even though AWS SAM takes some of the work off my shoulders, updating the website still means some manual effort: AWS SAM has to be installed locally, then the application has to be built and then deployed. All this costs time and should therefore be automated. For front-end CI/CD, I used GitHub Actions to configure AWS credentials, deploy changes to the S3 bucket (which stores the HTML/CSS/JS/image content), and then invalidate the CloudFront distribution. This deployment workflow is configured to execute on each push to the master branch, updating the entire application stack without any additional intervention from me.
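A minimal sketch of such a workflow (bucket name, region, folder, and secret names are placeholders):

```yaml
name: deploy-frontend
on:
  push:
    branches: [master]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Credentials come from repository secrets, never from the repo itself.
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      # Push the static site, then purge the CDN cache so changes show up.
      - run: aws s3 sync ./site s3://my-resume-bucket --delete
      - run: >
          aws cloudfront create-invalidation
          --distribution-id ${{ secrets.CLOUDFRONT_DIST_ID }}
          --paths "/*"
```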

Back-End - CI/CD

A separate GitHub repo was created to store the back-end code, which includes the Lambda function, the SAM template, and Python tests. A corresponding GitHub Action is configured to set up AWS credentials, run the Python tests, and execute the `sam build` and `sam deploy` commands. The AWS credentials (access keys) are stored securely as secrets defined in the GitHub repository and retrieved by the workflow. After that, the same steps are performed that would otherwise be executed locally on the developer's PC: the tests run, SAM builds the application, and the stack is deployed, so pushed changes take effect without any manual steps.
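As an illustration, a unit test along these lines can exercise the counter logic against a fake table, so the CI run touches no AWS resources (names are illustrative, not my exact code):

```python
class FakeTable:
    """Stands in for boto3's DynamoDB Table resource in tests."""

    def __init__(self):
        self.count = 0

    def update_item(self, **kwargs):
        # Mimic DynamoDB's ADD semantics: increment and return the new value.
        self.count += kwargs["ExpressionAttributeValues"][":inc"]
        return {"Attributes": {"visit_count": self.count}}


def increment_visitors(table):
    """Core logic under test: bump the counter, return the new value."""
    result = table.update_item(
        Key={"id": "visitors"},
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    return int(result["Attributes"]["visit_count"])


def test_counter_increments():
    table = FakeTable()
    assert increment_visitors(table) == 1
    assert increment_visitors(table) == 2
```

Injecting the table as a parameter is what makes the function testable; in the real Lambda, the boto3 table resource is passed (or resolved) instead.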

Conclusion


🏅 That's it! The challenge was great: it covers many AWS services and how to use them together, along with programming languages (JS/Python). Overall, it was a great experience, challenging and frustrating at times (especially with AWS SAM), but it was worth it.

Big shoutout to: AWS Community Builders for their extremely helpful resources.

Here is the final version of my resume:
Jordan_Tingling_Resume
