<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://ericnbello.github.io/feed.xml" rel="self" type="application/atom+xml" /><link href="https://ericnbello.github.io/" rel="alternate" type="text/html" /><updated>2026-04-14T22:32:45+00:00</updated><id>https://ericnbello.github.io/feed.xml</id><title type="html">Eric Bello</title><subtitle>IT and infrastructure professional with a background in computer engineering. I build practical solutions and keep things running.</subtitle><author><name>Eric Bello</name></author><entry><title type="html">Cloud Resume Challenge: Deploying a Serverless Website on AWS</title><link href="https://ericnbello.github.io/cloud/cloud-resume-challenge/" rel="alternate" type="text/html" title="Cloud Resume Challenge: Deploying a Serverless Website on AWS" /><published>2026-04-14T00:00:00+00:00</published><updated>2026-04-14T00:00:00+00:00</updated><id>https://ericnbello.github.io/cloud/cloud-resume-challenge</id><content type="html" xml:base="https://ericnbello.github.io/cloud/cloud-resume-challenge/"><![CDATA[<blockquote>
  <p>Applying AWS knowledge to deploy a serverless website using S3, CloudFront, DynamoDB, Lambda, API Gateway, SAM CLI, and GitHub Actions.</p>
</blockquote>

<hr />

<p>After passing the AWS Certified Cloud Practitioner exam, I was ready to put the knowledge I had gained into practice. Researching cloud projects online led me to the Cloud Resume Challenge, which spans a wide range of tasks across various AWS services. I began by customizing my portfolio site (built with Next.js and Tailwind) and deploying it without relying on “easier” deployment platforms like Netlify or Heroku. The following steps outline how I integrated popular AWS services and completed the challenge.</p>

<hr />

<h2 id="table-of-contents">Table of Contents</h2>

<ol>
  <li><a href="#1-deploy-online-as-an-amazon-s3-static-website">Deploy online as an Amazon S3 static website</a></li>
  <li><a href="#2-aws-cloudfront-for-https">AWS CloudFront for HTTPS</a></li>
  <li><a href="#3-dns-and-using-custom-domain">DNS and using custom domain</a></li>
  <li><a href="#4-visitor-counter---javascript">Visitor Counter - JavaScript</a></li>
  <li><a href="#5-dynamodb-database-to-store-count-value">DynamoDB Database to Store Count Value</a></li>
  <li><a href="#6-python-lambda-function-and-api-gateway">Python Lambda function and API Gateway</a></li>
  <li><a href="#7-infrastructure-as-code-iac---cloudformation-with-sam-cli">Infrastructure as Code (IaC) - CloudFormation with SAM CLI</a></li>
  <li><a href="#8-source-control-with-git">Source Control with Git</a></li>
  <li><a href="#9-cicd-with-github-actions-for-front-and-back-end">CI/CD with GitHub Actions for Front and Back End</a></li>
  <li><a href="#final-thoughts">Final Thoughts</a></li>
</ol>

<hr />

<h2 id="1-deploy-online-as-an-amazon-s3-static-website">1. Deploy online as an Amazon S3 static website</h2>

<p>The S3 service makes it easy to deploy and host a static site (like this one, built with Next.js and React): a publicly readable “bucket” holds the site’s build output, and S3 serves it over a website endpoint. First, I created a publicly accessible S3 bucket containing the HTML, CSS, JavaScript, and image assets the site needs, enabled static website hosting, and noted the website endpoint for my default region (us-east-1).</p>
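<p>For reference, making the bucket publicly readable comes down to a bucket policy along these lines (the bucket name below is a placeholder; substitute your own):</p>

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-resume-site-bucket/*"
    }
  ]
}
```

<p>With the bucket’s “Block all public access” setting turned off and this policy attached, every object in the bucket becomes readable through the website endpoint.</p>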

<h2 id="2-aws-cloudfront-for-https">2. AWS CloudFront for HTTPS</h2>

<p>Since S3 website endpoints serve traffic over plain HTTP only, enabling HTTPS for the site means putting the CloudFront CDN service in front of the bucket. The two services integrate seamlessly: I set my S3 website endpoint as the origin of a CloudFront distribution, which gave me a secure HTTPS domain name.</p>

<h2 id="3-dns-and-using-custom-domain">3. DNS and using custom domain</h2>

<p>Although the domain given by CloudFront came with a certificate, I wanted to access the site through my own custom domain (ericnbello.com). While AWS offers its own DNS service, Route 53, I was already well versed in editing DNS records for other domains, so I stayed with the provider I had originally registered this one with. I edited the domain’s DNS entries to point to the CloudFront distribution, added the validation records to verify ownership, and received a custom SSL certificate (issued through AWS Certificate Manager) for the distribution. With that, my site was up and running.</p>

<h2 id="4-visitor-counter---javascript">4. Visitor Counter - JavaScript</h2>

<p>Being built in Next.js, the website already had plenty of JavaScript. So my approach here was to create a simple <code class="language-plaintext highlighter-rouge">VisitorCounter</code> component and render it in the footer of each page. I didn’t get fancy — a slim border to create a box around the view count was sufficient.</p>
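<p>As a rough sketch of what the component’s data fetching might look like (the endpoint URL is a placeholder, and the real component wraps this in React state), the counter boils down to a single call to the API:</p>

```javascript
// Placeholder URL -- replace with your own API Gateway invoke URL.
const API_URL = "https://example.execute-api.us-east-1.amazonaws.com/Prod/count";

// POST so the back end increments the stored count and returns the new value.
async function fetchVisitorCount(fetchImpl = fetch) {
  const res = await fetchImpl(API_URL, { method: "POST" });
  if (!res.ok) {
    throw new Error(`Counter API returned ${res.status}`);
  }
  const data = await res.json();
  return data.count;
}
```

<p>Accepting the fetch implementation as a parameter keeps the function easy to exercise locally without a live API behind it.</p>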

<h2 id="5-dynamodb-database-to-store-count-value">5. DynamoDB Database to Store Count Value</h2>

<p>The visitor counter needed a database to store and update the count. Amazon’s DynamoDB, a NoSQL database service, suited this requirement perfectly. Being on the AWS Free Tier, I opted for on-demand pricing to keep the cost at zero now while also ensuring minimal ongoing costs once the initial free-tier year ends. I created a table with a partition key and a <code class="language-plaintext highlighter-rouge">Quantity</code> attribute, initialized to <code class="language-plaintext highlighter-rouge">0</code> and incremented with each visit.</p>
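<p>Expressed as a CloudFormation resource (the table and key names here are illustrative, not my exact ones), the table definition is small; <code class="language-plaintext highlighter-rouge">PAY_PER_REQUEST</code> is the on-demand billing mode:</p>

```yaml
VisitorCountTable:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: visitor-count
    BillingMode: PAY_PER_REQUEST   # on-demand: pay per read/write, no provisioned capacity
    AttributeDefinitions:
      - AttributeName: id
        AttributeType: S           # string partition key
    KeySchema:
      - AttributeName: id
        KeyType: HASH
```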

<h2 id="6-python-lambda-function-and-api-gateway">6. Python Lambda function and API Gateway</h2>

<p>Next up was getting the component on the site to communicate with the DynamoDB table. Amazon’s API Gateway and Lambda services work hand in hand here: API Gateway exposes an HTTP endpoint, and Lambda runs the code behind it. Since I enjoy the simplicity of Python syntax, I wrote the Lambda function in Python with the <code class="language-plaintext highlighter-rouge">boto3</code> library, relying heavily on the AWS documentation for reading and updating DynamoDB table values. Once the function was complete, I had a working API in API Gateway and, after resolving some CORS issues, could handle requests from the web app.</p>
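<p>A minimal sketch of such a handler, assuming a table named <code class="language-plaintext highlighter-rouge">visitor-count</code> with a string partition key <code class="language-plaintext highlighter-rouge">id</code> (both placeholders for your own names):</p>

```python
import json

# Placeholder name -- adjust to match your own stack.
TABLE_NAME = "visitor-count"


def lambda_handler(event, context, table=None):
    """Atomically increment the visitor count and return the new value.

    Lambda invokes this as handler(event, context); the extra `table`
    parameter exists only so a fake table can be injected in local tests.
    """
    if table is None:
        # Deferred import: boto3 ships with the Lambda Python runtime.
        import boto3

        table = boto3.resource("dynamodb").Table(TABLE_NAME)

    resp = table.update_item(
        Key={"id": "visitors"},
        # ADD performs an atomic server-side increment, so concurrent
        # visits never lose updates.
        UpdateExpression="ADD Quantity :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(resp["Attributes"]["Quantity"])  # DynamoDB returns a Decimal
    return {
        "statusCode": 200,
        # CORS header so the browser accepts the response on the site's origin.
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```

<p>Using a single <code class="language-plaintext highlighter-rouge">UpdateExpression</code> with <code class="language-plaintext highlighter-rouge">ADD</code> avoids a separate read-then-write round trip and the race condition that comes with it.</p>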

<h2 id="7-infrastructure-as-code-iac---cloudformation-with-sam-cli">7. Infrastructure as Code (IaC) - CloudFormation with SAM CLI</h2>

<p>In order to avoid manual configuration within the AWS console (which is easy to get wrong), I defined the necessary resources, including the DynamoDB table, API Gateway, and Lambda function, in an AWS Serverless Application Model (SAM) template. SAM builds on CloudFormation to provision the resources, which I really enjoyed learning about in depth, and the AWS SAM CLI made deployment from inside my VSCode terminal painless.</p>
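<p>A trimmed-down sketch of what such a template can look like (resource names, paths, and runtime version are illustrative):</p>

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31   # marks this as a SAM template

Resources:
  VisitorCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.12
      CodeUri: backend/
      Policies:
        # SAM policy template: scoped read/write access to one table
        - DynamoDBCrudPolicy:
            TableName: visitor-count
      Events:
        CountApi:
          Type: Api          # implicit API Gateway REST API
          Properties:
            Path: /count
            Method: post
```

<p>Running <code class="language-plaintext highlighter-rouge">sam build</code> followed by <code class="language-plaintext highlighter-rouge">sam deploy</code> packages the function code and pushes the whole stack through CloudFormation.</p>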

<h2 id="8-source-control-with-git">8. Source Control with Git</h2>

<p>To track changes and lay the groundwork for automated deployments, I put both the back-end API and the front-end website under version control in a GitHub repository.</p>

<h2 id="9-cicd-with-github-actions-for-front-and-back-end">9. CI/CD with GitHub Actions for Front and Back End</h2>

<p>To maintain a consistent deployment process, I used GitHub Actions to set up continuous integration and deployment (CI/CD) for both the front-end and back-end code. Whenever I pushed updates to the website code, SAM template, or Python code, GitHub Actions automatically ran, syncing the S3 bucket and deploying the SAM application to AWS. I also invalidated the CloudFront cache after each deployment so visitors received the latest files rather than stale cached copies. Following best practice, I kept AWS credentials out of source control.</p>
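<p>The front-end half of that pipeline can be sketched as a workflow along these lines (bucket name, build output path, and secret names are placeholders, and the Next.js build step is elided):</p>

```yaml
# .github/workflows/deploy-frontend.yml
name: Deploy front end
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      AWS_DEFAULT_REGION: us-east-1
    steps:
      - uses: actions/checkout@v4
      # (build the static export here, e.g. npm ci && npm run build)
      - name: Sync built site to S3
        run: aws s3 sync ./out s3://my-resume-site-bucket --delete
      - name: Invalidate CloudFront cache
        run: >
          aws cloudfront create-invalidation
          --distribution-id ${{ secrets.CLOUDFRONT_DISTRIBUTION_ID }}
          --paths "/*"
```

<p>The credentials live in the repository’s encrypted Actions secrets, so nothing sensitive ever touches the codebase.</p>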

<hr />

<h2 id="final-thoughts">Final Thoughts</h2>

<p>Completing the Cloud Resume Challenge was an enriching experience that not only demonstrated my skills in AWS but also expanded my abilities in web development, automation, and infrastructure as code. By navigating through certifications, AWS services, JavaScript implementation, database integration, and CI/CD pipelines, I gained a comprehensive understanding of cloud engineering and DevOps practices. This project served as a testament to my ability to deliver professional-grade solutions while leveraging cutting-edge technologies. I’m excited to apply these skills in future roles as a developer, cloud engineer, or DevOps professional.</p>]]></content><author><name>Eric Bello</name></author><category term="cloud" /><category term="aws" /><category term="s3" /><category term="cloudfront" /><category term="dynamodb" /><category term="lambda" /><category term="api gateway" /><category term="sam cli" /><category term="github actions" /><category term="ci/cd" /><summary type="html"><![CDATA[Applying AWS knowledge to deploy a serverless website using S3, CloudFront, DynamoDB, Lambda, API Gateway, SAM CLI, and GitHub Actions.]]></summary></entry></feed>