A Taste of DevOps

A workflow is triggered when a developer pushes code to the main branch. The workflow has three jobs defined:

  • Provision the AWS infrastructure based on the terraform directory code changes.
  • Build the blog using the static website generator Hugo.
  • Update the S3 bucket with the new contents and invalidate the CloudFront cache.
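
At the top level, the three jobs form a simple dependency chain. A sketch of the workflow skeleton (the file name and trigger are assumptions based on the description; the full job bodies follow below):

```yaml
# Hypothetical .github/workflows/release.yml skeleton
name: Release

on:
  push:
    branches: [main]              # run on every push to main

jobs:
  Infra_job:                      # 1. provision AWS infrastructure
    runs-on: ubuntu-latest
    steps: []                     # placeholder; real steps shown later
  build_job:                      # 2. build the blog with Hugo
    needs: [Infra_job]
    runs-on: ubuntu-latest
    steps: []
  deploy_job:                     # 3. sync S3 and invalidate CloudFront
    needs: [build_job, Infra_job]
    runs-on: ubuntu-latest
    steps: []
```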

[Figure: Workflow overview]

The complete workflow file can be accessed on the release website.


Workflow Jobs

The Infra_job checks out the current repository and executes the Terraform workflow in HCP Terraform. The HashiCorp Setup Terraform action installs the Terraform CLI on a GitHub-hosted runner. A few values are written to GITHUB_OUTPUT for use in subsequent jobs.
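
The step-output mechanism itself is just a file append: on a runner, GitHub sets the GITHUB_OUTPUT environment variable to a file path, and each key=value line written to it becomes a step output. A minimal local simulation (the temp file stands in for the runner-provided path, and the values are made up):

```shell
# Simulate the runner-provided $GITHUB_OUTPUT file with a temp file.
GITHUB_OUTPUT="$(mktemp)"

# Each appended key=value line becomes ${{ steps.<id>.outputs.<key> }}.
echo "s3=my-blog-bucket" >> "$GITHUB_OUTPUT"
echo "cfid=E1ABCDEF" >> "$GITHUB_OUTPUT"

cat "$GITHUB_OUTPUT"
```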

jobs:
    Infra_job:
        name: AWS Infrastructure Provisioning with Terraform
        runs-on: ubuntu-latest
        defaults:
          run:
            working-directory: terraform

        # s3_bucket name and cloudfront distribution ids are stored
        # as outputs and will be passed to "deploy_job"

        outputs:
          s3_bucket: ${{ steps.tf_out.outputs.s3 }}
          cf_id: ${{ steps.tf_out.outputs.cfid }}

        steps:
            - name: Checkout Repository
              uses: actions/checkout@v4

            - name: Terraform Workflow
              uses: hashicorp/setup-terraform@v3
              with:
                terraform_version: "1.9.7"
                cli_config_credentials_token: ${{ secrets.TF_API_TOKEN }}     # API token for HCP Terraform

            - name: Terraform Init
              id: init
              run: terraform init

            - name: Terraform Validate
              id: validate
              run: terraform validate -no-color

            - name: Terraform Plan
              id: plan
              run: terraform plan -no-color
              continue-on-error: true

            - name: Terraform Apply
              id: apply
              run: terraform apply -auto-approve

            - name: Terraform Output
              id: tf_out
              run: |
                echo "s3=$(terraform output -raw s3_bucket)" >> "$GITHUB_OUTPUT"
                echo "cfid=$(terraform output -raw cloudFront_ID)" >> "$GITHUB_OUTPUT"
                echo "domain=$(terraform output -raw cloudFront_domain_name)" >> "$GITHUB_OUTPUT"
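
For those terraform output commands to resolve, the Terraform configuration must declare outputs with matching names. A sketch of the corresponding outputs.tf (the resource addresses are assumptions; only the output names come from the workflow above):

```hcl
# outputs.tf (sketch; resource names are assumptions)
output "s3_bucket" {
  value = aws_s3_bucket.blog.bucket
}

output "cloudFront_ID" {
  value = aws_cloudfront_distribution.blog.id
}

output "cloudFront_domain_name" {
  value = aws_cloudfront_distribution.blog.domain_name
}
```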


Successful completion of Infra_job creates the following resources in HCP Terraform.

[Figure: Resources created in HCP Terraform]

The build_job is the simplest of the three. It uses the Hugo setup action to install Hugo and build the website, and the build artifacts are uploaded with the Upload a Build Artifact GitHub action.

    build_job:
        name: Build
        needs: [Infra_job]
        runs-on: ubuntu-latest

        steps:
            - name: Checkout Repository
              uses: actions/checkout@v4

            - name: Setup Hugo
              uses: peaceiris/actions-hugo@v3
              with:
                hugo-version: '0.135.0'
                extended: true

            - name: Build
              run: hugo

            - name: Upload Build Artifact
              uses: actions/upload-artifact@v4
              with:
                name: tech-blog
                path: public/*
        

The final job publishes the website by deploying the artifacts generated in the build_job to Amazon S3. Here, we use OIDC integration between GitHub and AWS so the GitHub runner can use the AWS CLI to run the s3 sync and cache-invalidation commands.
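
For the OIDC role assumption to succeed, the IAM role's trust policy must accept GitHub's OIDC provider. A sketch of such a trust policy (the account ID and repository path are placeholders; restricting `sub` to the main branch is a common hardening choice):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::111122223333:oidc-provider/token.actions.githubusercontent.com"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
        },
        "StringLike": {
          "token.actions.githubusercontent.com:sub": "repo:<owner>/<repo>:ref:refs/heads/main"
        }
      }
    }
  ]
}
```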

    deploy_job:
        name: Publish
        needs: [build_job, Infra_job]
        env:
          S3_BUCKET: ${{needs.Infra_job.outputs.s3_bucket}}
          DISTRIBUTION_ID: ${{needs.Infra_job.outputs.cf_id}}
        runs-on: ubuntu-latest
        permissions:
          id-token: write     # This is required for requesting the JWT
          contents: read      # This is required for actions/checkout

        steps:
            - name: Download Build Artifacts
              uses: actions/download-artifact@v4
              with:
                name: tech-blog

            - name: Configure AWS credentials
              uses: aws-actions/configure-aws-credentials@v4
              with:
                role-to-assume: ${{secrets.AWS_IAM_ROLE}}
                aws-region: ${{ env.AWS_REGION }}

            - name: S3 sync
              run: | 
                aws s3 sync . s3://${{env.S3_BUCKET}} \
                --delete

            - name: Create CloudFront Invalidation
              run: |
                aws cloudfront create-invalidation \
                --distribution-id ${{env.DISTRIBUTION_ID}} \
                --paths "/*"        


The workflow summary page shows the successful completion of all jobs and the generated artifact.

[Figure: Workflow summary]

With this in place, all future releases are automated through the GitHub Actions CI/CD workflow. The setup can be further improved by creating a feature branch and testing changes before merging to the main branch.
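
One way to sketch that improvement (trigger and job names are assumptions): run a plan-only check on pull requests and reserve the apply for pushes to main:

```yaml
# Hypothetical plan-only check for feature branches
on:
  pull_request:
    branches: [main]

jobs:
  plan_check:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: terraform
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
        with:
          cli_config_credentials_token: ${{ secrets.TF_API_TOKEN }}
      - run: terraform init
      - run: terraform plan -no-color   # no apply on feature branches
```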


