Automation in action
It’s been a while since I last posted something here, so I thought I’d dive back in with a post on automating infrastructure (with Infrastructure as Code) in action - Rob.
Recently, in my spare time, I’ve started to get back into automation a bit. About a year ago I had the pleasure of working on an automation solution to build immutable infrastructure for a test/production environment. The solution used Ansible, Terraform and PowerShell, tied together with Jenkins, Bitbucket and Artifactory, to build a Windows/RedHat/SQL environment on top of VMware (with the parameters for the build coming from a pretty cool REST/React/Postgres application that I wasn’t as involved in).
The reason for the context above is that I’ve spent the last year and a half heavily involved in building and supporting an Azure environment, and I thought for fun I’d see if I could use the Azure CLI to do the same thing. Turns out you can.
For some context, here’s the environment I use for this:
Hopefully the diagram above gives an idea of what’s going on - it’s by no means a complete picture (I’ll provide a more ‘complete’ one later on), but it’s a place to start at least. The ‘glue’ that ties everything together is GitHub. I first create a repository and clone it locally. Once I have this, I use VS Code to create the folder structure and files I want to push to the repository later on. The second piece of the puzzle is Docker.
I build a container image from a Dockerfile using the command below:
docker build <path to docker file> -t jenkins.azurecli:latest
The Dockerfile I used is below - it builds a container that has Jenkins (and the Blue Ocean plugin) in it and then adds the Azure CLI on top (small note - I may be doing this backwards; instead of using Jenkins as my base I could instead use the Azure CLI container and add Jenkins to that - let me know if that’s the case):
FROM jenkins/jenkins:jdk11

USER root
RUN apt-get update && apt-get install -y ca-certificates curl apt-transport-https lsb-release gnupg
RUN curl -sL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor | tee /etc/apt/trusted.gpg.d/microsoft.gpg > /dev/null && \
    AZ_REPO=$(lsb_release -cs) && \
    echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ $AZ_REPO main" | tee /etc/apt/sources.list.d/azure-cli.list && \
    apt-get update && apt-get install -y azure-cli

USER jenkins
RUN jenkins-plugin-cli --plugins blueocean:1.24.3
Once I have my image, I start a container from it and mount a volume like so -
docker run --name jenkins -p 8080:8080 -v jenkins_home:<path to mount> jenkins.azurecli:latest
Now I have my base Jenkins environment. I’m going to skip the steps for setting up Jenkins, as that isn’t really the point of this article; I can cover it separately if someone wants it.
Now that I have Jenkins and GitHub set up (and joined together), the development process goes a little like this:
- Develop Azure CLI commands (what I’ll actually execute with Jenkins later on), using variables as much as possible and keeping each command small and standalone so it can be chained and reused later:
az group create --name ${params.RGNAME} --location ${params.LOCATION} --tags ${params.TAGS} --output ${params.GLOBALOUTPUT}
- Create Pipeline-as-Code for Jenkins (using Groovy):
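A minimal declarative Jenkinsfile for the resource-group job might look like the sketch below. The parameter names match the `${params.*}` references used above, but the default values are placeholders I’ve made up, and it assumes the Azure CLI on the node is already authenticated (e.g. via `az login`):

```groovy
// Sketch of a declarative pipeline wrapping the 'az group create' command.
// Parameter defaults are made-up placeholders; adjust for your environment.
pipeline {
    agent any
    parameters {
        string(name: 'RGNAME', defaultValue: 'rg-demo', description: 'Resource group name')
        string(name: 'LOCATION', defaultValue: 'uksouth', description: 'Azure region')
        string(name: 'TAGS', defaultValue: 'env=test', description: 'Tags for the resource group')
        string(name: 'GLOBALOUTPUT', defaultValue: 'table', description: 'Azure CLI output format')
    }
    stages {
        stage('Create Resource Group') {
            steps {
                // Assumes the agent is already logged in to Azure
                sh "az group create --name ${params.RGNAME} --location ${params.LOCATION} --tags ${params.TAGS} --output ${params.GLOBALOUTPUT}"
            }
        }
    }
}
```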
- Push the code to GitHub.
- Create a Jenkins pipeline using the above code.
- Run it once to populate the parameters (note - if someone knows a way to skip this, let me know; otherwise you have to add the parameters manually, which can take a while depending on how many there are...)
- Run and watch the magic :)
By repeating the above steps for each component, you end up with a collection of little pipelines that you can then chain into bigger ones. You’ll see in the screenshots below a ‘POP’ job - this chains together the other jobs in the screenshot (specifically 'Create Resource Group' and 'Create Virtual Network') into a single job. This is the basic idea of infrastructure as code: create small parts that work by themselves, then chain them together into ever more complex infrastructure.
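The chaining described above can be done in Jenkins with the `build` step. The sketch below is a hypothetical version of the ‘POP’ job - the job names come from the screenshots, but the parameters passed down are assumptions:

```groovy
// Hypothetical 'POP' pipeline chaining the smaller jobs into one.
// Job names match the screenshots; parameter names are assumptions.
pipeline {
    agent any
    parameters {
        string(name: 'RGNAME', defaultValue: 'rg-pop-demo', description: 'Resource group name')
        string(name: 'LOCATION', defaultValue: 'uksouth', description: 'Azure region')
    }
    stages {
        stage('Create Resource Group') {
            steps {
                build job: 'Create Resource Group', parameters: [
                    string(name: 'RGNAME', value: params.RGNAME),
                    string(name: 'LOCATION', value: params.LOCATION)
                ]
            }
        }
        stage('Create Virtual Network') {
            steps {
                build job: 'Create Virtual Network', parameters: [
                    string(name: 'RGNAME', value: params.RGNAME),
                    string(name: 'LOCATION', value: params.LOCATION)
                ]
            }
        }
    }
}
```

Each stage blocks until the downstream job finishes, so the small jobs stay independently runnable while the ‘POP’ job composes them in order.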
To close this post out, I thought I’d share one more diagram - as I said, there are improvements that could be made to my environment. The first would be to run the Azure CLI on a Jenkins agent - this is how Jenkins is meant to be used: one container runs the GUI and many agents do the work, so it scales. The second (which I mentioned at the start) would be to store all the parameters behind a REST endpoint - this would save me entering them manually in Jenkins and would act as my CMDB. Finally, I could set up Jenkins to automatically run the jobs when I push changes to GitHub.
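For the first improvement, one approach (a sketch, assuming the Docker Pipeline plugin is installed on the Jenkins controller) would be to run each build inside Microsoft’s official Azure CLI image rather than baking the CLI into the Jenkins image:

```groovy
// Sketch: run the pipeline steps inside an Azure CLI container agent,
// so the Jenkins controller no longer needs the CLI installed.
// Assumes the Docker Pipeline plugin is available.
pipeline {
    agent {
        docker { image 'mcr.microsoft.com/azure-cli:latest' }
    }
    stages {
        stage('Check CLI') {
            steps {
                // Simple smoke test that the CLI is available on the agent
                sh 'az version'
            }
        }
    }
}
```

With this in place, the custom Dockerfile from earlier could shrink back to plain Jenkins, and scaling out just means adding more agents.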
So, that’s what I’ve been doing recently for ‘fun’. A nice little build environment for practicing and working with Infrastructure as Code.