Create a CI/CD pipeline with GitHub Actions

Reading Time: 7 minutes

A CI/CD pipeline helps automate your software delivery process: it builds the code, runs the tests, and deploys a new version of the application.

Not long ago, GitHub announced GitHub Actions, its built-in system for CI/CD. This means that developers can use GitHub Actions to create a CI/CD pipeline directly on the platform.

With Actions, GitHub now allows developers not just to host the code on the platform, but also to run it.

Let’s create a CI/CD pipeline using GitHub Actions that deploys a Spring Boot app to AWS Elastic Beanstalk.

First of all, let’s find a project

For this purpose, I will be using this project, which I have forked.
Once the project is forked and opened, we will see the section for GitHub Actions.

GitHub Actions Tool

Add predefined Java with Maven Workflow

Get started with GitHub Actions

By clicking on Actions, we are presented with a set of predefined workflows. Since our project is Maven-based, we will use the Java with Maven workflow.

By clicking “Start commit”, GitHub will add a commit with the workflow; the commit can be found here.

Let’s take a look at the predefined workflow:

name: Java CI with Maven

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
    - name: Set up JDK 1.8
      uses: actions/setup-java@v1
      with:
        java-version: 1.8
    - name: Build with Maven
      run: mvn -B package --file pom.xml

name: Java CI with Maven
This simply specifies the name of the workflow.

on: push, pull_request
The on key specifies the events that trigger the workflow. In this case, those events are push and pull_request on the master branch.
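If, for example, we also wanted to be able to start the workflow manually from the Actions tab, we could add the workflow_dispatch event as well; this is not part of the predefined workflow, just a possible extension:

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
  # optional: allows the workflow to be triggered manually from the Actions tab
  workflow_dispatch: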

jobs:
A job is a set of steps that execute on the same runner.

runs-on: ubuntu-latest
The runs-on key specifies the underlying OS our workflow should run on; here we use the latest version of Ubuntu.

steps:
A step is an individual task that can run commands or actions. Each step in a job executes on the same runner, allowing the steps in that job to share data with each other.

actions:
Actions are the smallest portable building blocks of a workflow and are combined into steps to create a job. We can create our own actions or use actions created by the GitHub community.
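To illustrate these two concepts outside of our pipeline, here is a small hypothetical job that mixes a community action with plain run commands; since all steps of a job run on the same runner, a file written in one step is still there in the next:

steps:
- uses: actions/checkout@v2   # community action that checks out the repository into the workspace
- name: Write a file
  run: echo "hello from the first step" > greeting.txt
- name: Read the file
  run: cat greeting.txt       # same runner, so the file from the previous step is available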

Our steps set up Java and execute the Maven commands needed to build the project.

Since we added the workflow by creating a commit from the GUI, the pipeline started automatically and verified the commit, as we can see in the following image:

Pipeline report

Create an application in AWS Elastic Beanstalk

The next thing we need to do is create an application on Elastic Beanstalk, to which our app will be deployed. For that purpose, an AWS account is needed.

AWS Elastic Beanstalk service

Upon opening the Elastic Beanstalk service, we need to choose the application name:

Application name

For the platform, choose Java 8.

Choose platform

For the application code, choose Sample application and click Create application.
Elastic Beanstalk will create and initialize an environment with a sample application.

Let’s continue working on our pipeline

We are going to use an action created by the GitHub community for deploying an application to Elastic Beanstalk. The action is einaregilsson/beanstalk-deploy.
This action requires additional configuration, which is added using the with keyword:

    - name: Deploy to EB
      uses: einaregilsson/beanstalk-deploy@v13
      with:
        aws_access_key: ${{ secrets.AWS_ACCESS_KEY_ID }}
        aws_secret_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        application_name: {replace with your AWS application name}
        environment_name: {replace with your AWS environment name}
        version_label: ${{github.SHA}}
        region: {replace with your AWS region}
        deployment_package: target/spring-petclinic-rest-2.2.5.jar

Add variables

We need to fill in values for the Elastic Beanstalk application_name and environment_name, the AWS region, and the AWS access keys.

Go to AWS Elastic Beanstalk and copy the previously created environment name and application name.
Go to AWS IAM and, in the security credentials section of your user, either create a new AWS access key or use an existing one.
The AWS access key ID and AWS secret access key should be added to the GitHub project settings under the Secrets tab, which looks like this:

GitHub Project Secrets

The complete pipeline should look like this:

name: Java CI with Maven

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v2
    - name: Set up JDK 1.8
      uses: actions/setup-java@v1
      with:
        java-version: 1.8
    - name: Build
      run: mvn -B package --file pom.xml
    - name: Deploy to EB
      uses: einaregilsson/beanstalk-deploy@v13
      with:
        aws_access_key: ${{ secrets.AWS_ACCESS_KEY_ID }}
        aws_secret_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        application_name: pet-clinic
        environment_name: PetClinic-env
        version_label: ${{github.SHA}}
        region: us-east-1
        deployment_package: target/spring-petclinic-rest-2.2.5.jar
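As a small optional variation (not required for this tutorial), the non-secret values could also be kept in workflow-level environment variables so they are defined in a single place; the variable names below are just an example:

env:
  EB_APPLICATION_NAME: pet-clinic     # the application name from your Elastic Beanstalk setup
  EB_ENVIRONMENT_NAME: PetClinic-env  # your environment name
  AWS_REGION: us-east-1               # your region

The deploy step would then reference them as ${{ env.EB_APPLICATION_NAME }}, ${{ env.EB_ENVIRONMENT_NAME }} and ${{ env.AWS_REGION }}.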

Lastly, let’s modify the existing app

In order to be considered a healthy instance by Elastic Beanstalk, the deployed application has to return an OK response when accessed by the load balancer standing in front of Elastic Beanstalk. The load balancer accesses the application on the root path. The forked application, when accessed on the root path, forwards the request to swagger-ui.html, so we need to remove that forwarding.

Change RootController.class:

@RequestMapping(value = "/", method = RequestMethod.GET, produces = "application/json")
public ResponseEntity<Void> getRoot() {
    return new ResponseEntity<>(HttpStatus.OK);
}
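For context, here is a minimal self-contained sketch of how the adjusted controller could look; the package, imports and surrounding class details are assumptions and may differ slightly from the actual spring-petclinic-rest sources:

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class RootController {

    // Return an empty 200 OK on the root path instead of forwarding to swagger-ui.html,
    // so the Elastic Beanstalk load balancer health check succeeds
    @RequestMapping(value = "/", method = RequestMethod.GET, produces = "application/json")
    public ResponseEntity<Void> getRoot() {
        return new ResponseEntity<>(HttpStatus.OK);
    }
}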

Change the server port in application.properties to 5000, since Spring Boot applications listen on port 8080 by default, while Elastic Beanstalk assumes that the application listens on port 5000.

server.port=5000

And remove the server.servlet.context-path=/petclinic/.

The successful commit which deployed our app on AWS Elastic Beanstalk can be seen here:

Pipeline build

And the Elastic Beanstalk with a green environment:

Elastic Beanstalk green environment

Voilà, there we have it: a CI/CD pipeline with GitHub Actions and deployment to AWS Elastic Beanstalk. You can find the forked project here.

Pet Clinic Swagger UI

Project story: Automate AEM deployments for a Swiss bank

Reading Time: 5 minutes

A large bank in St. Gallen, Switzerland needed to improve the AEM deployment process for its various staging environments. It was one of my first projects for N47 and was set to run for six months, starting in October 2018. The following blog post gives a short project overview.

Getting started

Starting to work for a new customer is always exciting to me because every company and team has a unique mindset and culture. Usually, it takes a few days or weeks to get to know the new teammates. But this time it was completely different, as I had already worked together with each of the three team members and their supervisor at my previous company. It was nice to meet old colleagues, and we had a very good start.

Deployment process before automation.
Source: www.dreamstime.com

Technology Stack

The technology stack was already defined and the servers ordered. But it took a while until the infrastructure was ready, so for the time being I worked on my local machine.

Jenkins was set as the central tool for build orchestration, deployments, and various DevOps tasks. All the pipeline source code is stored in GitLab and the main business application we’re dealing with is Adobe Experience Manager (AEM).

A relatively large amount of work was needed for the initial setup: enabling connectivity to the relevant systems, building a basic shared library, and getting to know the internal processes. Read more about Jenkins behind a corporate proxy as an example of this setup: https://www.north-47.com/knowledge-base/update-jenkins-plugins-behind-a-corporate-proxy/

Implemented Pipelines

The bank has two different AEM projects: one for the corporate website and another for their intranet. They require slightly different deployment pipelines, and both have three environments: development, staging, and production.

Besides the deployment pipelines, there are pipelines for copying content from the production to the development environment and restoring a complete production environment into the staging environment in order to have an exact copy and a good baseline for approvals.

Many auxiliary jobs, such as starting/stopping AEM and the Dispatcher, checking the health of instances, fetching the last backup time, and executing Groovy scripts, are used in the deployment pipelines and can also be run as independent jobs.

An example of a Jenkins Pipeline
Source: https://jenkins.io
An example of a Gitlab Pipeline
Source: https://docs.gitlab.com/ee/ci/pipelines.html

Advantages of automation

The automation of the various processes brought faster deployments. More importantly, it brought transparency and centralized logs about what exactly happened, as well as higher quality, since repetitive tasks are always executed.

One example is the backup check, which used to require coordination and led to long waiting times. Now an API is used, so the automation pipeline gets instant feedback about the last backup time and shows a note if a backup is missing. Before, such a step might have been skipped in order to save some time.

With each pipeline built, more small, reusable helpers were introduced, which in turn made it easier and faster to create the next pipelines. Think of it as a construction kit.

Deployment process with automation.
Source: www.wikimedia.org

Project finished ➞ client is very happy

Over several months of close collaboration, more and more pipelines were implemented, and they have supported the crucial deployment processes countless times.

I enjoyed building up the AEM automation and believe it is a very good aid for higher quality and further extensions.

After a warm welcome and six months of working together, it was time to say goodbye as the project had a fixed time span. The client’s team was very kind and even gave me some great presents to remember the exciting time in St. Gallen.

Present from client: Swiss beer, chocolate, bratwurst and biber