Solutions Delivery Platform

Section 2.2 - Creating the Pipeline Configuration Repository

There are as many ways to set up a Pipeline Configuration Repository as there are ways to create software. This guide provides a solid foundation to build on.

This page covers the creation of a simple pipeline configuration repository: a repository containing files that define one or more pipelines, with a single governance tier in the root. You can learn more about pipeline governance here.

Create a Repository

Use an SCM like GitHub to create a repository to store your pipeline configuration. This way you can access, share, and maintain pipeline configuration like any other piece of code.
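For example, using Git from the command line (the repository name and remote URL below are placeholders, not requirements):

```shell
# Create a local repository to hold the pipeline configuration
mkdir pipeline-configuration && cd pipeline-configuration
git init

# Create the two files this guide describes (their contents are covered below)
touch Jenkinsfile pipeline_config.groovy
git add Jenkinsfile pipeline_config.groovy
git -c user.name="Example" -c user.email="example@example.com" \
    commit -m "Initial pipeline configuration"

# Push to a remote such as GitHub (URL is a placeholder)
# git remote add origin git@github.com:your-org/pipeline-configuration.git
# git push -u origin master
```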

Create a Pipeline Template

At the root of this repository create a file called Jenkinsfile. This is the default pipeline template. A simple pipeline template might look like this:

static_code_analysis()
build()

on_merge to: develop, {
  deploy_to dev
}

on_merge to: master, {
  deploy_to prod
}

Create pipeline_config.groovy

Also at the root of your pipeline configuration repository, create a file called pipeline_config.groovy. Since this file will (likely) be in your global governance tier, you can modify it to make changes that apply to every pipeline. A simple pipeline config file might look like this:

application_environments{
  dev{
    short_name = "dev"
    long_name = "Develop"
  }
  prod{
    short_name = "prod"
    long_name = "prod"
  }
}

keywords{
  master = /^[Mm]aster$/
  develop = /^[Dd]evelop$/
}

libraries{
  github_enterprise
  sonarqube
  docker{
    registry = "docker-registry.default.svc:5000"
    cred = "openshift-docker-registry"
    repo_path_prefix = "my-app-images"
  }
  sdp{
    images{
      registry = "https://docker-registry.default.svc:5000"
      repo = "sdp"
      cred = "openshift-docker-registry"
    }
  }
  openshift{
    // More on these settings in the next section
    url = "https://my-openshift-cluster.ocp.example.com:8443"
    helm_configuration_repository = "https://github.com/kottoson-bah/sdp-example-helm-config.git"
    helm_configuration_repository_credential = "github"
    tiller_namespace = "my-app-tiller"
    tiller_credential = "my-app-tiller-credential"
  }
}

The pipeline defined by this example won’t work until you’ve finished setting up your application environments in OpenShift and written your helm configuration repository, which is covered in the next section.

About The Example Pipeline Configuration

This section explains how the example pipeline template and pipeline config file above work together to create a pipeline.

The Pipeline Template

Starting with the pipeline template, every pipeline created from this template will have these steps:

static_code_analysis() // 1) Check the source code for bugs & code smells
build()                // 2) Build an artifact from the source code

on_merge to: develop, {// 3a) if a merge to the develop branch triggered the build...
  deploy_to dev        // 3b) deploy the application to the "dev" environment
}

on_merge to: master, {// 4a) if a merge to the master branch triggered the build...
  deploy_to prod      // 4b) deploy the application to the "prod" environment
}

Now that the pipeline template has defined what the pipeline does, there needs to be a pipeline config file to define how. It needs to provide libraries that implement the pipeline steps, application environments that define the dev and prod environments being deployed to, and keywords for the develop and master variables being used.

The Libraries

libraries{
  github_enterprise
  sonarqube
  docker{
    registry = "docker-registry.default.svc:5000"
    cred = "openshift-docker-registry"
    repo_path_prefix = "my-app-images"
  }
  sdp{
    images{
      registry = "https://docker-registry.default.svc:5000"
      repo = "sdp"
      cred = "openshift-docker-registry"
    }
  }
  openshift{
    // More on these settings in the next section
    url = "https://my-openshift-cluster.ocp.example.com:8443"
    helm_configuration_repository = "https://github.com/kottoson-bah/sdp-example-helm-config.git"
    helm_configuration_repository_credential = "github"
    tiller_namespace = "my-app-tiller"
    tiller_credential = "my-app-tiller-credential"
  }
}

For every step used in a pipeline template, something needs to define that step’s implementation. For the JTE, these step implementations most commonly come from "libraries", which are imported from a "library source". For this example pipeline, it’s assumed that the sdp-libraries library source is available, and any of the libraries it contains can be used.
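In a JTE library, each step is a Groovy file named after the step that implements a call method. As a hypothetical sketch (the stage label and scanner invocation are illustrative assumptions, not the sonarqube library's actual source), the static_code_analysis step could look like:

// static_code_analysis.groovy -- hypothetical sketch of a JTE library step.
// In JTE, the file name defines the step name, and call() runs when a
// pipeline template invokes static_code_analysis().
void call() {
    stage("Static Code Analysis") {
        node {
            // illustrative assumption: run a scanner against the checked-out source
            sh "sonar-scanner"
        }
    }
}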

Five libraries are being imported here: github_enterprise, sonarqube, docker, sdp, and openshift. Below is a mapping of steps to the libraries that implement them.

static_code_analysis() // sonarqube
build()                // docker

on_merge to: develop, {// github_enterprise
  deploy_to dev        // openshift
}

on_merge to: master, {// github_enterprise
  deploy_to prod      // openshift
}

Although the sdp library doesn’t provide the implementation for any of the steps here, it’s being imported because both the SonarQube and OpenShift libraries depend on a step it defines.

The Application Environments

application_environments{
  dev{
    short_name = "dev"
    long_name = "Develop"
  }
  prod{
    short_name = "prod"
    long_name = "prod"
  }
}

The OpenShift library uses Application Environment primitives to select which project in OpenShift to deploy to. For example, when the pipeline template calls deploy_to dev (which can also be read as deploy_to(dev)), it passes the dev application environment primitive object defined here into the deploy_to step. The short_name, in particular, is used to select the target OpenShift project and which values.yaml file to use as part of the deployment. View the next section or the OpenShift library page for more details.
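Conceptually, the step receives the application environment object as a parameter and reads its fields. A hypothetical sketch (the project-naming and values-file conventions shown are assumptions for illustration, not the library's actual logic):

// deploy_to.groovy -- hypothetical sketch of how an application
// environment primitive flows into a library step.
void call(app_env) {
    stage("Deploy to ${app_env.long_name}") {
        // short_name drives which OpenShift project and values file are used
        String project    = "my-app-${app_env.short_name}"      // e.g. my-app-dev
        String valuesFile = "values.${app_env.short_name}.yaml" // e.g. values.dev.yaml
        echo "deploying to project ${project} using ${valuesFile}"
    }
}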

The Keywords

keywords{
  master = /^[Mm]aster$/
  develop = /^[Dd]evelop$/
}

The GitHub Enterprise library uses Keyword primitives to determine which GitHub branch is being built. The steps on_merge(), on_commit(), and on_pull_request() take a regular expression as a parameter. These regular expressions have been stored as keywords to make the pipeline template more human-readable.
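The matching itself is ordinary Groovy regular-expression comparison. For example, the develop keyword above matches either spelling of the branch name while its anchors reject partial matches:

// The develop keyword from pipeline_config.groovy
def develop = /^[Dd]evelop$/

assert "develop" ==~ develop          // lowercase branch name matches
assert "Develop" ==~ develop          // capitalized branch name matches
assert !("development" ==~ develop)   // anchors reject partial matches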

Closing Summary

This pipeline configuration repository, with a single governance tier located in the base of the repository, contains two files: Jenkinsfile and pipeline_config.groovy. The default pipeline template, Jenkinsfile, defines the steps that each pipeline executes. The pipeline configuration file, pipeline_config.groovy, controls how those steps are run in the pipeline by selecting the libraries to implement those steps, the settings for those libraries, and any other pipeline primitives being used.

Using the files in this example, pipelines will:

  1. test the source code using SonarQube

  2. build & push a Docker container image

  3. depending on the pipeline trigger, deploy that container on OpenShift

Next Steps