Jenkins Certified Engineer Training

The training given by opensourcearchitect was very straightforward. It covered all of the expected topics, along with a jaunt into CloudBees-specific features.

The highlight of the training was the introduction to CloudBees Jenkins Operations Center (CJOC).

It provides solid management and automation features around:

  • Authentication
  • Authorization
  • Elastic Master Management
  • Plugin Management
  • Automated Job Spreading using Jenkins Job DSL
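
To give a rough idea of that last item (the job name and repository URL here are made up), a Job DSL seed script defines jobs as code, which is what the automated job spreading is built on:

// Hypothetical Job DSL seed script: one definition that can be pushed out to many masters
pipelineJob('example-service-build') {
    definition {
        cpsScm {
            scm {
                git('https://github.com/example/example-service.git')
            }
            scriptPath('Jenkinsfile')
        }
    }
}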

These are all things that were very painful with OSS Jenkins, and it will be fun to see them come together at an enterprise level.

And as a final note from the two-day training, I am now a Certified Jenkins Engineer. That adds another certification to my name and more expertise to draw on when helping others on my team.

JenkinsWorld

This was my first time attending JenkinsWorld, but not my first technical conference. The atmosphere was much the same, and many sessions leaned toward marketing and showcases of individual teams' work.

This made choosing which sessions to attend and which to skip fairly difficult, but with some effort I feel that I gained a new understanding of Jenkins and the Pipeline workflow. Most importantly, I learned that I have been looking at it the wrong way for many years.

Let's dig deeper into this!

Pipeline Unit Testing

Thursday morning, I had a very good conversation with Ozan Gunalp and Emmanuel Quincerot, both software engineers at LesFurets.com. In the Jenkins world, they are known for their work on the JenkinsPipelineUnit project.

This project provides a set of test entry points and mocks for Jenkins Pipeline steps, so pipeline code can be exercised outside of a running Jenkins instance.

Using this library is as easy as the following:

import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class TestExample extends BasePipelineTest {

    @Before
    void setUp() throws Exception {
        // Registers the base step mocks before loading any script
        super.setUp()
    }

    @Test
    void baserun_without_errors() throws Exception {
        def jenkinsfile = loadScript("./jenkinsfile")
        jenkinsfile.execute()
        printCallStack()
    }
}
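
For this pattern to work, the loaded script has to wrap its logic in a callable method and return itself. A minimal sketch of what such a ./jenkinsfile might look like (the stage and shell command are purely illustrative):

// Hypothetical ./jenkinsfile: wrapping the logic in execute() and ending with
// 'return this' is what lets the test above call jenkinsfile.execute() directly
def execute() {
    node {
        stage('Build') {
            sh 'make build'
        }
    }
}

return this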

Running the test produces a readable call stack that can be compared between runs, along with a JUnit test result that displays nicely in Jenkins:

jenkinsfile.run()
jenkinsfile.edpGlobal.runEdpProcess()
    ... Call stack based on full run

One downside of this library is its reliance on deep Groovy reflection, which limits its use with deeply inherited and factory-generated code.

With some future refactoring, I hope to work toward fully implemented test cases for the entire T-Mobile EDP Pipeline based on this library.

Using Jenkins Less

Jesse Glick gave a very good talk on how to "Use Jenkins Less". It focused on moving from traditional Jenkins freestyle jobs to Jenkins Pipeline jobs, and on what it takes to keep those pipelines usable over the long term.

Jesse had a very good example of a Java code-linting plugin that many projects have been using and are now in trouble: the company that owns the plugin deprecated it and stated that it would not be open-sourced so that another developer could maintain it.

This led to the conclusion that the use of Jenkins plugins should be limited as much as possible, to reduce the chance of your pipeline breaking after an update or behaving differently between Jenkins masters.

The flow of the talk led to the idea that freestyle jobs are the easiest to maintain because you can do everything via shell scripting. But since freestyle jobs are not ideal for complex projects and have no proper display in Blue Ocean, Pipeline jobs are the next best thing and only slightly more complex.

This provided a few strong points that all users should attempt to follow:

  • Keep the pipeline simple; it is built to be an orchestration layer, not a build or automation script.
  • Put build logic in external scripts where possible, to avoid things like @NonCPS and method-whitelisting problems.
  • Prefer shell steps over everything else.

The first point is easy to follow. A pipeline should be high-level orchestration and nothing more. When you start forcing build customization and automation into your pipeline scripts, you have stepped beyond what a pipeline script should do. As part of this, all imported code should be concise and contained in a single standard Groovy file that is importable with the pipeline LibraryStep.LoadedClasses object.
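As a rough sketch of that shape (the stage names and script paths here are illustrative, not taken from the talk), an orchestration-only Jenkinsfile can be little more than a list of stages that hand off to scripts kept in the repository:

// A minimal, orchestration-only declarative Jenkinsfile: every stage simply
// delegates to a shell script stored in source control next to the code it builds
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './ci/build.sh'
            }
        }
        stage('Test') {
            steps {
                sh './ci/test.sh'
            }
        }
        stage('Deploy') {
            steps {
                sh './ci/deploy.sh'
            }
        }
    }
}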

This ties into how the JenkinsPipelineUnit library works and how its reflection does not dig into deep inheritance.

The second point suggests keeping build logic in external scripts, preferably stored in an SCM and loaded with the Jenkins "load" step. This lets you load a single script and access it directly via a variable in the pipeline script.
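A small sketch of that pattern (the file name and method below are hypothetical):

// ci/helpers.groovy lives in the repository, ends with 'return this', and exposes
// methods that the pipeline can call through the variable returned by "load"
node {
    checkout scm
    def helpers = load 'ci/helpers.groovy'
    helpers.buildArtifacts()
}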

The third point is the most important of the suggestions. Most people attempt to use as many Jenkins plugins as they can, which leads to behavior that is not easy to debug inside a pipeline.

As an example, you can look at the Maven plugin and the pipeline withMaven wrapper.

The Maven plugin loads multiple jars that make up the Maven entry point and then passes parameters to it. There is no call to the mvn binary, and debugging startup problems is very hard.

When looking at the withMaven wrapper, you can see that, unlike the Maven plugin, the mvn binary is actually called, but the call is wrapped in multiple property and parameter setters that alter its base functionality.

Because of a user's lack of control over how these things work, they should be used very sparingly in order to maximize the reusability and stability of your pipeline.
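To illustrate the trade-off (the 'M3' tool name and Maven goals below are placeholders), compare a plain shell step with the wrapped version:

// The same Maven build invoked two ways inside one node block
node {
    // Plain shell step: the pipeline controls exactly how mvn is invoked
    sh 'mvn -B clean verify'

    // withMaven (Pipeline Maven Integration plugin): the same command, wrapped in
    // injected settings, environment changes, and report publishers
    withMaven(maven: 'M3') {
        sh 'mvn -B clean verify'
    }
}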

Conclusion

For a very long time, I worked to use as many Jenkins plugins as possible, and even to create plugins to help with my workflow. Jesse's talk really opened my eyes to the right way to build a Jenkins pipeline and how to keep it usable for the foreseeable future.

The talk on JenkinsPipelineUnit showed me that, as I work to define clean pipelines, there is a way to provide strong unit testing as well as strong integration testing to make sure my pipelines will always work.

I am looking forward to working with my team to implement these ideas and to making our pipelines first-rate pieces of software that are easily usable by all developers inside T-Mobile.