
Sunday, July 26, 2015

Jenkins DSL scripting - Part 4 - adding our own library to the DSL plugin /TL;DR/




Intro

This is the last part of my series about Jenkins DSL scripting. I already discussed the basics: environment setup, job creation and linking, view definitions, and the importance of easily reproducible build pipelines with non-pet Jenkins servers. In the following parts I explained some more advanced topics, like interaction with the environment, credentials, and the customization options of the configure block.

In this part I'm talking about how to extend the plugin's functionality with your own commands and implement complex logic behind them.

I'd like to say a REALLY big thank you to my colleague Thomas Schneider for allowing me to reuse his code snippets and findings in this blog post. Respect!

Business logic within the pipeline

If you want to add some business logic to your pipeline, you may scratch your head over how to implement it with the existing DSL tooling. The tools aren't really designed to behave dynamically based on events during the pipeline run, except for some Jenkins plugins with their predefined functionality.

As you know, the bootstrap/seed job is only responsible for creating the pipeline structure and doesn't participate in any application lifecycle steps, so it can't be used for this task.

The configure block is a great tool to extend your build config with direct XML injections, and it adds flexibility by enabling Jenkins plugins that the DSL doesn't cover. Unfortunately it falls short when you need functionality that isn't implemented in any of the plugins.

Should you implement the required functionality in OS shell scripting with intensive environment variable exchanges? Keeping everything synchronized between the OS's and Jenkins' contexts could be quite challenging, so we need a simpler but efficient solution.

Extending the contexts with reusable functions

The DSL plugin defines contexts for its closures, and you can use Groovy's metaClass to extend them with your own code.
The format is: ContextClass.metaClass.closureEntryName = { ... }
You can extend any of the contexts, and a quick search in the GitHub repo shows how many opportunities there are to enrich the original functionality.

At my company we have a dedicated repo for these "library" functions; the bootstrap/seed job checks them out into a subfolder, then executes them before the project-specific files to make them available. It takes a few extra seconds when you generate the pipeline code, but it has no significant performance impact.

A simple example: echo

The simplest and most famous example in any language is the Hello World implementation. As a starting point let's create an echo command with a String and a Long parameter, to demonstrate argument handling too.
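A minimal sketch of how such an extension could look (the StepContext class path comes from the Job DSL plugin; the delegate-based shell call and the parameter names are my assumptions):

import javaposse.jobdsl.dsl.helpers.step.StepContext

// Add an 'echo' step to every steps { } closure.
// 'delegate' is the StepContext instance the closure runs against.
StepContext.metaClass.echo = { String message, Long repeat = 1 ->
    (1..repeat).each {
        // reuse the existing shell step instead of building XML by hand
        delegate.shell("echo '${message}'")
    }
}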


In the example we extend the StepContext to add the echo command; the implementation simply reuses the existing shell command.

How to organize our library

I suggest creating a separate repo and reusing it in all your pipeline jobs. One way to achieve this is the Multiple SCM plugin, or you could play with Git submodules. In the examples I'm going with the plugin because that's more visual to me :)
With a strictly followed open-closed principle and well-designed interfaces, you shouldn't get compatibility issues during pipeline generation. The folder structure can follow the regular Maven directory layout: src/main/groovy, src/test/groovy, etc.

In this screenshot you can see how I organized the repository checkouts. It's not visible, but I set the bootstrap/seed job's sub-directory to bootstrap.



Add the library to Jenkins

Once you have a repository with working library code, you can add it to the bootstrap/seed job and execute it before the pipeline code runs. The DSL plugin configuration permits Ant-style wildcards.
For example, if you check it out to the /lib folder, you need to configure the DSL plugin to run files from the filesystem with the following pattern:
 "lib/src/main/**/*.groovy"


Now the library repo is cloned and its code executed; the final step is to use the newly implemented function in your pipeline code:
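Something like this, where the message and repeat count are arbitrary:

job('testJob') {
    steps {
        // our new library step, defined via metaClass above
        echo('Hello World', 2)
    }
}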



The result should be a newly created job testJob with our nice and shiny echo command:
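Since echo delegates to the shell step, the generated config.xml should simply contain ordinary shell builders, roughly like this:

<builders>
  <hudson.tasks.Shell>
    <command>echo 'Hello World'</command>
  </hudson.tasks.Shell>
  <hudson.tasks.Shell>
    <command>echo 'Hello World'</command>
  </hudson.tasks.Shell>
</builders>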


Advanced example: Docker plugin with configure block

Now you know how to implement simple library routines, but not everything can be implemented with the existing instruction set of the DSL plugin. For example, at the moment of writing there's no implementation for CloudBees' Docker Build and Publish plugin, so let's create one for our own usage.

The problem

The plugin is not available (yet) in the DSL closure contexts, so we need to implement our own solution to generate the configuration block in config.xml as a new step. First we need to know how to configure the plugin and what parameters can be set for it. Then we just need Groovy's NodeBuilder to create the necessary XML entries.

I created a fake entry to make all variables available in the config:



Let's see how Jenkins persisted that into the XML:
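The persisted step looks roughly like this (a sketch; the exact element set depends on the plugin version, so double-check it against your own config.xml):

<builders>
  <com.cloudbees.dockerpublish.DockerBuilder>
    <server/>
    <registry/>
    <repoName>myorg/myapp</repoName>
    <repoTag>latest</repoTag>
    <dockerfilePath>Dockerfile</dockerfilePath>
    <noCache>false</noCache>
    <forcePull>true</forcePull>
    <skipBuild>false</skipBuild>
    <skipDecorate>false</skipDecorate>
    <skipPush>false</skipPush>
    <skipTagAsLatest>false</skipTagAsLatest>
    <createFingerprint>true</createFingerprint>
  </com.cloudbees.dockerpublish.DockerBuilder>
</builders>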



Excellent! We have everything to implement our own Docker publish extension for Jenkins:
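A sketch of the extension: it appends a node to the StepContext's internal stepNodes list via NodeBuilder; the method signature and the hardcoded defaults are my own choices:

import javaposse.jobdsl.dsl.helpers.step.StepContext

// Add a 'dockerBuild' step that emits the Docker Build and Publish
// plugin's config.xml fragment.
StepContext.metaClass.dockerBuild = { String imageName, String imageTag = 'latest' ->
    def builder = new NodeBuilder()
    delegate.stepNodes << builder.'com.cloudbees.dockerpublish.DockerBuilder' {
        repoName(imageName)
        repoTag(imageTag)
        dockerfilePath('Dockerfile')
        noCache(false)
        forcePull(true)
        skipBuild(false)
        skipDecorate(false)
        skipPush(false)
        skipTagAsLatest(false)
        createFingerprint(true)
    }
}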



Add this file as dockerBuild.groovy to our library repo and we can refactor our DockerImageStep with the new dockerBuild step:
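In the pipeline definition the step now reads like a built-in one (the job name, image name and tag below are placeholders):

job('dockerImageBuild') {
    steps {
        // replaces the ugly configure-block version
        dockerBuild('myorg/myapp', '1.0.0')
    }
}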



The ugly configure block is now replaced with a simple closure item and can be reused anywhere. Of course you could improve it with some default values and method overloading, but this is just a demonstration :) You could also add complex logic in Groovy to implement business requirements at the pipeline level.

Summary

Finally we have reached the end of my series about Jenkins DSL scripting. We went through the whole process of auto-generating your delivery pipeline and creating reusable components, or even a complete framework, for your projects. Now you can customize the closure elements and add your own business logic to these steps. A good delivery pipeline is based on a good CI/CD platform, and I believe Jenkins is a good candidate, but a complete pipeline needs more than a sophisticated build configuration. In the next article I'll describe how Jenkins can support the Forking workflow for Git to make multiple releases per day possible. So please stay with me :)

Github links:
