Monday, March 31, 2014

The Jenkins Build and Delivery Pipeline plugins

Continuous Delivery (CD) is a design practice used in software development to automate and improve the process of software delivery (http://en.wikipedia.org/wiki/Continuous_delivery). Continuous delivery uses the notion of a deployment pipeline in order to validate code. At an abstract level, a deployment pipeline is an automated manifestation of your process for getting software from version control into the hands of your users (http://www.informit.com/articles/article.aspx?p=1621865&seqNum=2). How can such a deployment pipeline be implemented?

In this blog post I'll describe the setup of my environment and my first experience with the Jenkins Build Pipeline and Delivery Pipeline plugins: what they can do to help implement a deployment pipeline and what they won't do for you.


Deployment pipeline components

In order to build a deployment pipeline, we first need something to deploy, something to deploy it on and something to execute the deployment. I will not introduce Maven, Jenkins, GlassFish and NetBeans since they are well known.

Setting up the basics (JDK 8 + Netbeans 8 + Glassfish 4)

I installed an Ubuntu 13.10 server and installed Jenkins from the standard Ubuntu package. Since JDK 8 had recently been released, I decided to try running the installation on JDK 8. I also installed GlassFish Server 4.0 and NetBeans 8.0.

Java 8
First I installed Java 8. I could have followed the steps described on http://fosshelp.blogspot.nl/2013/10/how-to-install-oracle-java-8-in-ubuntu.html, but I did something like what is described on http://install-things.com/2013/04/16/how-to-install-oracle-java-7-update-21-on-ubuntu-12-10-linux/ (both approaches pull Java from an external repository).

Netbeans 8
I created a simple Maven project in NetBeans and used JAX-WS (https://jax-ws.java.net/) to create a simple HelloWorld web service. My project consisted of one parent project and two child projects: a WAR project (the actual web service) and an EAR project (the Java EE application), which also contained a call to the maven-glassfish-plugin.


Glassfish 4
To make Netbeans deploy to my ('remote') Glassfish server (installed under a different user) I needed to enable secure admin: https://netbeans.org/bugzilla/show_bug.cgi?id=232739

./asadmin change-admin-password --domain_name domain1
./asadmin enable-secure-admin
./asadmin stop-domain domain1
./asadmin start-domain domain1

It also saves time to have GlassFish start automatically when the server boots. There is a nice command available to generate the required scripts:

 sudo $GLASSFISH_HOME/bin/asadmin create-service

NetBeans 8 + Glassfish 4 and the test project
Next I needed to download and install a local GlassFish server so NetBeans could find the libraries (I did not want NetBeans poking around directly in the server folders as a different user). My server configuration looked as follows:


In order to deploy to GlassFish, I needed some configuration for the maven-glassfish-plugin (https://maven-glassfish-plugin.java.net/). To execute the plugin as part of a Maven build phase, the phase tag needs to be specified.

The pom.xml for the EAR project contained the following:

       <plugin>  
         <groupId>org.glassfish.maven.plugin</groupId>  
         <artifactId>maven-glassfish-plugin</artifactId>  
         <version>${glassfish.plugin.version}</version>  
         <configuration>  
           <terse>true</terse>  
           <echo>true</echo>  
           <debug>true</debug>  
           <glassfishDirectory>${glassfish.glassfishDirectory}</glassfishDirectory>  
           <user>${glassfish.user}</user>  
           <adminPassword>${glassfish.adminPassword}</adminPassword>  
           <domain>  
             <name>${glassfish.domain.name}</name>  
             <host>${glassfish.domain.host}</host>  
             <adminPort>${glassfish.domain.adminPort}</adminPort>  
             <reuse>true</reuse>  
           </domain>  
           <components>  
             <component>  
               <name>${project.artifactId}</name>   
               <artifact>${project.build.directory}/HelloWorld-ear-${project.version}.ear</artifact>  
             </component>  
           </components>  
         </configuration>  
         <executions>  
           <execution>  
             <phase>install</phase>
             <goals>  
               <goal>redeploy</goal>  
             </goals>  
           </execution>  
         </executions>  
       </plugin>  

The properties such as glassfish.domain.name were defined in the parent pom so they could be shared by the different projects. The redeploy goal only works if the application is already deployed; the deploy goal only works if it is not deployed yet. It is not possible with the maven-glassfish-plugin to specify force (which would make redeploy work even if the application has not previously been deployed) (https://java.net/projects/maven-glassfish-plugin/lists/users/archive/2010-08/message/6). Two possible solutions are described at http://stackoverflow.com/questions/14686973/maven-glassfish-plugin-how-to-specify-deploy-target: using the exec-maven-plugin, or building your own maven-glassfish-plugin. There is of course also the option of using the Ant task (http://docs.oracle.com/cd/E19798-01/821-1752/beaep/), which does support the force option. I used a workaround: I deployed the application manually the first time (from NetBeans) and did redeploys from Jenkins after that.
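A sketch of the exec-maven-plugin approach could look roughly like the following (untested; the property names reuse those from the pom above, and asadmin deploy does support --force):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.2.1</version>
  <executions>
    <execution>
      <phase>install</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <!-- Call asadmin directly so --force can be passed -->
        <executable>${glassfish.glassfishDirectory}/bin/asadmin</executable>
        <arguments>
          <argument>deploy</argument>
          <argument>--force=true</argument>
          <argument>--user=${glassfish.user}</argument>
          <argument>${project.build.directory}/HelloWorld-ear-${project.version}.ear</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With --force=true, the deploy succeeds whether or not the application was previously deployed, which avoids the deploy/redeploy split of the maven-glassfish-plugin.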

A first experience with Git

As part of the abstract definition of a deployment pipeline (see introduction), version control is mentioned as a start point. There are several pieces of software available which help with version controlling your deliverables. In my experience working mostly with Java and other Oracle products, Subversion (http://subversion.apache.org/) is the most commonly used centralized version control system. Git (http://git-scm.com/) is gaining popularity as a distributed version control system. Since I didn't have much experience with Git, I decided to give it a try and describe my first experience with it here. I installed Git using the standard Ubuntu package.

Git is different from, for example, SVN (Subversion) in that every developer has his own local repository in which he does most of his changes. He can push/merge his changes to an upstream repository and pull changes from that repository into his own.
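The push/pull cycle described above can be sketched as follows (temporary directories stand in for a real developer machine and upstream server; all paths are made up):

```shell
set -e
# Create a bare repository to act as the shared upstream
upstream=$(mktemp -d)/upstream.git
git init -q --bare "$upstream"

# Developer clones the upstream repository and commits locally
work=$(mktemp -d)/work
git clone -q "$upstream" "$work" 2>/dev/null
cd "$work"
git config user.email dev@example.com
git config user.name "Dev"
echo "hello" > readme.txt
git add readme.txt
git commit -q -m "first commit"

# Local changes are pushed to the upstream repository...
git push -q origin HEAD

# ...from which another clone (a colleague, or Jenkins) can pull them
other=$(mktemp -d)/other
git clone -q "$upstream" "$other" 2>/dev/null
cat "$other/readme.txt"   # prints "hello"
```
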

First I went to my project root (/home/maarten/NetBeansProjects in my case) and did the following:

 git init  
 git add *  
 git commit  
 git status  

# On branch master
nothing to commit, working directory clean

So far so good.

I also added a .gitignore file to ignore the target directories:

 HelloWorld/HelloWorld-web/target
 HelloWorld/HelloWorld-ear/target

I wanted a Git server installed for Jenkins to use. I did the following:

 sudo mkdir /opt/git  
 sudo chown git.git /opt/git
 sudo su git
 cd /opt/git  
 git clone /home/maarten/NetBeansProjects  

The NetBeansProjects directory which was created seemed to contain the same content as the source, so I thought it was OK. I configured SSH keys (using https://help.github.com/articles/generating-ssh-keys and http://www.gilluminate.com/2013/04/04/ubuntu-ssh-agent-and-you/) so the jenkins user could access the Git directory as the git user.
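The key setup boils down to two steps, sketched below (a temp dir stands in for the jenkins and git users' home directories; real files would live under ~/.ssh):

```shell
set -e
tmp=$(mktemp -d)
# 1. Generate a key pair for the jenkins user (empty passphrase here)
ssh-keygen -q -t rsa -N "" -f "$tmp/jenkins_key"
# 2. Authorize the key for the git user by appending the public half
#    to what would be the git user's ~/.ssh/authorized_keys
cat "$tmp/jenkins_key.pub" >> "$tmp/authorized_keys"
```
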

I found that the NetBeansProjects folder in the Jenkins workspace was empty. This was curious, since files were present in the /opt/git repository that was cloned into the Jenkins workspace. Apparently I should have done a git clone --bare instead of a plain git clone to make this work.
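The difference can be demonstrated with a small sketch (illustrative paths; the post used /home/maarten/NetBeansProjects as source and /opt/git as target):

```shell
set -e
# Build a small source repository with one committed file
src=$(mktemp -d)/NetBeansProjects
git init -q "$src"
cd "$src"
git config user.email dev@example.com
git config user.name "Dev"
echo "demo" > pom.xml
git add pom.xml
git commit -q -m "initial import"

# A bare clone contains only the Git metadata (HEAD, refs, objects) and
# no working tree -- which is what a shared server-side repository needs
bare=$(mktemp -d)/NetBeansProjects.git
git clone -q --bare "$src" "$bare"
```

A plain clone keeps a checked-out working tree of its own, which is not meant to be pushed to; the bare clone is the form Jenkins (or any other client) should clone from.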

Jenkins and the plugins

Jenkins used the same port as GlassFish: 8080. I moved Jenkins to port 8888 by changing the port in /etc/default/jenkins. After starting Jenkins and accessing the URL, the message below appeared.
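The change amounts to a single line in /etc/default/jenkins; HTTP_PORT is the variable the Debian/Ubuntu Jenkins package reads:

```shell
# In /etc/default/jenkins -- the port Jenkins listens on
HTTP_PORT=8888
```
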


I changed the JDK from Java 8 to an OpenJDK 7 version in the same /etc/default/jenkins file, and then it worked.

In Jenkins I created a credential in which I defined the private key the jenkins user should use to authenticate himself to the git user. For the git user, I had added the public key of the jenkins user to his authorized_keys file.


I also set up simple global security, taking care not to lock myself out.


Next I installed the Jenkins Git plugin (GIT plugin). I configured a default job to use the common Git repository and do a build.


I installed the Build Pipeline plugin (https://wiki.jenkins-ci.org/display/JENKINS/Build+Pipeline+Plugin) and the Delivery Pipeline plugin (https://wiki.jenkins-ci.org/display/JENKINS/Delivery+Pipeline+Plugin). Both plugins depend on the same mechanism: upstream/downstream relationships between projects. What are these relationships?


The relationships are defined in the jobs themselves by triggers and post-build steps. The Build Pipeline plugin provides a view over the jobs in which the relationships are visible.


In my example I created two jobs. One for building and one for deploying. The Build Pipeline plugin displays their relationship and allows execution of the pipelines.


The plugin allows certain steps to be triggered manually instead of automatically:


Judging from the screenshots and documentation, the Jenkins Delivery Pipeline plugin is used on top of the Build Pipeline plugin. It allows jobs to be grouped into stages and assigned a task name. These can be assigned in the job configuration.


This is presented when opening the view, so progress per stage can be monitored in a business-user-friendly way.

Conclusion

The above was a quick tryout of the two plugins. I spent little time on creating an extensive example. I have not created an entirely automated release process or used multiple environments to test the setup in practice. I might also have missed certain functionality.

The plugins should be used in conjunction with each other, since manual tasks in the Delivery Pipeline plugin are created by using the Build Pipeline plugin. I was surprised to find that the pipelines themselves are mere views over configuration at job level. I had hoped to find a more process-driven way of defining the relations between jobs instead of depending on the upstream/downstream relations between projects. Because the flow is fixed in the Jenkins job definitions, the plugins cannot do much more to control the flow than trigger fixed next steps.

If you have set up your Jenkins jobs the way the plugins want, the views allow you to monitor progress and provide a more business-user-friendly view on releases. To accomplish this, my first impression (which might be premature!) is that you need separate jobs per environment, each holding the environment-specific properties (e.g. host/user/password), instead of a single job that runs for every environment. If you already have a Jenkins environment in use and want to implement the plugins, you might have to do some rework.

The Pipeline plugins are most suitable for completely automated processes. A manual trigger can be implemented to pause at a certain step, but it appears that every step to be executed needs to be a Jenkins job. This might not always be what you want, for example when modeling a user acceptance test. The plugins are relatively new, though (March 2013 for the Build Pipeline plugin and October 2013 for the Delivery Pipeline plugin), and the functionality will most likely be expanded. The plugins also don't have many users yet. It might be wise to wait for wider adoption or look for alternatives.