This post is a continuation of the tutorial Setup Continuous Integration/Delivery system in just 4 steps with Jenkins Pipelines and Blue Ocean.
Now that you know how to set up your pipelines, you might want to go one step further and keep a single pipeline configuration for all your projects. That is possible using Shared Libraries. The idea is that your standard pipeline configuration lives in a shared repository that is accessed by each of your projects. Each project then only sets project-specific properties in its own Jenkinsfile.
Let's see how to do it:
Shared Library Repository
1- Create a new git repository where your shared library will reside.
2- Create a vars directory and a .groovy file inside. Call this file standardPipeline.groovy, for example.
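After these two steps, the repository layout is simply:

```
your-shared-library/
└── vars/
    └── standardPipeline.groovy
```

Jenkins exposes each file under vars as a global step whose name matches the file name.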
3- Add the following code into this file:
```groovy
def call(body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    node {
        // Clean workspace before doing anything
        deleteDir()
        try {
            stage ('Clone') {
                checkout scm
            }
            stage ('Build') {
                sh "echo 'building ${config.projectName} ...'"
            }
            stage ('Tests') {
                parallel 'static': {
                    sh "echo 'shell scripts to run static tests...'"
                },
                'unit': {
                    sh "echo 'shell scripts to run unit tests...'"
                },
                'integration': {
                    sh "echo 'shell scripts to run integration tests...'"
                }
            }
            stage ('Deploy') {
                sh "echo 'deploying to server ${config.serverDomain}...'"
            }
        } catch (err) {
            currentBuild.result = 'FAILED'
            throw err
        }
    }
}
```
4- Commit and push
Jenkins configuration
1- Go to Manage Jenkins > Configure System > Global Pipeline Libraries.
2- Give your library a name and set the Git URL.
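If you manage your Jenkins instance with the Configuration as Code (JCasC) plugin, the same setting can be sketched declaratively. Treat this as an assumption to verify against your plugin versions; the remote URL below is a placeholder:

```yaml
unclassified:
  globalLibraries:
    libraries:
      - name: "your-library-name"
        defaultVersion: "master"
        retriever:
          modernSCM:
            scm:
              git:
                remote: "https://your-git-server/your-shared-library.git"
```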
Use library in your Project
1- Go to your project repository root and edit the content of your Jenkinsfile with just the following:

```groovy
@Library("your-library-name") _

standardPipeline {
    projectName = "Project1"
    serverDomain = "Project1 Server Domain"
}
```
2- Note that the name of the library must be the same as you previously set in Jenkins.
That's it! You can now check in Blue Ocean how your projects use the shared pipeline configuration.
Review
Before we finish, let's look at the code we used in more detail:
```groovy
@Library("your-library-name") _
```
Here is where you define which library to import. You can use as many libraries as you want in the same Jenkinsfile. Do not forget the _ at the end; it is needed!
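To load several libraries at once, the annotation also accepts a list (the second library name here is hypothetical):

```groovy
@Library(['your-library-name', 'another-library']) _
```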
```groovy
standardPipeline {
    //...
}
```
The name of this block has to match the name of your .groovy file. In this example that is standardPipeline, because our file is called standardPipeline.groovy.
```groovy
standardPipeline {
    projectName = "Project1"
    serverDomain = "Project1 Server Domain"
}
```
projectName and serverDomain are an example of how to set properties that can be accessed in the shared library:
```groovy
def call(body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    //...
```
This code is responsible for loading the properties mentioned above. Thanks to that, properties can be accessed using config.propertyName.
You can use properties to identify project settings or to execute different actions. For example:
```groovy
if (config.propertyName == true) {
    // Do something here
}
```
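Since config is a plain Groovy map, a property the Jenkinsfile never set is simply null, so defaults can be sketched with the Elvis operator. runBenchmarks is a hypothetical property, not part of the example above:

```groovy
// Falls back to false when the Jenkinsfile did not set the property
def runBenchmarks = config.runBenchmarks ?: false
if (runBenchmarks) {
    sh "echo 'shell scripts to run benchmarks...'"
}
```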
That's all. Now you can play around and configure your shared pipelines. We use two pipeline configurations:
- projectBuild.groovy for our projects
- commonModuleBuild.groovy to build common modules that can be installed as project dependencies.
commonModuleBuild.groovy is a bit different, as it does not need a Deploy step, for example.
You can find the resources used for this tutorial here:
Top comments (24)
This is great, thank you. I've been using Jenkins for a while but never got very far into it past builds and tests, so now I'm going back over everything using Jenkinsfiles and deploy stages and this has been very helpful.
Thanks James, I am glad you found this useful. I would encourage you to try Jenkinsfiles: easy to set up and much more comfortable to maintain.
Hi Juan,
Thanks a lot for this post, really simple and straight forward.
I have a question related to this. Relying heavily on the shared library is an amazing way to keep a very simple Jenkinsfile in each of my services, which is cool, but it means any change in the shared library can break everything.
How do you test changes to the pipeline shared library before merging them into the master branch that every other service calls? In other words, how can changes be tested before applying them, to avoid breaking everything?
I tried to follow JenkinsPipelineUnit on GitHub but I really didn't get it. Can you explain it if you have used it before?
Thanks a lot
Hi Abdel,
What I usually do is add my changes in a new branch of the shared library. Then I take one project and modify its Jenkinsfile to use that new branch. After testing that everything works OK, I merge the library changes into master.
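The branch can be selected from the Jenkinsfile itself by appending it to the library name after an @ sign (the branch name below is a placeholder):

```groovy
@Library("your-library-name@my-feature-branch") _
```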
Would that work for you?
Hi Juan,
Thanks again for your reply, but this is actually what I do in the meantime, passing my own branch of the shared library repo. It feels very manual to us, especially with a really complex pipeline.
What I meant in my question is: can we do any sort of unit testing of the pipeline before merging to the shared library master branch?
Have you done any similar testing for the shared library repo before, with tools like Cucumber or Maven? Your approach is manual testing, and I was hoping you had done some automated testing of the pipeline with the shared library.
Again Thanks a lot Juan
Abdel
Hi Abdel,
No, I do not run any automated tests on the pipeline code itself. If you find a convenient way to do it, I'd love to know more. I think there is something about testing pipeline code in the following video; here is the link in case it helps:
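For readers who want to try the JenkinsPipelineUnit library mentioned above, a minimal test of the standardPipeline step could be sketched as follows. This is an untested assumption, not a verified setup: the stubbed steps and the exact registration calls should be checked against the library's documentation for your version.

```groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class StandardPipelineTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Stub every pipeline step the shared library touches
        helper.registerAllowedMethod('node', [Closure], { c -> c() })
        helper.registerAllowedMethod('stage', [String, Closure], { name, c -> c() })
        helper.registerAllowedMethod('parallel', [Map], { branches -> branches.each { k, v -> v() } })
        helper.registerAllowedMethod('deleteDir', [], null)
        helper.registerAllowedMethod('checkout', [Object], null)
        helper.registerAllowedMethod('sh', [String], null)
        // The library references the implicit 'scm' variable
        binding.setVariable('scm', [:])
    }

    @Test
    void pipelineRunsWithoutErrors() {
        def script = loadScript('vars/standardPipeline.groovy')
        script.call {
            projectName = 'Project1'
            serverDomain = 'example.com'
        }
        // Prints the recorded call tree for inspection
        printCallStack()
    }
}
```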
Hi Juan,
I have gone through this post and am very excited to implement it in some of my projects.
Everything works super... until one point.
Is it possible to make standardPipeline.groovy return some value (a collection object), capture it in the Jenkinsfile, and pass it as an argument to another groovy script in the shared library?
/Prasanth
Hi Prasanth,
Sorry for my late reply. To be honest, I have no idea about that; it was not needed in my case. You can pass parameters from your Jenkinsfile to the shared library, but I am not sure about the other way around. Maybe if you elaborate on your specific need, we can find another way to do it.
Hi,
Sorry for my late reply as well...
I was trying to make some modifications to a class object in a library function and explicitly return the same object to use in later stages of the pipeline, and I could do it by defining the appropriate return type (though def also works).
Now it works as expected.
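As a sketch of what returning a value from a vars step can look like (the step name getBuildInfo and its contents are hypothetical):

```groovy
// vars/getBuildInfo.groovy
def call() {
    // Return a plain map to the calling Jenkinsfile
    return [commit: 'abc123', version: '1.0.0']
}
```

And in the Jenkinsfile:

```groovy
@Library("your-library-name") _

def info = getBuildInfo()
echo "Building version ${info.version}"
```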
Hi!
Do you have an idea how to keep the shared library's expected directory structure if I want to put it in specific directories in my repo?
For example, putting my vars and src folders not in the repo root but somewhere deeper:
/bar/foo
Hi,
Sorry, I have no idea whether that is possible. I did not see anything about it in the Jenkins docs. Let us know if you figure out how to do it.
Hi Juan!
Had a question about what types of variables can be passed into the standardPipeline.
I've been able to pass strings and booleans but am having trouble passing in arrays. Do you know if this is possible?
Thanks,
Brian
Hi Brian,
I never tried passing arrays, so I do not know whether that is possible. Sorry that I cannot help with that.
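For what it's worth, since the config object is an ordinary Groovy map, list values should work the same way as strings and booleans. A minimal sketch, where testTargets is a hypothetical property:

```groovy
standardPipeline {
    projectName = "Project1"
    testTargets = ['static', 'unit', 'integration']
}
```

Inside the library, the list would then be available as config.testTargets and could be iterated with config.testTargets.each { target -> ... }.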
Hi Juan,
I was trying to implement the same concept in my shared library to avoid code repetition.
I am using Jenkins v2.105, and I see some problems when I shift my stage definition into a shared-library call, as below:
Before Update:
```groovy
ln#01 pipeline {
ln#02     stages {
ln#03         stage('1') {
ln#04             println "I am in stage 1"
ln#05         }
ln#06
ln#07         stage('2') {
ln#08             println "I am in stage 2"
ln#09         }
ln#10     }
ln#11 }
```
After Update:
```groovy
ln#01 pipeline {
ln#02     stages {
ln#03         stage('1') {
ln#04             println "I am in stage 1"
ln#05         }
ln#06
ln#07         dynamicStage {
ln#08             param = '2'
ln#09         }
ln#10     }
ln#11 }
```
vars/dynamicStage.groovy
```groovy
ln#01 def call(body) {
ln#02     def config = [:]
ln#03     body.resolveStrategy = Closure.DELEGATE_FIRST
ln#04     body.delegate = config
ln#05     body()
ln#06
ln#07     stage('Dynamic Stage') {
ln#08         println "I am in stage ${config.param}"
ln#09     }
ln#10
ln#11 }
```
Expected Behaviour:
I am in stage 1
I am in stage 2
Execution Result:
```
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 07: Expected a stage @ line 07, column [ColumnNumber].
dynamicStage {
```
Is this something that Jenkins v2.105 doesn't support?
Any luck with the shared library that you have implemented?
Hi Chris,
Where do you use "qools"? Inside the Jenkinsfile or in your library? Is it maybe the name of your library? If you could send me an example of the code where "qools" appears, I could help you more.
Did you also try the basic examples on this blog post?
Nice post!
There is a small error in the first example of pipeline.
Missing one " in:
Thanks ;)
Hi Juan, I am new to declarative pipelines and shared libraries. Can you briefly describe shared libraries?
Hi
I'm trying to run the sample from the Groovy sandbox and got an error:
java.lang.NoSuchMethodError: No such DSL method 'standardPipeline' found among steps
Any idea here?
Thanks
Hi David,
If you get this error, I would say that the library repo is working but Jenkins does not find the file standardPipeline.groovy. Can you re-check that your shared library file is called standardPipeline.groovy and that it is inside the vars folder, as in this example? I also updated the post because I noticed some formatting issues. You might also want to review the "Jenkins Configuration" section to ensure that it is correct.