We have several Java projects, and each project has its own delivery pipeline. All pipelines have a number of steps in common.
You can use shared libraries, which let you generate jobs, reuse code across different jobs, and keep your Jenkinsfiles really clean.
A shared library is made up of a src folder, which contains all the methods you are going to call from different jobs, and a vars folder, in which you implement the logic of your jobs. The files in the vars folder are Groovy files, and you can invoke the logic behind each file by its file name in the Jenkinsfile, passing the appropriate variables to your jobs (IP address, and so on, as you mentioned).
For example:
In vars you might have a deploy.groovy file that calls a method from the src folder to do the actual deployment. You pass it the parameters that are specific to a particular job, which you define in your Jenkinsfile when you call deploy.groovy like this:
node {
    deploy([ip_address: '...',
            env: ''])
}
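As a rough sketch of the other side of that call (the Deployer class and its package are assumptions, not part of your library), the vars/deploy.groovy file could simply delegate to a class in src:

// vars/deploy.groovy -- hypothetical sketch
import com.example.jenkins.Deployer  // assumed class under src/com/example/jenkins/

def call(Map jobParams) {
    // 'this' gives the class access to pipeline steps such as sh and echo;
    // the map carries the job-specific parameters from the Jenkinsfile
    new Deployer(this).deploy(jobParams.ip_address, jobParams.env)
}

Defining call(Map) is what lets the Jenkinsfile invoke the file by its name with named arguments, as in the snippet above.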
Don't forget to import your shared library and to configure the repository that contains it in Jenkins. For more details, see the documentation for shared libraries in Jenkins.
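For example, assuming the library is configured in Jenkins under the name my-shared-lib (the name here is just a placeholder; use whatever name you configured), the import at the top of the Jenkinsfile looks like this:

// Jenkinsfile
@Library('my-shared-lib') _

node {
    deploy([ip_address: '...',
            env: ''])
}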
An approach that works well for us is to put the parts of the pipeline that all projects have in common, or even the whole pipeline, into a Jenkins shared library.
Example
The following script (template.groovy) is defined as a global variable in a Jenkins shared library. The method creates a new declarative pipeline (it also works for scripted pipeline syntax). All project-specific properties are provided via the templateParams map.
/**
 * Defines a pipeline template (as a sample with one job parameter
 * that should be common for all pipelines)
 */
def createMyStandardDeclarativePipeline(Map templateParams) {
    pipeline {
        agent any
        parameters {
            string(name: 'myInput', description: 'Some pipeline parameters')
        }
        stages {
            stage('Stage one') {
                steps {
                    script {
                        echo "Parameter from template creation: " + templateParams.someParam
                    }
                }
            }
            stage('Stage two') {
                steps {
                    script {
                        echo "Job input parameter: " + params.myInput
                    }
                }
            }
        }
    }
}
Using this global variable, the following line creates a pipeline from our template:
template.createMyStandardDeclarativePipeline(someParam: 'myParam')
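Put together, a project's Jenkinsfile can then be as small as this (the library name my-pipeline-templates is an assumption; use the name you configured in Jenkins):

// Jenkinsfile
@Library('my-pipeline-templates') _

template.createMyStandardDeclarativePipeline(someParam: 'myParam')

Since template.groovy lives in the vars folder, it is exposed as the global variable template once the library is loaded.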
Conclusion
This concept makes it easy to define pipeline templates and reuse them in several projects.
Applied to the example given in the question, you can create a delivery pipeline for a project with a simple one-liner:
template.createStandardDeliveryPipeline(serviceName: 'myService',
                                        testEnv: '192.168.99.104',
                                        productionEnv: '192.168.99.105')
Update (30-09-2017): Declaring a pipeline block in a shared library is now officially supported with Declarative Pipeline version 1.2. See: https://jenkins.io/doc/book/pipeline/shared-libraries/#defining-declarative-pipelines
Update (06-10-2017): An extended example can now be found here: https://jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/