Question
I have a pipeline, p1, configured with three parameters: param1 as a ChoiceParam, and param2 and param3 as runParameters that retrieve different build IDs to pass along as versions.
I also have another pipeline, p2, from which I'd like to run p1. p2 has the same parameters configured, so inside the DSL I need to call p1 with param1, param2 and param3, since the parameters should be inherited from p2.
But I haven't been able to, even though I've tried every way I can think of. Can anyone help me?
p2:
build() {
    job('p1')
    parameters([[$class: 'StringParameterValue', name: 'param1', value: ${param1}],
                [$class: 'StringParameterValue', name: 'param2', value: ${param2}],
                [$class: 'StringParameterValue', name: 'param3', value: ${param3}]
    ])
}
Then I also tried:
p2:
build(job: 'p1', parameters: ([[$class: 'StringParameterValue', name: 'param1', value: ${param1}],
                               [$class: 'StringParameterValue', name: 'param2', value: ${param2}],
                               [$class: 'StringParameterValue', name: 'param3', value: ${param3}]
]))
with no success. Any help, please?
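(For context, a minimal sketch of the kind of build-step call usually expected here, assuming p2 is itself a Pipeline job, the Pipeline: Build Step plugin is available, and p1 declares the same three parameters; the values are referenced as params.paramN rather than a bare ${paramN}, which is not valid Groovy outside a string:)

// Sketch only: triggering p1 from a scripted p2 pipeline, forwarding p2's own parameters.
// Assumes p1 declares param1, param2 and param3 and that the Pipeline: Build Step plugin is installed.
build job: 'p1', parameters: [
    [$class: 'StringParameterValue', name: 'param1', value: params.param1],
    [$class: 'StringParameterValue', name: 'param2', value: params.param2],
    [$class: 'StringParameterValue', name: 'param3', value: params.param3]
]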
Answer 1:
Do you really need to define p1 as a full, separate job? I don't think it's good practice to chain pipeline jobs the way we used to chain freestyle jobs. Instead you should probably load another pipeline file containing your p1 tasks, and then just call p1's defined function from p2. Here is a good example from the doc.
Basically what you need to do is define your p1 as a pipeline file p1.groovy:
def p1Actions(param1, param2, param3) {
    // Do whatever p1 does with your 3 params
}
return this;
And then just call it from p2:
pipeline = load 'p1.groovy'
pipeline.p1Actions(param1, param2, param3)
And if you want p1 to be reusable from jobs other than p2, just push it into its own SCM repo and add an SCM checkout to the previous example, just before loading the script. By the way, this step is also very well covered in the documentation.
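To illustrate that last suggestion, here is a rough sketch of the checkout-then-load pattern in a scripted pipeline; the repository URL and the 'shared' directory name are hypothetical, and it assumes p1.groovy sits at the root of that repo:

node {
    // Hypothetical repository holding the shared p1.groovy script
    dir('shared') {
        git url: 'https://example.com/your-org/pipeline-scripts.git'
    }
    // Load the script and call its function, forwarding this job's parameters
    def p1 = load 'shared/p1.groovy'
    p1.p1Actions(params.param1, params.param2, params.param3)
}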
Source: https://stackoverflow.com/questions/38618314/execute-jenkins-pipeline-from-inside-dsl