Pass Artifact or String to upstream job in Jenkins Pipeline

Submitted by ☆樱花仙子☆ on 2020-12-13 20:57:07

Question


Goal

I'm trying to orchestrate a dependency chain using the GitHub organization plugin along with the jenkins pipeline.

As the products I'm building have a number of shared dependencies, I'm using nuget packages to manage dependency versioning and updates.

However, I'm having trouble getting necessary artifacts/info to the projects doing the orchestration.

Strategy

On a SCM change any upstream shared libraries should build a nuget package and orchestrate any downstream builds that need new references:

  1. I am hardcoding the downstream orchestration in each upstream project. So if A is built, B and C (which depend on A) will be built with the latest artifact from A. After that, D (which depends on B and C) and E (which depends on A and C) will be built with the latest artifacts from A, B, and C as needed, and so on. These are all triggered from A's Jenkinsfile in stages, as dependencies are built, using the "build job: 'JobName'" syntax (see the sketch after this list). I couldn't find a solution that would let me simply pass the orchestration downstream at each step, because the dependencies diverge and converge downstream, and I don't want to trigger multiple builds of the same downstream project with different references to upstream projects.
  2. I can pass the artifact information for the parent project down to any downstream jobs, but the problem I'm facing is that the parent project doesn't have any assembly versioning information for the downstream artifacts (which is needed to orchestrate jobs further downstream). Stash/unstash doesn't seem to work across jobs, and archive/unarchive has been deprecated.
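
For illustration, here is a minimal sketch of what the hardcoded orchestration in point 1 could look like in a scripted pipeline. The job names ('B', 'C', 'D', 'E'), the parameter name 'A_VERSION', and the version file are assumptions, not part of the original question:

node {
    def versionA

    stage('Build A') {
        // ... build A and publish its nuget package ...
        // Assumption: the build writes its own version to version.txt
        versionA = readFile('version.txt').trim()
    }

    stage('Build B and C') {
        // B and C depend only on A, so A's version is enough to trigger them
        build job: 'B', parameters: [string(name: 'A_VERSION', value: versionA)]
        build job: 'C', parameters: [string(name: 'A_VERSION', value: versionA)]
    }

    stage('Build D and E') {
        // D depends on B and C, E depends on A and C; at this point the
        // orchestrating job would also need B's and C's versions, which is
        // exactly the gap described in point 2 below.
        build job: 'D', parameters: [string(name: 'A_VERSION', value: versionA)]
        build job: 'E', parameters: [string(name: 'A_VERSION', value: versionA)]
    }
}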

TLDR: I need a way either to pass a string or text file upstream to a job mid-execution (from multiple downstream jobs), OR a way for multiple downstream jobs with shared downstream dependencies to coordinate and jointly pass information to a further downstream job (triggering it only once).

Thanks!


Answer 1:


This article may be useful for you: https://www.cloudbees.com/blog/using-workflow-deliver-multi-componentapp-pipeline

Sometimes the artifact approach is needed. Upstream job:

void runStaging(String VERSION) {
    // Trigger the downstream job and keep a handle to its build result
    stagingJob = build job: 'staging-start', parameters: [
        string(name: 'VERSION', value: VERSION),
    ]
    // Copy the 'IP' artifact archived by that specific downstream build
    step([$class: 'CopyArtifact',
        projectName: 'staging-start',
        filter: 'IP',
        selector: [$class: 'SpecificBuildSelector',
            buildNumber: stagingJob.id
        ]
    ])
    // Read the value the downstream job wrote into the artifact
    IP = sh(returnStdout: true, script: "cat IP").trim()
    ...
}

Downstream job:

sh 'echo 10.10.0.101 > IP'   // write the value into a file named IP
archiveArtifacts 'IP'        // archive it so the upstream job can copy it back



Answer 2:


I ended up using the built-in "archive" step (see the Pipeline Syntax reference) in combination with the Copy Artifact plugin (which must be invoked as a Java-style step with a class name).

I would prefer to be able to merge the workflows rather than having to orchestrate the downstream builds in every build that has anything to build downstream, but I haven't found any solution to that end so far.
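
A minimal sketch of how that combination could look; the job name 'B', the file name 'version.txt', and the hard-coded version value are assumptions, not part of the original answer:

// In the downstream job (e.g. B): record the assembly version and archive it
sh 'echo 1.4.2 > version.txt'
archive 'version.txt'          // the built-in "archive" step mentioned above

// In the orchestrating job: trigger B, then copy the artifact back using the
// Copy Artifact plugin via the Java-style step with a class name
def bBuild = build job: 'B'
step([$class: 'CopyArtifact',
      projectName: 'B',
      filter: 'version.txt',
      selector: [$class: 'SpecificBuildSelector', buildNumber: bBuild.id]])
def versionB = readFile('version.txt').trim()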




Answer 3:


You could use the buildVariables of the build result.

Main job - configuration: pipeline job

node {
    // Trigger the downstream job; the returned object exposes buildVariables
    x = build job: 'test1', quietPeriod: 2
    echo "$x.buildVariables.value1fromx"
}

test1 - configuration: pipeline job

node {
    // Environment variables set here are returned to the caller via buildVariables
    env.value1fromx = "bull"
    env.value2fromx = "bear"
}


Source: https://stackoverflow.com/questions/39730193/pass-artifact-or-string-to-upstream-job-in-jenkins-pipeline
