How to share a Declarative Pipeline across many projects

别跟我提以往 2021-02-10 09:57

I have a lot of projects in different repositories that share the same fundamental CI workflow, which I can easily express as a Declarative Pipeline:

pipeline {
    agent any
    stages {
        // ... the build/test/deploy stages shared by every project ...
    }
}

How can I maintain this pipeline definition in one central place and reuse it across all of these repositories?
3 Answers
  • 2021-02-10 10:11

    I have been dealing with this same issue in my own work. The best solution I could come up with was to include a generic Jenkinsfile in every project/repo in my organization:

    node
    {
      // check out the repository that holds the shared pipeline definition
      checkout([$class: 'GitSCM', branches: [[name: env.DELIVERY_PIPELINE_BRANCH]], userRemoteConfigs: [[credentialsId: env.DELIVERY_PIPELINE_CREDENTIALS, url: env.DELIVERY_PIPELINE_URL]]])
      // stash the Groovy helper scripts so the loaded pipeline can unstash them later
      stash includes: '*.groovy', name: 'assets', useDefaultExcludes: false
      // hand control over to the shared pipeline definition
      load './Jenkinsfile.groovy'
    }
    

    I used environment variables in case things need to change; it could probably be made even more dynamic than my current example (this is all still in development anyway).

    The stash step holds the rest of the Groovy scripts that are used later; they are unstashed inside the declarative pipeline.

    Finally, it loads the Declarative Pipeline. It doesn't mess with the views; basically everything behaves as normal.
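
    For reference, here is a rough sketch of what the loaded Jenkinsfile.groovy could contain in this setup (the stage names and the build.groovy helper are illustrative assumptions, not part of the original answer):

    pipeline {
      agent any
      stages {
        stage('Prepare') {
          steps {
            // restore the Groovy helper scripts stashed by the loader Jenkinsfile
            unstash 'assets'
          }
        }
        stage('Build') {
          steps {
            script {
              // build.groovy is a hypothetical helper; it must end with 'return this' so load returns an object
              def helper = load 'build.groovy'
              helper.run()
            }
          }
        }
      }
    }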

    So it's not exactly what you were looking for, and I'd rather have the ability to just pull from SCM in the first place. But hey, it's been working well enough for me for the time being.

  • 2021-02-10 10:12

    I am able to use a Shared Library to define a Declarative pipeline that is configurable via a YAML file.

    In my repo/project I define a Jenkinsfile to call the Shared Library:

    @Library('my-shared-library')_
    
    pipelineDefault(); // cannot be named 'pipeline'
    

    and a Jenkinsfile.yaml to configure the build parameters:

    project_name: my_project
    debug: true
    # you get the idea
    

    Then, in my vars/pipelineDefault.groovy file, a very simple Shared Library step could look like this:

    def call() {
      node {
        // make sure the project repository (and its Jenkinsfile.yaml) is in the workspace
        checkout scm
        // readYaml needs a workspace context, so read the config inside the node block
        Map pipelineConfig = readYaml(file: "${env.WORKSPACE}/Jenkinsfile.yaml")
        stage('Build') {
          println "Building: ${pipelineConfig.project_name}"
        }
      }
    }
    

    Of course this is a very simplified example, but the dynamic configuration DOES work.
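
    If you want the shared step itself to stay Declarative rather than scripted, something along these lines should also work. This is only a sketch, assuming your setup allows whole pipeline blocks in vars/*.groovy call methods (Declarative Pipeline 1.2+); the stage content is illustrative:

    def call() {
      pipeline {
        agent any
        stages {
          stage('Build') {
            steps {
              script {
                // with a Pipeline-from-SCM job, the declarative agent's implicit checkout
                // should place Jenkinsfile.yaml in the workspace
                Map pipelineConfig = readYaml(file: 'Jenkinsfile.yaml')
                echo "Building: ${pipelineConfig.project_name}"
              }
            }
          }
        }
      }
    }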

    NOTE: this requires the Pipeline Utility Steps (pipeline-utility-steps) plugin for the readYaml step.

  • 2021-02-10 10:30

    While the views are left intact when using the suggestion from noober01, the declarative pipeline will not function properly: when clauses, for example, will be ignored, because the pipeline element is expected to be top-level, so it is parsed as a scripted pipeline instead.
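
    For example (an illustrative sketch, not taken from any of the answers), a loaded Jenkinsfile.groovy containing the following would execute the Deploy stage on every branch, because the when directive is only honoured when the pipeline block is parsed as a top-level Declarative Pipeline:

    pipeline {
      agent any
      stages {
        stage('Deploy') {
          // honoured only when this pipeline block is parsed as top-level Declarative;
          // with the load-based approach described above, the stage runs regardless of the branch
          when { branch 'main' }
          steps {
            sh './deploy.sh'
          }
        }
      }
    }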

    See the following issue, which was rejected by the team behind Jenkins: loading external declarative pipelines issue
