How to run continuous integration in parallel across multiple Pull Requests?

Submitted by 可紊 on 2019-12-23 12:23:18

Question


I am testing the use of Jenkins with the GitHub Pull Request Builder plugin. I have successfully set up a toy project on GitHub and a dev installation of Jenkins so that raising a PR, or pushing changes to a PR branch, triggers a build. Mostly this works as required. A few things don't match our preferred workflow, but the freedom from having to write and maintain our own plugin is a big deal.

I have one potential showstopper. The plugin queues up all pushes in all PRs it sees, and only ever seems to run a single job at a time, even with spare executors available. In the real-world project we may have 10 active PRs, each of which may get a few pushed updates in a day in response to QC comments, and the full CI run takes more than 30 minutes. However, we do have enough build executors provisioned to run multiple jobs at the same time.

I cannot see any way to configure the PR builder to process multiple jobs at once from the same trigger, but I may be missing something basic elsewhere in Jenkins. Is there a way to do this without needing to customise the plugin?

I have installed Jenkins ver. 1.649 on a fresh Ubuntu 14.04 server (a VirtualBox guest) and followed the README of the ghprb plugin (currently version 1.30.5), including setting up a Jenkins "bot" account on GitHub as a collaborator to run all the integration API calls to GitHub.

I was wondering what the behaviour would be if I cloned the job (create a new item and "Copy existing item"), and may try that next, but I expect that would just run the same job multiple times for no benefit, rather than interacting smartly with other jobs polling the same pool of PRs.


Answer 1:


I found the config setting while exploring further for this question.

It is really easy when you know which config item it is, but Jenkins has a lot of configuration to work through, especially when you are exploring plugins.

The key point is that the option to serve queued jobs in parallel (available executors allowing) is core Jenkins config, not part of the GitHub PR builder.

So, just check the option Execute concurrent builds if necessary. It is found at the bottom of the first, untitled section of the job's config. It is a really basic Jenkins option that a newbie like me missed amid the mountain of other options.
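If you manage jobs with the Job DSL plugin rather than through the UI, the same checkbox corresponds (as far as I can tell) to the `concurrentBuild()` method. A minimal sketch, where `my-pr-job` is a placeholder job name:

```groovy
// Minimal Job DSL sketch: concurrentBuild() maps to the
// "Execute concurrent builds if necessary" checkbox in the job config.
job('my-pr-job') {
  concurrentBuild()
  // ... SCM, the GitHub PR builder trigger, and build steps go here
}
```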




Answer 2:


It may be too late to answer this question, but after a few days of researching I figured out a way to create multiple jobs per PR in GitHub. The code shown here applies to GitHub Enterprise, but it works for general GitHub (and Bitbucket) as well with a few tweaks to the URL and git commands.

The mainline repository against which the PRs are created needs to contain a file; I call it PRJob.groovy, and it contains:

import groovy.json.JsonSlurper

// Job DSL script: creates one Jenkins pipeline job per open pull request.
gitUrl = GIT_URL
repoRestUrl = "${GITHUB_WEB_URL}/repos/${project}/${repo}"

// Fetch and parse JSON from the GitHub REST API, authenticating with the OAuth token.
def getJSON(url) {
  def conn = (HttpURLConnection) new URL(url).openConnection()
  conn.setRequestProperty("Authorization", "token ${OAUTH_TOKEN}")
  return new JsonSlurper().parse(new InputStreamReader(conn.getInputStream()))
}


// Create a pipeline job for the given PR branch (prId is "" for the mainline build).
def createPipeline(name, description, branch, prId) {
  return pipelineJob(name) {
      delegate.description description
      if (ENABLE_TRIGGERS == 'true') {
        triggers {
          cron 'H H/8 * * *'
          scm 'H/5 * * * *'
        }
      }
      quietPeriod(60)
      environmentVariables {
        env 'BRANCH_NAME', branch
        env 'PULL_REQUEST', prId
        env 'GITHUB_WEB_URL', GITHUB_WEB_URL
        env 'OAUTH_TOKEN', OAUTH_TOKEN
        env 'PROJECT', project
        env 'REPO', repo
      }
      definition {
        cpsScm {
          scriptPath "Jenkinsfile"
          scm {
            git {
              remote {
                credentials "jenkins-ssh-key"
                delegate.url gitUrl
                if (prId != "") {
                  refspec "+refs/pull/${prId}/*:refs/remotes/origin/pr/${prId}/*"
                }
              }
              delegate.branch branch
            }
          }
        }
      }
    }
}




// Create one job per open PR; fail loudly if the API returns nothing at all.
def createPRJobs() {
  def prs = getJSON("${repoRestUrl}/pulls?state=open")

  if (prs.size() == 0) {
    // If there are no closed PRs either, the empty result is probably an auth failure.
    def mergedPrs = getJSON("${repoRestUrl}/pulls?state=closed")
    if (mergedPrs.size() == 0) {
      throw new RuntimeException("No pull-requests found; auth token has likely expired")
    }
  }

  prs.each { pr ->
    def id = pr.get("number")
    def title = pr.get("title")
    def fromRef = pr.get("head")
    def fromBranchName = fromRef.get("ref")
    def prRepo = fromRef.get("repo")
    def repoName = prRepo.get("name")
    def prHref = pr.get("url")

    createPipeline("${repo}-PR-${id}-${fromBranchName}",
        "${prHref} Pull Request ${id}: ${title}", "origin/pr/${id}/head", id)
  }

}

createPRJobs()
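For completeness: PRJob.groovy is a Job DSL script, so something has to execute it. A minimal sketch of a pipeline-style seed job, assuming the Job DSL plugin is installed and the seed job's SCM points at the mainline repo (the `jobDsl` step and its parameters come from the Job DSL plugin):

```groovy
// Seed pipeline: check out the mainline repo and run the Job DSL script,
// which creates or updates one pipeline job per open PR.
node {
  checkout scm                      // mainline repo containing PRJob.groovy
  jobDsl targets: 'PRJob.groovy',
         removedJobAction: 'DELETE' // clean up jobs for PRs that were closed
}
```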

This creates one Jenkins job per PR. It relies on the project having a Jenkinsfile which can be picked up to run a pipeline job. A sample Jenkinsfile looks like this:

// Jenkinsfile for building the PR and reporting the result back to GitHub
commitId = null
repoRestUrl = "${GITHUB_WEB_URL}/repos/${PROJECT}/${REPO}"

try {
  stage('Install and Tests') {
    runTest("Hello")
  }
  notify_github 'success'
} catch (Exception e) {
  notify_github 'failure'
  print e
  throw e
}

def runTest(String someDummyVariable) {
  node {
    checkout scm
    sh 'git clean -qdf'
    if (env.PULL_REQUEST == "") {
      sh 'git rev-parse --verify HEAD > commit.txt'
    } else {
      // We check out the PR after it is merged with master, but we need to
      // report the result against the commit before the merge.
      sh "git rev-parse refs/remotes/origin/pr/${env.PULL_REQUEST}/head^{commit} > commit.txt"
    }
    // trim() strips the trailing newline, which would otherwise corrupt the statuses URL
    commitId = readFile('commit.txt').trim()
    echo commitId
    sh 'rm -f commit.txt'

    // Here goes your code for doing anything
    sh 'echo "Hello World!!!!!"'
  }
}


// POST a raw JSON payload to the GitHub REST API.
def http_post(url, rawJson) {
  def conn = (HttpURLConnection) new URL(url).openConnection()
  conn.setRequestProperty("Authorization", "token ${OAUTH_TOKEN}")
  conn.doOutput = true
  conn.requestMethod = "POST"
  conn.setRequestProperty("Content-Type", "application/json")
  def wr = new OutputStreamWriter(conn.getOutputStream())
  wr.write(rawJson)
  wr.close()

  def code = conn.getResponseCode()
  if (code < 200 || code >= 300) {
    println "Failed to post to ${url}"
    def es = conn.getErrorStream()
    if (es != null) {
      println es.getText()
    }
  }
}


// Report a commit status for the recorded commit back to GitHub.
def notify_github(state) {
  http_post(
    "${repoRestUrl}/statuses/${commitId}",
    """
      { "state": "${state}",
       "target_url": "${env.BUILD_URL}",
       "description": "Build Pipeline",
       "context": "Build Pipeline"
      }
    """
  )
}

Hope this helps someone.



Source: https://stackoverflow.com/questions/35650629/how-to-run-continuous-integration-in-parallel-across-multiple-pull-requests
