We own a GitHub organization with hundreds of repositories authored by contributors. We would like to set up a Jenkins server that performs certain standard tasks for each of them.
Answering my own question:
As several people have pointed out, Jenkins assumes one job per repository. The GitHub Organization plugin didn't work well because it is clumsy and requires you to commit and maintain a Jenkinsfile
in each of your repos, which is exactly what I wanted to avoid.
The critical piece of information I was unaware of is that Jenkins has an excellent CLI and REST API for controlling jobs, and a single job configuration can easily be exported as a simple XML file.
So what I did was set up the Jenkins job for one of the repos via the Jenkins GUI. Then I wrote a simple REST client that downloads the config.xml
for this job and creates or updates the Jenkins job for each of the repositories in our GitHub organization.
The builds are then automatically triggered by a GitHub organization-wide webhook whenever the payload URL matches one of the repositories. No special GitHub Organization plugin is needed.
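For reference, the export/create/update round trip can be sketched with curl alone. This is a hedged sketch, not my actual client: the Jenkins URL, the template job name, and the repo names are placeholders, and it assumes the job's config.xml embeds the repository URL. `GET /job/<name>/config.xml` (export), `POST /job/<name>/config.xml` (update), and `POST /createItem?name=<name>` (create) are standard Jenkins REST endpoints.

```shell
JENKINS_URL="https://jenkins.example.com"   # placeholder Jenkins instance
AUTH="user:api-token"                       # an API token avoids CSRF crumb handling

# 1. Export the hand-configured job's definition once.
curl -sf -m 5 -u "$AUTH" "$JENKINS_URL/job/template-job/config.xml" -o template.xml

# 2. For every repository, create the job if it is missing, otherwise update it.
#    (In practice the repo names would come from the GitHub API.)
for repo in repo-one repo-two; do
  # Point the job at this repository by rewriting the repo URL inside the XML.
  sed "s|your_organization/template-repo|your_organization/$repo|" \
      template.xml > "job-$repo.xml"
  if curl -sf -m 5 -u "$AUTH" -o /dev/null "$JENKINS_URL/job/$repo/config.xml"; then
    # Job exists: POSTing config.xml updates it in place.
    curl -sf -m 5 -u "$AUTH" -X POST -H "Content-Type: application/xml" \
         --data-binary "@job-$repo.xml" "$JENKINS_URL/job/$repo/config.xml"
  else
    # Job missing: createItem makes a new one with that configuration.
    curl -sf -m 5 -u "$AUTH" -X POST -H "Content-Type: application/xml" \
         --data-binary "@job-$repo.xml" "$JENKINS_URL/createItem?name=$repo"
  fi
done
```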
The standard approach would be to create a new multibranch pipeline (or GitHub Organization) job which scans your organization for new repositories. Every repository should have a Jenkinsfile
with the instructions to build it. In general, though, it is also possible to achieve what you are trying to do programmatically.
What my approach would be:
I would use the Folders Plugin to create a folder for this type of job.
If that is what you are really trying to do, I could elaborate further.
I am not sure how much of this answer will help you, but I will be happy even if it only provides some insight into Jenkins pipelines.
I am laying out the procedure using Jenkins pipelines; if not now, then at some point you will need to move your build and deploy to pipelines for infrastructure as code.
Starting with Jenkins plugins: the procedure explained here requires the Pipeline and GitHub Branch Source (GitHub Organization) plugins.
Jenkins Configuration
Create a GitHub Organization job. Among its options, Owner should be your organization, the one where the hundreds of repos live. Also configure which file and which branches to look for in a repo to trigger a build: Script Path
is the file that defines the steps (probably build and deploy) for the repo, so repos will be detected and shown in Jenkins only if a file with this name is present in them.
Jenkins scans the configured organization at the interval set here, detecting any additions or deletions of repos as well as new commits. It is also good to configure the number of builds to keep, as needed.
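Since only repositories containing the configured Script Path file appear in Jenkins, it can be useful to audit which repos still lack one. A hedged sketch, assuming curl, a personal access token, a Jenkinsfile as the Script Path, and placeholder org/repo names:

```shell
TOKEN="ghp_xxx"                      # placeholder personal access token
ORG="your_organization"
missing_msg() { echo "$1 has no Jenkinsfile"; }

for repo in repo-one repo-two; do    # in practice, page through /orgs/$ORG/repos
    # the contents API returns 200 if the file exists and 404 if it does not
    code=$(curl -s -m 5 -o /dev/null -w "%{http_code}" \
           -H "Authorization: token $TOKEN" \
           "https://api.github.com/repos/$ORG/$repo/contents/Jenkinsfile")
    [ "$code" = "404" ] && missing_msg "$repo"
done
```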
Git repo/organization configuration
Configure webhooks in GitHub.
Configure the events that should send notifications to Jenkins.
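The organization-wide webhook can also be created through the GitHub API instead of the UI. A sketch under assumptions: the token, organization name, and Jenkins URL are placeholders, and `/github-webhook/` is the path the Jenkins GitHub plugin listens on.

```shell
TOKEN="ghp_xxx"                      # placeholder personal access token
# webhook definition: the events listed here decide what notifies Jenkins
PAYLOAD='{
  "name": "web",
  "active": true,
  "events": ["push", "pull_request"],
  "config": {
    "url": "https://jenkins.example.com/github-webhook/",
    "content_type": "json"
  }
}'
curl -s -m 5 -X POST \
     -H "Authorization: token $TOKEN" \
     -H "Accept: application/vnd.github+json" \
     -d "$PAYLOAD" \
     "https://api.github.com/orgs/your_organization/hooks"
```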
Branch protection and status checks for PRs
Protecting a branch with the proper checks lets you restrict who can push and ensures changes are merged only after the status checks have passed. This helps maintain good code quality.
When a PR is raised, GitHub shows the result of each status check; based on these, reviewers can decide whether to approve the PR.
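Branch protection can be scripted as well. A minimal sketch with placeholder token, org, and repo names; the status context "continuous-integration/jenkins/branch" is an assumption here, as the exact name depends on what your Jenkins plugin reports.

```shell
TOKEN="ghp_xxx"                      # placeholder personal access token
# protection rules: require the Jenkins status check and one approving review
PROTECTION='{
  "required_status_checks": {
    "strict": true,
    "contexts": ["continuous-integration/jenkins/branch"]
  },
  "enforce_admins": true,
  "required_pull_request_reviews": { "required_approving_review_count": 1 },
  "restrictions": null
}'
curl -s -m 5 -X PUT \
     -H "Authorization: token $TOKEN" \
     -H "Accept: application/vnd.github+json" \
     -d "$PROTECTION" \
     "https://api.github.com/repos/your_organization/some-repo/branches/master/protection"
```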
This link explains the procedure I have described here in detail:
https://github.com/gitbucket/gitbucket/wiki/Setup-Jenkins-Multibranch-Pipeline-and-Organization
Assuming your Jenkins instance runs on Linux or macOS, or on Windows with shell-script support, configure a Jenkins job to execute the script below. Don't forget to replace the user and password fields, and read the comment lines in order to understand, and maybe improve, the script.
curl -i -u "user":"password" "https://github.com/your_organization" | grep "codeRepository" | awk -F '"' '{print $8}' | while read -r line; do
    # note: scraping the HTML page is brittle; GitHub's markup can change at any time
    mkdir "_temp_repo"
    cd "_temp_repo" || exit 1
    # `--depth=1` clones only the last commit, which speeds up the clone
    # if you want to clone a specific branch, add `-b branch` to the command below
    git clone --depth=1 "https://github.com$line" .
    # execute your pending commands here...
    git add .
    git commit -am "[pending] XPTO..."
    git push
    # execute your success/failure commands here...
    git add .
    git commit -am "[success/failure] XPTO..."
    git push
    cd ..
    rm -rfv "_temp_repo"
done
I would suggest creating an SH file and executing it in verbose mode: sh -x ./my_script.sh.
In order to run it for every new update, set up a GitHub webhook pointing to this job.
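The grep against the organization's HTML page is fragile, so here is a hedged alternative sketch that feeds the same loop from the GitHub REST API instead (the org name and credentials are placeholders; with hundreds of repos you would iterate the `page` parameter):

```shell
curl -s -m 5 -u "user":"token" \
  "https://api.github.com/orgs/your_organization/repos?per_page=100&page=1" \
  | grep '"clone_url"' \
  | sed -E 's/.*"clone_url": *"([^"]+)".*/\1/' \
  | while read -r url; do
      echo "$url"    # replace this echo with the clone-and-process steps above
    done
```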
You have more than one requirement here. Let's go through them one by one.
a) Jenkins GitHub Organization: this will scan your whole GitHub organization and create as many jobs as are needed to build your repositories, because having just one job in Jenkins is not the standard approach; basically, you lose history data (Jenkins has no idea it is building different things at every iteration). The plugin's help says: "Scans a GitHub organization (or user account) for all repositories matching some defined markers."
b) Try to see Jenkins as an automator, not as something that hosts all your build/deploy logic. What I do is create files like "build.sh", "deploy.sh", and so on, so I can build and deploy directly from my shell. Only after that do I create Jenkins jobs that simply call those build/deploy scripts, no matter what they actually do; Jenkins doesn't need to know. A side effect is that all your projects "can be built the same way", whether they are NodeJS, Python, or anything else. Of course, you might need extra dependencies in some cases, and Docker can really help here.
c) I did something similar in the past, having fewer jobs than repositories/branches/pull requests. Jenkins is kind of dumb, and a few plugins can help here. But in your case, if you really want to have one job, you only need a regular parameterized job. The trick is that your GitHub organization-wide webhook will not point to Jenkins: it needs to point somewhere else, to some code you maintain. This code can parse the GitHub payload, analyze it, possibly call GitHub back ("is there a pull request for this branch? no? then forget it") to refine its decision tree, and at the end trigger that single Jenkins job with all the parameters it was able to capture. Those parameters tell the single job which repo to clone, which env to deploy to, and that is it; you already know the script names, since they are standard.
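The last step of that decision tree, triggering the single parameterized job, is one HTTP call to Jenkins' buildWithParameters endpoint. A sketch with placeholder names: the job name "generic-build" and the parameters REPO and ENV are assumptions and must match the job's parameter definitions.

```shell
JENKINS_URL="https://jenkins.example.com"   # placeholder Jenkins instance
JOB="generic-build"                         # the single parameterized job
TRIGGER_URL="$JENKINS_URL/job/$JOB/buildWithParameters"

# each --data-urlencode pair becomes one job parameter
curl -s -m 5 -u "user:api-token" -X POST "$TRIGGER_URL" \
     --data-urlencode "REPO=https://github.com/your_organization/some-repo.git" \
     --data-urlencode "ENV=staging"
```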
d) That said, I would ask: do you actually need Jenkins? Could this little parser program clone your repo and run a few scripts inside a Docker container, a builder container that has every dependency baked in?
e) About "talking back" to GitHub, I did that using Python. There are GitHub libraries, so I was able to pull information back from Jenkins and POST build statuses to GitHub through its API. Since I was actually using a Jenkins instance, my tool was a man-in-the-middle broker. In your case, with a single job, a Docker container would play that role nicely.
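The same "talking back" can be done without a library: GitHub's commit status API takes a plain POST per commit SHA. A sketch with placeholder token, repo, SHA, and context name:

```shell
TOKEN="ghp_xxx"                                  # placeholder personal access token
SHA="0000000000000000000000000000000000000000"   # placeholder commit SHA
# the status payload shown next to the PR's checks; context groups statuses
STATUS='{
  "state": "success",
  "context": "ci/jenkins",
  "description": "Build passed",
  "target_url": "https://jenkins.example.com/job/generic-build/42/"
}'
curl -s -m 5 -X POST \
     -H "Authorization: token $TOKEN" \
     -d "$STATUS" \
     "https://api.github.com/repos/your_organization/some-repo/statuses/$SHA"
```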
Hope this helps by offering a different perspective.
If you actually want to use a Jenkins instance, most of what I said here can still be used.