TL;DR: Is there a way to import code into the Jenkinsfile from the local repository (other than the load step)?
I've successfully tested a simple workaround for Git repositories that serve both protocols - HTTPS and SSH (I'm using BitBucket). Just configure the Jenkins job as follows, pointing to the same repository but forcing different fetch methods:
In the Branch Sources, add the "Checkout over SSH" option
In Pipeline Libraries -> Source Code Management -> Project Repository, use the HTTPS protocol - e.g. something like https://...
Workaround:
library identifier: 'shared-library@version', retriever: legacySCM(scm)
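For context, a minimal sketch of where that line could sit in a Jenkinsfile; the library name, version and the mySharedStep global are placeholders, and legacySCM(scm) simply reuses the checkout configuration of the current job:

// Sketch only: load a library kept in the same SCM as this Jenkinsfile.
// 'shared-library@version' and mySharedStep are assumed names.
library identifier: 'shared-library@version', retriever: legacySCM(scm)

node {
    checkout scm
    mySharedStep()  // hypothetical global variable from the library's vars/ directory
}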
The approach currently taken in PR 37 will not work properly with build agents, and anyway will only work for scripts using the library step, not the @Library annotation.
By the way, files loaded from the load step do appear in Replay. But it is true that your script cannot statically refer to types defined in such files. In other words, you could simulate library vars/*.groovy but not src/**/*.groovy - the same limitation as the current PR 37.
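To illustrate, here is a minimal sketch of simulating a vars/-style step via load; the file name jenkins/utils.groovy and its contents are assumptions:

// jenkins/utils.groovy (hypothetical helper file kept next to the Jenkinsfile)
def info(message) {
    echo "INFO: ${message}"
}
return this  // required so the caller gets a script object back

// Jenkinsfile
node {
    checkout scm
    def utils = load 'jenkins/utils.groovy'  // returns the script object above
    utils.info('loaded via the load step')   // behaves like a vars/ global,
                                             // but classes under src/ cannot be referenced statically
}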
Is there a way to import code into the Jenkinsfile from the local repository (other than the load step)?
Is there a way (or workaround) to create a shared library in the same repository as the Jenkinsfile and import this library into the Jenkinsfile?
Yes. Provided that the "directory structure of a Shared Library repository" is observed as specified, it's absolutely feasible and no workaround is required to get it working. Basically, your directory structure would require an adjustment along the following lines:
+- src # Groovy source files
| +- org
| +- foo
| +- Bar.groovy # for org.foo.Bar class
+- vars
| +- foo.groovy # for global 'foo' variable
| +- foo.txt # help for 'foo' variable
+- resources # resource files (external libraries only)
| +- org
| +- foo
| +- bar.json # static helper data for org.foo.Bar
+- someModule
| +- ...
|
+- Jenkinsfile
This answer is not based on conjecture. Although it's not documented, I have applied this pattern on multiple projects and in training, and it works. You do not need to tinker with your job configuration in any way other than is usual.
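As a sketch of how the pieces could fit together, assuming the library in this repository is registered under a placeholder name such as 'my-lib' in the job or global configuration:

// vars/foo.groovy (in the same repository as the Jenkinsfile)
def call(String name) {
    echo "Hello from ${name}"
}

// Jenkinsfile ('my-lib' is an assumed library name)
@Library('my-lib') _
node {
    foo('the in-repo shared library')  // calls the global variable defined above
}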
You may take a look at a plugin I wrote that enables using subdirectories of the repository your pipeline lives in as shared libraries: https://github.com/karolgil/SharedLibrary
After building and installing it, you can simply put the following in your pipeline:
@SharedLibrary('dir/in/repo') _
to start using dir/in/repo as a shared library for your pipelines.
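A rough sketch of how that could look in a full Jenkinsfile; the directory name and the hello step are made-up examples:

@SharedLibrary('pipeline-lib') _  // 'pipeline-lib' is an assumed subdirectory of this repository

node {
    checkout scm
    hello('world')  // hypothetical global variable from pipeline-lib/vars/hello.groovy
}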
I wanted to do the same and ended up creating this:
https://github.com/jenkinsci/workflow-cps-global-lib-plugin/pull/37
and here is how I use it:
https://github.com/syndesisio/syndesis-pipeline-library/blob/master/Jenkinsfile#L3
In my case I wanted to create a Jenkinsfile that actually tests the pipeline library that the repository contains.
Let me know what you think and feel free to add your comments on the PR too.
I guess the proper way to do that is to implement a custom SCMRetriever.
However, you can use the following hack:
Assuming jenkins/vars/log.groovy in your local repo contains:
def info(message) {
    echo "INFO: ${message}"
}
Your Jenkinsfile can load that shared library from the jenkins/ directory using the library step:
node('node1') {
    checkout scm
    // create a new git repo inside the jenkins/ subdirectory
    sh('cd jenkins && git init && git add --all . && git commit -m init > /dev/null 2>&1')
    // register that repo as a shared library and load it
    def repoPath = sh(returnStdout: true, script: 'pwd').trim() + "/jenkins"
    library identifier: 'local-lib@master', retriever: modernSCM([$class: 'GitSCMSource', remote: repoPath])
}
node('node2') {
    stage('Build') {
        log.info("called shared lib")  // use the library loaded above
    }
}