Background
I have a complex build system which works and uses a shared library. The pipeline code is stored in the same git repository as the shared library. Both sources are on the master branch.
Problem
Now I'm doing a larger refactoring to improve the build and test process. So I'm working on a feature branch, and I configured a respective Jenkins job to test it.
Since I'm also introducing changes to the shared library, one thing is annoying: I have to import the library this way:
@Library('my_library@feature') _
So to merge these changes to master, I have to update the code again.
Is there a way to access the branch (or another kind of reference) from which the current pipeline code was checked out?
Then, when I merge the branches, the shared library would follow without altering the code.
I was thinking of something like this:
@Library("my_library@${PIPELINE_SOURCE_REF}") _
I searched the documentation and the internet but didn't find anything like this.
Or is there an alternative solution?
If it is enough for you to use a parameter for the library branch, it is possible to do so; check out the shared libraries documentation.
You would need to change:
@Library('my_library@feature') _
to
library("my_library@${params['BRANCH']}")
This should load the global vars.
If you need to instantiate some class, it is possible to do something like:
def someClass = library("my_library@${params['BRANCH']}").com.mypackage.SomeClass.new(this)
It has some limitations mentioned in the docs, depending on what your library looks like.
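For this to work, the job needs the BRANCH parameter declared somewhere. As a minimal sketch (the parameter name BRANCH and the library name my_library are taken from the snippets above; the rest of the pipeline is purely illustrative):

```groovy
// Illustrative Jenkinsfile: declare the BRANCH parameter, then load the
// shared library dynamically at that version with the 'library' step.
pipeline {
    agent any
    parameters {
        // Defaults to master so the job keeps working after a merge
        string(name: 'BRANCH', defaultValue: 'master',
               description: 'Branch of my_library to load')
    }
    stages {
        stage('Build') {
            steps {
                script {
                    // Dynamic load: the version is resolved at runtime,
                    // unlike the compile-time @Library annotation.
                    library("my_library@${params.BRANCH}")
                    // Global vars from the library are callable from here on.
                }
            }
        }
    }
}
```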
In a non-public repository I've found something like this at the top of the pipeline:
def pipelineBranch = scm.branches[0].name
library("someLibrary@${pipelineBranch}")
I didn't test it yet, but it seems reasonable. scm.branches[0].name should contain the name of the branch used to check out the pipeline code.
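One caveat worth hedging: in a plain (non-multibranch) pipeline job, scm.branches[0].name may be a branch spec such as */master rather than a resolved branch name, while multibranch jobs expose the real name via BRANCH_NAME. An untested sketch that tries to cover both cases:

```groovy
// Untested sketch: prefer BRANCH_NAME (set in multibranch jobs), and
// otherwise strip a leading '*/' from the configured branch spec.
def pipelineBranch = env.BRANCH_NAME ?: scm.branches[0].name.replaceFirst(/^\*\//, '')
library("someLibrary@${pipelineBranch}")
```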
Related
I am planning to create a project in Jenkins and I am thinking of using an Organisation folder for that.
Since the project has several applications (a mobile app with backend and frontend parts), I have several repos that will need to be separate jobs.
My question here is: is it possible (and is it good or bad practice) to put the Jenkinsfiles for all the apps in one separate folder (called Jenkinsfiles, for example) from where I will invoke the corresponding file?
Until now I have been placing the Jenkinsfile in the repo of the app that I am building, but now, for the whole project, I need to decide which approach to take, so I would appreciate any contribution to the decision making.
If you are asking "can it be done" the answer is Yes, most likely - but how will you manage which Jenkinsfile is being run?
As I see it, the main idea is to have the Jenkinsfile next to the code being deployed.
If you have several applications in your project repository, maybe you could have separate repositories with "just" the Jenkinsfile and deployment config. You could then use the Jenkinsfile in the main code repo to trigger all the sub-jobs.
I have done something similar, and have seen several companies with separate jobs for prod/user-test that require separate permissions to be triggered as well.
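As a sketch of that trigger pattern (the job names deploy-backend and deploy-frontend are made up), the main repo's Jenkinsfile can kick off the sub-jobs with the build step:

```groovy
// Hypothetical parent Jenkinsfile that triggers per-application sub-jobs.
pipeline {
    agent any
    stages {
        stage('Trigger sub-jobs') {
            steps {
                // Job names and parameters are examples; adjust to your layout.
                build job: 'deploy-backend',
                      parameters: [string(name: 'VERSION', value: env.BUILD_NUMBER)]
                // wait: false fires and forgets instead of blocking this job
                build job: 'deploy-frontend', wait: false
            }
        }
    }
}
```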
Every example I've seen for Jenkins shared library setup on the web is based on Git/GitHub.
Can anyone help me with that using Subversion?
I've struggled a lot but could not figure out what should be specified as the Default version.
I've tried many different combinations of Project Repository Base, Include branches, Library Name and Default version but none worked.
Attached is the screenshot of my SVN repository setup. I know it's not set up per the standard layout, though it should still work somehow as it's just a demo project.
If your lib SVN path is https://192.168.1.1:8443/svn/trunk/JenkinsLib, then the Project Repository Base will be https://192.168.1.1:8443/svn/trunk/ and the Default version is JenkinsLib.
While setting up the shared library in Configure System --> Global Pipeline Libraries, select Retrieval Method: Modern SCM and Source Code Management: Subversion, like the picture below:
When you select Subversion, it will ask you to choose a Subversion-specific branch name, like below:
I didn't want to tag a specific branch in SVN, so I just used a period (i.e. '.') in the Default version field, and that takes the HEAD of the repo.
The project repo base is just the svn:// path to your repo.
I hope that helps.
I have 11 jobs running on the Jenkins master node, all of which have a very similar pipeline setup. For now I have integrated each job with its very own Jenkinsfile that specifies the stages within the job and all of them build just fine. But, wouldn't it be better to have a single repo that has some files (preferably a single Jenkinsfile and some libraries) required to run all the jobs that have similar pipeline structure with a few changes that can be taken care of with a work around?
If there is a way to accomplish this, please let me know.
Use a Shared Library to define common functionality. Your 11 Jenkinsfiles can then be as small as only a single call to the function implementing the pipeline.
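A minimal sketch of that pattern (the step name standardPipeline and the config keys are made up for illustration): the shared library defines the whole pipeline once as a global var, and each job only passes in what differs.

```groovy
// Shared library: vars/standardPipeline.groovy (hypothetical name).
// Defining an entire declarative pipeline inside a global var is a
// documented shared-library pattern.
def call(Map config = [:]) {
    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    // Fall back to a default when the job passes nothing
                    sh(config.buildCommand ?: 'make')
                }
            }
            stage('Test') {
                steps {
                    sh(config.testCommand ?: 'make test')
                }
            }
        }
    }
}
```

Each of the 11 Jenkinsfiles then shrinks to two lines: @Library('my-shared-library') _ followed by standardPipeline(buildCommand: './gradlew build').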
Besides using a Shared Library, you can create a groovy file with common functionality and call its methods via load().
Documentation
and example. This is an easier approach, but in the future, with the increasing complexity of pipelines, it may impose some limitations.
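A sketch of the load() approach (the file name common.groovy and its method are hypothetical):

```groovy
// common.groovy (checked in next to the Jenkinsfile) would contain:
//
//   def deploy(String envName) {
//       echo "Deploying to ${envName}"
//   }
//   return this   // load() hands back this script object
//
// Jenkinsfile side: check out the repo first, then load and call it.
node {
    checkout scm
    def common = load 'common.groovy'
    common.deploy('staging')
}
```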
I'm adding to, and maintaining, groovy files to build a set of repositories - previously they were built with freestyle Jenkins jobs. I support some code in shared libraries and to be honest (mainly for DRY reasons) I want to do that more.
However, the only way I know how to test and debug those library files is to push the changes on a git branch. I know about the "replay" trick to test the main Jenkins file. Is there some approach I've missed to do something similar for library code?
If you set up a job to load the shared library instead of relying on a globally set up shared library (you can have both going, for this particular job), then it is possible to hit "replay" and have all your shared library steps show up as editable files.
This can be helpful in iterative development without a million commits.
EDIT: Here's how that looks on an Organization job in Jenkins.
There is the 3rd party Jenkins Pipeline Unit testing framework.
While it does not yet cover all features of pipeline, it is well documented and maintained so that I would consider starting using it (once I revisit our Jenkins setup).
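A rough sketch of what such a test can look like (based on JenkinsPipelineUnit's documented examples; the global var vars/myStep.groovy is made up):

```groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class MyStepTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Stub the pipeline steps the library calls, so no Jenkins is needed.
        helper.registerAllowedMethod('sh', [String]) { cmd -> println cmd }
    }

    @Test
    void runsWithoutErrors() {
        def script = loadScript('vars/myStep.groovy')
        script.call()
        printCallStack()   // prints the recorded step invocations
    }
}
```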
I am working on Jenkins pipelines for two projects. I built some customized alert messages via Slack and email. We expect my code to be used for my projects and also several other projects, so I am thinking of making it a small lib so that others don't need to ask me every time they onboard a Jenkins pipeline job. I was thinking of using a shared library with @Library() for others to use, as described in the docs.
However, since my lib depends on the existence of the Slack and email plugins, it will not be usable when these plugins are not installed.
My question is: is there a way to declare dependencies in pipeline Shared Libraries, or do I have to make a Jenkins plugin to address this issue?
As far as I know there is currently no way to declare dependencies on plugins (or on a version of Jenkins). Instead, what you can do is add a check for the plugin and give a proper error to the user of your library:
if (Jenkins.getInstance().getPluginManager().getPlugin('slack') == null) {  // 'slack' is the plugin's short name
    error 'This shared library function requires the Slack plugin!'
}
Put this at the start of your shared library script, before any uses of the plugin. Note, though, that this gets tricky if you need to import classes from a plugin (since imports go first in the Groovy file). What you do in that situation is make two scripts: the first script has the check and is the one the user calls, and the second contains all the logic and imports and is called by the first script once the checks pass.
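A sketch of that two-script split (the names notifySlack and notifySlackImpl are made up):

```groovy
// vars/notifySlack.groovy: the entry point users call. No plugin
// imports here, so it always loads; it only checks and delegates.
def call(String message) {
    if (Jenkins.getInstance().getPluginManager().getPlugin('slack') == null) {
        error 'notifySlack requires the Slack plugin to be installed'
    }
    notifySlackImpl(message)   // loaded only once the check passes
}

// vars/notifySlackImpl.groovy would then hold the plugin-dependent
// imports and the actual call, e.g.:
//
//   def call(String message) {
//       slackSend(color: 'good', message: message)
//   }
```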