Jenkins: Change the name of the Jenkinsfile

I'm using the Pipeline plugin in Jenkins.
My job currently uses a file called "Jenkinsfile" to get the various steps to run.
My goal is to let the job use a different file name, for example:
myJenkinsFile
build_JenkinsFile
deploy_JenkinsFile
buildSteps
...
Since "Jenkinsfile" seems to be the conventional name, is there any way to change it?
Any suggestions?

In the project section of the configuration page, you just have to click Add > Pipeline Jenkins, and then you can choose the custom file name that Jenkins will look for when loading the pipeline.
If you want a further level of customization, you can also use the Remote File Plugin, which allows you to keep your pipeline in a repository and make it work with multiple repositories/branches (and of course you can still customize the name of the file).
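If you create jobs programmatically, the script path can also be set via the Job DSL plugin. A minimal sketch, assuming the Job DSL plugin is installed; the job name and repository URL are placeholders:
pipelineJob('example-build') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://example.com/scm/repo.git') // hypothetical repository
                    }
                    branch('*/master')
                }
            }
            // Point the job at a custom file instead of the default "Jenkinsfile".
            scriptPath('build_JenkinsFile')
        }
    }
}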

Related

Jenkins: Is there a way I can load a file as a Jenkins build parameter?

I currently have a git repo which contains a text file. I want to load its contents as a build parameter for a Jenkins job.
One way would be to manually copy the contents of this file into a Jenkins multi-line string parameter, but since the content is already in git, I want to keep the two coupled.
Not sure if this is even possible using Jenkins?
I am using Jenkins Job DSL to generate the job.
EDIT: You can find several different ways of achieving this in the answers to Jenkins dynamic declarative pipeline parameters.
I think you can achieve it in the following way (scripted pipeline):
node {
    stage("read file") {
        // Create a file in the workspace and read its content into a variable.
        sh('echo -n "some text" > afile.txt')
        def fileContent = readFile('afile.txt')
        // Declare a string parameter whose default value is the file content.
        properties([
            parameters([
                string(name: 'FILE_CONTENT', defaultValue: fileContent)
            ])
        ])
    }
    stage("Display properties") {
        echo("${params.FILE_CONTENT}")
    }
}
The first time you execute it there is no parameter prompt; from the second execution on, you'll have the option to build with parameters, and the default will be the content of your file.
The drawback of this approach is that the value is always one execution behind: when you start the build on a commit where you changed the content of the file, the parameter is prefilled with the content as of the previous execution.
The only way I know around this is to split your pipeline into two pipelines: the first one reads the content of the file and then triggers the second one, passing the file content as a build parameter via the build step, as sketched below.
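A minimal sketch of the upstream pipeline, assuming the downstream job is named 'downstream-job' (a placeholder):
node {
    // Read the file from the workspace (checkout omitted for brevity).
    def fileContent = readFile('afile.txt')
    // Trigger the second pipeline, passing the content as a build parameter.
    build job: 'downstream-job', parameters: [
        string(name: 'FILE_CONTENT', value: fileContent)
    ]
}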
If you find a better way, let us know.
Why don't you have Jenkins pull the repo as part of the job, then parse the parameters from a file within the repo (say, JSON) and continue executing with those parameters?
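For instance, a sketch using the readJSON step from the Pipeline Utility Steps plugin; the file name 'params.json' and its keys are hypothetical:
node {
    checkout scm
    // readJSON is provided by the Pipeline Utility Steps plugin.
    def cfg = readJSON file: 'params.json'
    echo "Building with target=${cfg.target}, version=${cfg.version}"
}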

Key-value store option for Jenkins

Is there any plugin available which provides a key-value store option for Jenkins?
The plugin whose functionality comes closest is the Credentials plugin.
The goal is to have a plugin which stores global configuration parameters, and these parameters are available to Jenkins jobs.
Go to Manage Jenkins -> Configure System -> Global Properties -> Environment Variables:
Check the box and click Add.
Enter the key and value and save.
To access the variable, simply use ${<Your-key>}.
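For example, in a pipeline shell step (MY_KEY being a hypothetical key configured this way):
node {
    // Global properties are injected into every build's environment,
    // so the shell can expand them directly.
    sh 'echo "MY_KEY is ${MY_KEY}"'
}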
Could environment variables fit your need?
They behave like regular shell variables.
If you are using the pipeline you can define it this way:
environment {
    VAR = 'your_value'
}
and use it later in your build.
This is explained there: https://jenkins.io/doc/pipeline/tour/environment/
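For example, a minimal declarative pipeline using such a variable might look like this (the stage name is just illustrative):
pipeline {
    agent any
    environment {
        VAR = 'your_value'
    }
    stages {
        stage('use-var') {
            steps {
                // Environment variables are available via env or directly in shell steps.
                echo "VAR is ${env.VAR}"
            }
        }
    }
}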
If you are writing your pipeline from the UI, you can add a 'source' line in your shell build step:
source your_environment_setting
test='Hello'
And then the variables can simply be used like any shell variable:
echo $test
If you have variables that you do not know in advance, but do know when you trigger your job, you can also use the Parameterized Build plugin:
https://wiki.jenkins.io/display/JENKINS/Parameterized+Build

Create Jenkins WorkflowMultibranchProject job with groovy init

I am automating the configuration of Jenkins masters to get to a one-click instantiation. We have 6 standard jobs we create for each instance, and I'd like to be able to create them via init.groovy.d scripts, but I haven't found examples for this type of job.
We use the cloudbees Bitbucket Team/Project plugin that ends up creating jobs of type WorkflowMultibranchProject with additional configuration to connect to our on-prem Bitbucket instance.
Does anyone have samples of creating such a job via Groovy? Or am I better off using Job DSL to create the job (I am doing that already for a mother seed job)?
[UPDATE]: with the help of the answer below, I came up with a full sample creating an entire Bitbucket Team/Project job: https://github.com/redfive/jenkins-init/blob/master/init.groovy.d/core-jobs.groovy
Having used Job DSL, I'm 50/50 undecided whether it is easier than plain Groovy (as Job DSL lacks support for some of the config options).
An example for the similar OrganizationFolder can be found in #coderanger's article on https://coderanger.net/jenkins/:
// When run standalone (e.g. from init.groovy.d), these imports are assumed:
import jenkins.branch.OrganizationFolder
import jenkins.model.Jenkins
import jenkins.scm.impl.trait.RegexSCMHeadFilterTrait
import jenkins.scm.impl.trait.WildcardSCMSourceFilterTrait
import org.jenkinsci.plugins.github_branch_source.BranchDiscoveryTrait
import org.jenkinsci.plugins.github_branch_source.GitHubSCMNavigator
import org.jenkinsci.plugins.github_branch_source.OriginPullRequestDiscoveryTrait

def jenkins = Jenkins.instance
// Create the top-level item if it doesn't exist already.
def folder = jenkins.items.isEmpty() ? jenkins.createProject(OrganizationFolder, 'MyName') : jenkins.items[0]
// Set up the GitHub source. githubOrg and cred are defined earlier in the article.
def navigator = new GitHubSCMNavigator(githubOrg)
navigator.credentialsId = cred.id // Loaded above in the GitHub section.
navigator.traits = [
    // Too many repos to scan everything. This trims to a svelte 265 repos at the time of writing.
    new WildcardSCMSourceFilterTrait('*-cookbook', ''),
    // We have a ton of old branches, so limit to just master and PRs for now.
    new RegexSCMHeadFilterTrait('^(master|PR-.*)'),
    new BranchDiscoveryTrait(1), // Exclude branches that are also filed as PRs.
    new OriginPullRequestDiscoveryTrait(1), // Merge the pull request with the current target branch revision.
]
folder.navigators.replace(navigator)
The next time when I set up an instance, I'd likely give that a try.

How do I write a Jenkins pipeline function in order to be able to use it as an option?

I would like to add general functionality to my Jenkins pipeline script, similar to what you can do with built-in functions (timestamps, ansiColor):
options {
    timestamps()
    ansiColor 'xterm'
    // failureNotification() <- I want to add this
}
How do I write a function in the script so that it can be used as an option?
Currently I don't believe that's possible with the declarative syntax that you're using. You could write your own Jenkins plugin to do this, but that could get hairy.
If you're willing to use a slightly more complicated syntax, I would look at this blog post: https://jenkins.io/blog/2017/02/15/declarative-notifications/
Essentially, you'll need to create a shared Groovy library and use a step from it to manage your notification. There are a few steps to this:
Create a repository for your shared library. It should have a folder called "vars", which is where your steps and their documentation go.
Create a step in your shared library. Using camelCase and a .groovy extension, create a file whose name describes your step; this is what you will call in your Jenkinsfile. Ex: sendFailureNotification.groovy
Within that file, create a function named call. You can use whatever parameters you want. Ex: def call() { }
That call function is your "step logic". In your case, you would want to look at the build result and fire whatever notification steps you feel are necessary (a minimal sketch follows this list).
Copying from the documentation... 'To setup a "Global Pipeline Library," I navigated to "Manage Jenkins" → "Configure System" in the Jenkins web UI. Once there, under "Global Pipeline Libraries", I added a new library.'
Import your library into your Jenkinsfile like so: @Library('<library name you picked here>') _
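For example, vars/sendFailureNotification.groovy could be sketched as follows; the mail step and the recipient address are assumptions, so use whatever notification mechanism you prefer:
// vars/sendFailureNotification.groovy
def call() {
    // The mail step is just one option; Slack, email-ext, etc. work the same way.
    mail to: 'team@example.com', // hypothetical recipient
         subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
         body: "See ${env.BUILD_URL} for details."
}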
Now you should be able to call sendFailureNotification() at the end of your Jenkinsfile, maybe in a post section like so:
post {
    failure {
        sendFailureNotification()
    }
}

One Jenkins job triggering multiple Jenkins jobs based on parameters

Is there any Jenkins plugin that helps with the following:
if a directory <XXX> is present in an SVN folder <GoRoCo>, then the <GoRoCo>_<XXX> Jenkins job is called?
Example:
In job "TEST" , I specify parameters like directory name (A, B , C) and folder name (G1R2) then job "TEST" should trigger the jobs "G1R2_A" , "G1R2_B" and "G1R2_C"
Use the Parameterized Trigger Plugin. When specifying the jobs to call in the plugin, you can use tokens, as in JOB_${PARAM1}_${PARAM2}.
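If the triggering job is a pipeline instead, the same fan-out can be sketched like this (GOROCO and DIRS are hypothetical job parameters, DIRS being a comma-separated list such as "A,B,C"):
node {
    // Triggers e.g. G1R2_A, G1R2_B and G1R2_C without waiting for their results.
    for (dir in params.DIRS.split(',')) {
        build job: "${params.GOROCO}_${dir.trim()}", wait: false
    }
}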
Take a look at this plugin; I think it does exactly what you are looking for:
https://wiki.jenkins-ci.org/display/JENKINS/Files+Found+Trigger
Use the Build Flow plugin.
With the help of this plugin you can run as many jobs as you need, with or without parameters.
Alternatively, use a script to create a property file with the required parameters for each modified project and place it in the workspace directory.
You can then use the Parameterized Trigger plugin to trigger the downstream projects.
Note: you might also have to delete those property files after triggering the downstream projects.
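For illustration, such a property file might look like this (the key names are hypothetical); the Parameterized Trigger plugin can then read it via its "Parameters from properties file" option:
GOROCO=G1R2
DIR=A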
