Jenkins declarative pipeline parameters

Is there a way to pass in a parameter to a Jenkinsfile from the organization folder instead of the job level or global level?
Essentially what I'm looking to do is have one Jenkinsfile that handles whatever situation I need, and have multiple organization folders that send it parameters. So basically I can have one organization folder that scans and grabs all of the feature branches, and when I run one of the jobs it merges them to develop. Another one that grabs all of the develop branches, and when I run one of the jobs it just builds them. etc.
I need some way to pass parameters to my Jenkinsfile to say "Hey, I'm this folder, this is what you should do." I can't find a way to do so. I thought of making multiple Jenkinsfiles, but it would be confusing to know which one to place in each repo. I would change the names of the Jenkinsfiles so it's obvious which one to use, but the only option I get for "Project Recognizer" in the configuration is "Pipeline Jenkinsfile", so I don't know how I can change the names and still have the organization folder recognize them.
Is there something I'm missing? Any way to send a parameter to my Jenkinsfile from the folder instead of a global level? Or is there some other way to solve my problem and be able to tell my Jenkinsfile what to do depending on what organization folder it is in inside of Jenkins?

Or is there some other way to solve my problem and be able to tell my Jenkinsfile what to do depending on what organization folder it is in inside of Jenkins?
A simple way to check which organization folder a job is built in is to parse it from the env.JOB_NAME variable. For example:
Job hierarchies:
feature/job1
feature/job2
production/job1
production/job2
To make a Jenkins Pipeline do different things depending on whether it is in the feature or production organization folder:
def topFolder = env.JOB_NAME.split('/')[0]

// In code somewhere else:
if (topFolder == 'feature') {
    doSomething()
} else if (topFolder == 'production') {
    doOther()
}

Related

Jenkins query on building parameters and separate jobs

I'll have to admit that I'm new to Jenkins and I would like to build a process (a pipeline, I think it is called) with its help.
So I have a GitHub repository with a folder structure like:
/env1/env1-prod
/env1/env1-test
/env2/env2-prod
/env2/env2-test
...
I would like Jenkins to pull changes from my repository (this part is done), and if it sees changes in the folders /env1/env1-test and /env2/env2-prod, I would like Jenkins to create:
env1-test and env2-prod parameters, so that I can use them as part of other jobs,
two separate jobs which use the above parameters separately.
Is that possible? If so, can you point me to some online resources I can learn from?
Thanks in advance.
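A minimal sketch of one way this could look, assuming a single Jenkinsfile at the repository root and the declarative changeset condition; the downstream job names and the TARGET parameter are placeholders:

pipeline {
    agent any
    stages {
        stage('Deploy env1-test') {
            // Only runs when the current change set touches this folder
            when { changeset "env1/env1-test/**" }
            steps {
                build job: 'deploy-env1-test', parameters: [
                    string(name: 'TARGET', value: 'env1-test')
                ]
            }
        }
        stage('Deploy env2-prod') {
            when { changeset "env2/env2-prod/**" }
            steps {
                build job: 'deploy-env2-prod', parameters: [
                    string(name: 'TARGET', value: 'env2-prod')
                ]
            }
        }
    }
}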

How to create a drop-down list in a Jenkins pipeline?

I created a Jenkins pipeline that creates a backup of an Oracle database, stores it in a GitLab repository and, if necessary, executes a rollback. I want to create a drop-down list from a file in this GitLab repository so I can select exactly which backup I want to execute. This is in a stage block of the Jenkins pipeline. Is it possible?
Thanks
You can't create a dropdown list within a stage block in Jenkins, because the pipeline's parameters have already been determined by that point (from the properties([parameters([])]) block).
You can create a select list in the parameters block, but that wouldn't allow you to dynamically select from a list of files. Alternatively, you could create a bunch of manual jobs based on a list of files and kick off just the ones you need, but this doesn't seem like a CI/CD pattern that will scale well. You may want to find a better way to perform this job.
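For reference, a static select list in the parameters block would look roughly like this; the backup file names are placeholders and are fixed at configuration time rather than read from the GitLab repository:

pipeline {
    agent any
    parameters {
        // Static choice list shown in the "Build with Parameters" page
        choice(name: 'BACKUP_FILE',
               choices: ['backup-2024-01-01.dmp', 'backup-2024-02-01.dmp'],
               description: 'Which backup to roll back to')
    }
    stages {
        stage('Rollback') {
            steps {
                echo "Rolling back using ${params.BACKUP_FILE}"
            }
        }
    }
}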

How to reusably define an ordered list of build-steps for Jenkins and an arbitrary script?

Situation: I have a Jenkins pipeline job file with lots of (regularly changing) stages hard-coded in Groovy, and I need a way to locally reproduce what is being done on CI.
In order to let the developer locally do "what Jenkins would do" without having to manually keep a list of steps synchronized with the respective Jenkinsfile, I'm looking for a way to store the ordered list of stages so that it is accessible by both Jenkins and a local script.
In other words - I want to be able to check out my project repository and run
make what-Jenkins-would-do
(make is only an example - I really don't want to use make for this)
So, given a set of scripts which contain what's being executed in each stage, I now just need the order of execution stored in a sensible way.
What I'd love to have is a way to let Jenkins read a list of pipeline steps from a Yaml/JSON file which then can be used in other scripts, too.
Here is what's going through my mind:
- I can't be the only one - there must be a nice, small solution for this need
- maybe I could use Groovy locally, but that would add another heavy dependency to my project, and the Groovy scripts contain lots of Jenkins- and node-specific stuff I don't need locally
- I don't want to store information redundantly
- just executing a 'do it all' script in both Jenkins and locally is not an option, of course - I need individual stages
- Jenkins / Groovy and pipeline jobs are a requirement - I can't change that
So what's the modern solution to this? Is there something like
node('main') {
    stage('checkout') {
        // make the source code available
    }
    stages_from_file("build-stages.yaml")
}
?
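One way to get close to this, assuming the Pipeline Utility Steps plugin is installed (for readYaml) and a build-stages.yaml that simply lists stage names and the scripts they run, is a sketch like:

// build-stages.yaml (hypothetical format):
// stages:
//   - name: build
//     script: ./ci/build.sh
//   - name: test
//     script: ./ci/test.sh
node('main') {
    stage('checkout') {
        checkout scm
    }
    def config = readYaml file: 'build-stages.yaml'   // Pipeline Utility Steps plugin
    config.stages.each { s ->
        stage(s.name) {
            sh s.script   // the same scripts a developer can run locally, in the same order
        }
    }
}

A local wrapper script can then parse the same YAML file and run the script entries in order, so the stage list lives in exactly one place.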

In Jenkins, can I find a job by its display name?

I have a pipeline project with a stage in which a unique identifier is retrieved from an external system and set as the job's display name. I know this identifier to be unique for my whole Jenkins installation, so any search by this key should return exactly zero or one job.
Is there any way I can get a job number/URL (or a list of jobs containing only this one job) given its display name and the project name?
EDIT: I want to find jobs from outside of Jenkins, via user interface or REST API, not from a pipeline.
Take a look at the Jenkins instance. You can retrieve all jobs, folders, etc. using the getItem(string)/getItems() methods.
Hardly ideal, but I ended up using the Script Console to list all the jobs and then used my browser's text search.
Navigate to http://your-jenkins-instance.org.com/script (or Jenkins → Manage Jenkins → Script Console) and run:
// Print the full (folder-qualified) name of every item on the instance
Jenkins.instance.getAllItems(AbstractItem.class).each {
    println(it.fullName)
};
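Along the same lines, a variation of that script (an untested sketch; MY-UNIQUE-ID is a placeholder) could filter by display name directly instead of searching by eye:

def wanted = 'MY-UNIQUE-ID'   // the identifier you set as the display name
Jenkins.instance.getAllItems(hudson.model.Job.class)
    .findAll { it.displayName == wanted }
    .each { println "${it.fullName} -> ${it.absoluteUrl}" }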

Generic Jenkins jobs

I wonder if it is possible to create generic, parameterized jobs that are ready to copy, where the only post-copy action is to redefine their parameters.
During my investigation I found out that:
- I can use parameters in svn path definition
- I can define the flow of builds using *Build Flow Plugin*
However, I cannot force Jenkins to use parameters inside the job name definition for the promotion process. Is there any way to achieve that?
Since I sometimes create branches from master, I would like to copy the whole configuration of the jobs; most of the time the only difference is that I replace master with the branch name in the job name definition.
Why not use a parameter as the branch name?
Then when you start the job, you can input the parameters based on the branch you want to build.
Or find some way to get the branch info and inject it.
For example, we have one common job which monitors maybe 20 projects; when any of those projects gets merged in Git, our Gerrit Trigger plugin starts the job, and the job name and branch are all obtained dynamically from parameters.
Hope this works for you.
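As a rough illustration of the idea in pipeline form; the repository URL and the build script are placeholders:

pipeline {
    agent any
    parameters {
        string(name: 'BRANCH', defaultValue: 'master', description: 'Branch to build')
    }
    stages {
        stage('Checkout') {
            steps {
                // The same job configuration works for master and for any
                // branch name passed in when the build is started.
                git branch: params.BRANCH, url: 'https://example.com/your/repo.git'
            }
        }
        stage('Build') {
            steps {
                sh './build.sh'   // placeholder build script
            }
        }
    }
}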
