Jenkins query on building parameters and separate jobs

I'll admit that I'm new to Jenkins, and I would like to build a process (a pipeline, I think it's called) with its help.
I have a GitHub repository with a folder structure like this:
/env1/env1-prod
/env1/env1-test
/env2/env2-prod
/env2/env2-test
...
Jenkins already pulls changes from my repository (that part is done). If it sees changes in folders /env1/env1-test and /env2/env2-prod, I would like Jenkins to create:
- env1-test and env2-prod parameters, so that I can use them as part of other jobs,
- 2 separate jobs which use the above parameters separately.
Is that possible? If so, can you point me to some online resources I can learn from?
Thanks in advance.
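
One way this kind of setup is often sketched (not an answer from the thread; the downstream job names deploy-env1 and deploy-env2 and the TARGET_ENV parameter are assumptions) is a declarative Jenkinsfile that reacts to changed folders with when/changeset conditions and passes the folder name to parameterized downstream jobs:

// Jenkinsfile (sketch) - assumes downstream jobs deploy-env1 and deploy-env2
// already exist and accept a string parameter named TARGET_ENV
pipeline {
    agent any
    stages {
        stage('env1-test changed') {
            when { changeset 'env1/env1-test/**' }
            steps {
                // trigger a downstream job and hand it the environment as a parameter
                build job: 'deploy-env1',
                      parameters: [string(name: 'TARGET_ENV', value: 'env1-test')]
            }
        }
        stage('env2-prod changed') {
            when { changeset 'env2/env2-prod/**' }
            steps {
                build job: 'deploy-env2',
                      parameters: [string(name: 'TARGET_ENV', value: 'env2-prod')]
            }
        }
    }
}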

Related

Generating Jenkins Jobs With Job DSL and a Template

I have a lot of Jenkins jobs which are very similar. They all have the same input parameters and the same build steps, so I was wondering whether there's a way to generate these jobs through Job DSL and a template.
I would like to keep a separate Groovy file for each job holding all the parameters for the template (e.g. GitHub repo, env, etc.). Job DSL would read through these parameter files and use the template to generate new Jenkins jobs.
Do you think this is possible? I've looked everywhere for a solution, but it doesn't seem like anything like this exists.
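
One way such a seed script is often structured (a sketch only; the job names, repository URLs and ENV parameter below are invented, and the per-job data could just as well be loaded from separate Groovy or properties files) is a shared template driven by a list of per-job settings:

// seed.groovy (sketch) - one template, many jobs
def jobConfigs = [
    [name: 'env1-test-build', repo: 'https://github.com/example/env1.git', env: 'env1-test'],
    [name: 'env2-prod-build', repo: 'https://github.com/example/env2.git', env: 'env2-prod'],
]

jobConfigs.each { cfg ->
    pipelineJob(cfg.name) {
        parameters {
            stringParam('ENV', cfg.env, 'Target environment')   // per-job parameter
        }
        definition {
            cpsScm {
                scm {
                    git { remote { url(cfg.repo) } }             // per-job repository
                }
                scriptPath('Jenkinsfile')                        // shared pipeline definition
            }
        }
    }
}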

Jenkins declarative pipeline parameters

Is there a way to pass a parameter to a Jenkinsfile from the organization folder, instead of at the job level or global level?
Essentially what I'm looking to do is have one Jenkinsfile that handles whatever situation I need, and have multiple organization folders that send it parameters. So basically I could have one organization folder that scans and grabs all of the feature branches, and when I run one of the jobs it merges them to develop; another one that grabs all of the develop branches, and when I run one of the jobs it just builds them; etc.
I need some way to pass parameters to my Jenkinsfile to say "Hey, I'm this folder; this is what you should do." I can't find a way to do so. I thought of making multiple Jenkinsfiles, but it would be confusing to know which one to place in each repo. I would change the names of the Jenkinsfiles so it's obvious which one to use, but the only option I get for "Project Recognizer" in the configuration is "Pipeline Jenkinsfile", so I don't know how I can change the names and have the organization folder still recognize them.
Is there something I'm missing? Is there any way to send a parameter to my Jenkinsfile from the folder instead of the global level? Or is there some other way to solve my problem and tell my Jenkinsfile what to do depending on which organization folder it is in inside of Jenkins?
A simple way to check which organization folder a job is built in is to parse it out of the env.JOB_NAME variable. For example, given this job hierarchy:
feature/job1
feature/job2
production/job1
production/job2
To make the Pipeline do different things depending on whether it is in the feature or production organization folder:
def topFolder = env.JOB_NAME.split('/')[0]
// In code somewhere else:
if (topFolder == 'feature') {
    doSomething()
} else if (topFolder == 'production') {
    doOther()
}
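
In context, the check could sit at the top of one scripted Jenkinsfile shared by all folders (a sketch; the build commands are placeholders):

// Jenkinsfile (sketch) shared across organization folders
node {
    def topFolder = env.JOB_NAME.split('/')[0]   // e.g. "feature" for feature/job1

    stage('Checkout') {
        checkout scm
    }

    stage('Build') {
        if (topFolder == 'feature') {
            sh './merge-to-develop.sh'    // placeholder: merge feature branches to develop
        } else if (topFolder == 'production') {
            sh './build.sh'               // placeholder: plain build
        }
    }
}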

Can a single seed job process DSLs from multiple repos?

I recently managed to convert several manually-created jobs to DSL scripts (inlined into temporary 'seed' jobs), and was pleasantly surprised how straightforward it was. Now I'd like to get rid of the multiple seed jobs and try to structure things more cleanly.
To that end, I created a new jenkins-ci repo and committed all the Groovy DSL scripts to it. Then I created a job-generator Jenkins job that pulls from the jenkins-ci repo and has a single Process Job DSLs step. This step has the Look on Filesystem box ticked, with the DSL Scripts field set to jobs/*.groovy. With global push notifications already in place, this works more-or-less as intended: if I make a change to the jenkins-ci repo, the job-generator job automatically runs and regenerates all the jobs—awesome!
What I don't like about this solution is that it has poor locality of reference: the DSL scripts for the job live in a completely separate repository from the code. What I'd really like is to keep the job DSL scripts in each individual code repository, in a jenkins subfolder, and have a single seed job that processes them all. That way, changes to CI setup could be code-reviewed right alongside the code. To me, that just feels like an ideal setup.
Unfortunately, I don't have a clear idea about how to make this happen. If I could figure out a way to make the seed job watch multiple repos, such that a commit to any one of them would trigger it, perhaps I could inject another build step before the Process Job DSLs step and (somehow) script my way to victory, but... I'm unsure how to even get to that point. (I certainly don't want to do full clones of each repo in the generator job just to pull in the DSL scripts!)
I suspect I'm not the first person to wish they could put the Job DSL scripts alongside the code, though perhaps I'm over-estimating the benefits. Any advice on this topic would be much appreciated—thanks!
Unfortunately there is no direct way of solving this. Several feature requests have been opened (JENKINS-33275, JENKINS-37220), but AFAIK no one is working on any of them.
As a workaround you can use the Pipeline Multibranch Plugin and create a multibranch project for each of your repositories. You must then add a simple Jenkinsfile to each repo/branch and use the Jenkinsfile to execute your Job DSL scripts. See Use Job DSL in Pipeline scripts for details. This would require minimal coding, but I think each repo must be cloned for this to work because the Job DSL files must be available on the file system.
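Such a per-repo Jenkinsfile can stay very small (a sketch, assuming the DSL scripts live in a jenkins/ subfolder of each repository and that the Job DSL plugin's jobDsl pipeline step is installed):

// Jenkinsfile (sketch) committed to each code repository
node {
    // clone this repo so the Job DSL scripts are available on the file system
    checkout scm

    // process every Job DSL script in the repo's jenkins/ subfolder
    jobDsl targets: 'jenkins/*.groovy'
}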
You can use Job DSL to create the multibranch jobs, see multibranchPipelineJob in the API viewer. This would be your "root" seed job.
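That root seed might look roughly like this (a sketch; the repository names and URL are placeholders):

// root seed script (sketch) - one multibranch project per repository
def repos = ['repo-a', 'repo-b']   // placeholder repository names

repos.each { repo ->
    multibranchPipelineJob(repo) {
        branchSources {
            git {
                id(repo)   // a stable, unique id for the branch source
                remote("https://github.com/example/${repo}.git")
            }
        }
    }
}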
If your repos are hosted on GitHub, you can also check out the GitHub Organization Folder Plugin. With that plugin you only have to create one job per organization instead of multiple multibranch jobs.

Generic jenkins jobs

I wonder if it is possible to create generic parameterized jobs, ready to copy, where the only post-copy action is to redefine their parameters.
During my investigation I found out that:
- I can use parameters in the SVN path definition
- I can define the flow of builds using the *Build Flow Plugin*
However, I cannot force Jenkins to use parameters inside the job name definition for the promotion process. Is there any way to achieve that?
Since I sometimes create branches from master, I would like to copy the whole configuration of the jobs; most of the time the only difference is that in the job name definition I replace master with the branch name.
Why not use a parameter as the branch name?
Then when you start the build, you can enter the parameters based on the branch you want to build.
Or find some way to get the branch info and inject it.
For example, we have one common job which monitors maybe 20 projects; whenever a change in any of those projects is merged into Git, the Gerrit Trigger plugin kicks in, and the job name and branch are obtained dynamically from parameters.
Hope this works for you.
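
As a minimal sketch of the parameterized-branch idea (the repository URL, parameter name and build command are placeholders):

// Jenkinsfile (sketch) - one job definition reused for any branch
pipeline {
    agent any
    parameters {
        string(name: 'BRANCH', defaultValue: 'master', description: 'Branch to build')
    }
    stages {
        stage('Checkout') {
            steps {
                git url: 'https://github.com/example/project.git', branch: params.BRANCH
            }
        }
        stage('Build') {
            steps {
                sh './build.sh'   // placeholder build command
            }
        }
    }
}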

Taking out common config of jenkins jobs

I have about 200 Jenkins jobs; each of them has a long config page, but most of the config is actually the same. Every time I need to update something in the common config, I write a Groovy script to loop through those jobs and update them one by one. It's a pain because it takes about 5 minutes to update those jobs with the Groovy script. I am wondering: is there a Jenkins plugin (or something else) that I can use to put the common config in one place? The Jenkins slicing plugin doesn't work well; I think it conflicts with another plugin.
Thanks
Sounds like a job for the Job DSL plugin.
From the wiki page:
The Jenkins job-dsl-plugin attempts to solve this problem by allowing jobs to be defined with the absolute minimum necessary in a programmatic form, with the help of templates that are synced with the generated jobs. The goal is for your project to be able to define all the jobs they want to be related to their project, declaring their intent for the jobs, leaving the common stuff up to a template that were defined earlier or hidden behind the DSL.
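
As an illustration of that idea (a sketch only; the job names, repository URLs and build step are invented), the shared settings can be written once in a small helper closure, and each generated job only supplies what actually differs:

// seed script (sketch) - shared config written once, job-specific bits passed in
def buildJob = { String name, String repoUrl ->
    freeStyleJob(name) {
        scm {
            git { remote { url(repoUrl) } }     // job-specific repository
        }
        logRotator { numToKeep(20) }            // shared history retention
        wrappers { timestamps() }               // shared build wrapper
        steps {
            shell('./build.sh')                 // shared (placeholder) build step
        }
    }
}

buildJob('project-a-build', 'https://github.com/example/project-a.git')
buildJob('project-b-build', 'https://github.com/example/project-b.git')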
