I have a repository with multiple Jenkinsfiles (at least there will be multiple Jenkinsfiles eventually) and I want to set up the jobs in Jenkins using a seed job.
So far I can set up one job based on my remote repository.
#!/usr/bin/env groovy
/*
 * Setup jobs from gitlab project docker-jenkins-pipelines
 */
def createPipelineJob(final String repo) {
    String repoName = repo.substring(repo.lastIndexOf("/") + 1, repo.length())
    pipelineJob(repoName) {
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            url('git@gitlab.com:' + repo + '.git')
                        }
                        branches('*/main')
                        //branches('*/feat*')
                    }
                }
                scriptPath("src/main/jobs/ADMIN-initialize-repository/Jenkinsfile")
            }
        }
    }
}
createPipelineJob('sommerfeld.sebastian/docker-jenkins-pipelines')
Now I would like to iterate over all folders in my repo (https://gitlab.com/sommerfeld.sebastian/docker-jenkins-pipelines/-/tree/main/src/main/jobs) and create separate jobs for all Jenkinsfiles.
I would like to have some sort of wildcard for src/main/jobs/*/Jenkinsfile. But looping over the folders would be okay too, and maybe even better, because I could better define the job names.
But I don't know how to iterate the folders. Can anyone give me a hint on how to do that? Is there an API call for gitlab.com or something?
I would suggest not using the API. You have Groovy at hand and can iterate through the files; once you check out the repository you have all the information you need.
https://stackoverflow.com/a/38899519/3708208 is a good starting point for iterating over files with Groovy. There might be some sandbox security limitations, but it shows how to iterate over a set of files. Calling the method to create the pipeline jobs should look something like this:
new File(parentPath).traverse(type: groovy.io.FileType.FILES, nameFilter: ~/Jenkinsfile/) { it ->
    // File.parent is a String path, so use parentFile.name to get the folder name
    createPipelineJob("sommerfeld.sebastian/docker-jenkins-pipelines/${it.parentFile.name}")
} //code untested :)
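Note that createPipelineJob from the question hardcodes both the job name and the scriptPath, so to get one job per folder it would need to take the folder name as an argument. A rough, untested sketch (jobFolder and the reuse of parentPath are assumptions, not tested code):
def createPipelineJob(final String repo, final String jobFolder) {
    pipelineJob(jobFolder) {
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            url('git@gitlab.com:' + repo + '.git')
                        }
                        branches('*/main')
                    }
                }
                // each job points at the Jenkinsfile in its own folder
                scriptPath("src/main/jobs/${jobFolder}/Jenkinsfile")
            }
        }
    }
}

// parentPath must point at src/main/jobs inside the seed job's checkout
new File(parentPath).traverse(type: groovy.io.FileType.FILES, nameFilter: ~/Jenkinsfile/) { file ->
    createPipelineJob('sommerfeld.sebastian/docker-jenkins-pipelines', file.parentFile.name)
}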
Related
I've been trying to construct multiple jobs from a list, and everything seems to be working as expected. But as soon as I execute the first build (which works correctly), the parameters in the job disappear. This is how I've constructed the pipelineJob for the project.
import javaposse.jobdsl.dsl.DslFactory

def repositories = [
    [
        id         : 'jenkins-test',
        name       : 'jenkins-test',
        displayName: 'Jenkins Test',
        repo       : 'ssh://<JENKINS_BASE_URL>/<PROJECT_SLUG>/jenkins-test.git'
    ]
]

DslFactory dslFactory = this as DslFactory

repositories.each { repository ->
    pipelineJob(repository.name) {
        parameters {
            stringParam("BRANCH", "master", "")
        }
        logRotator {
            numToKeep(30)
        }
        authenticationToken('<TOKEN_MATCHES_WITH_THE_BITBUCKET_POST_RECEIVE_HOOK>')
        displayName(repository.displayName)
        description("Builds deploy pipelines for ${repository.displayName}")
        definition {
            cpsScm {
                scm {
                    git {
                        branch('${BRANCH}')
                        remote {
                            url(repository.repo)
                            credentials('<CREDENTIAL_NAME>')
                        }
                        extensions {
                            localBranch('${BRANCH}')
                            wipeOutWorkspace()
                            cloneOptions {
                                noTags(false)
                            }
                        }
                    }
                }
                // scriptPath belongs to cpsScm, not to scm
                scriptPath('Jenkinsfile')
            }
        }
    }
}
After running the above script, all the required jobs are created successfully. But then once I build any job, the parameters disappear.
After that, when I run the seed job again, the job shows the parameters again. I'm having a hard time figuring out where the problem is.
I've tried many things but nothing works. Would appreciate any help. Thanks.
This comment helped me figure out a similar issue with my .groovy file:
I called the parameters property twice (once at the node start, and then tried to set other parameters in an if block), so the latter overwrote the initial parameters.
BTW, as per the comments in the linked ticket, it is an issue with both scripted and declarative pipelines.
I fixed it by providing all job parameters in each parameters call, for the case with ifs.
Though I don't see repeated calls in the code you've provided, please check the full Groovy files for your jobs and add all parameters to all parameters {} blocks.
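To illustrate that failure mode, here is a hypothetical scripted-pipeline sketch (the parameter names and the condition are made up):
// Each properties() call replaces the job's whole parameter list
// rather than appending to it.
properties([
    parameters([
        string(name: 'BRANCH', defaultValue: 'master', description: '')
    ])
])

if (env.EXTRA_CONFIG == 'true') {
    // WRONG: this second call silently drops BRANCH.
    // Fix: repeat BRANCH here alongside EXTRA_FLAG.
    properties([
        parameters([
            string(name: 'EXTRA_FLAG', defaultValue: '', description: '')
        ])
    ])
}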
I have a Jenkinsfile which as one of its stages builds several Docker images and pushes them to a registry. There is quite a long and growing list of these images, so I don't want to repetitively declare the build. Instead, I have a variable:
def dockerImages = ["myimage1","myimage2","myimage3"]
And then have the following stage:
stage("Initiate docker image builds") {
steps {
script {
dockerImages.each { image ->
stage ("${image}") {
utils.doStuff(${image}
}
}
}
}
}
I only want the build to happen when there is a change, so I could do something like:
stage("Initiate docker image builds") {
when{
changeset "dockerfiles/**"
}
steps {
script {
dockerImages.each { image ->
stage ("${image}") {
utils.doStuff(${image}
}
}
}
}
}
But this would trigger building all the images if there was a change to just one of them. Is there a way I could modify my script to have the when apply to the inner stage("${image}") section? The syntax doesn't appear to allow when on that level.
You can inspect the changeset to see which files have changed; see for example https://issues.jenkins.io/browse/JENKINS-58441. The resulting code looks quite messy, though.
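A rough, untested sketch of that approach, adapted to the image loop from the question (changedPaths is a hypothetical helper; the dockerfiles/<image>/ layout is assumed from the changeset pattern above):
// Collect the paths touched by the commits in this build.
// @NonCPS because changeset entries are not serializable.
@NonCPS
def changedPaths() {
    def paths = []
    currentBuild.changeSets.each { changeSet ->
        changeSet.items.each { entry ->
            entry.affectedFiles.each { file ->
                paths << file.path
            }
        }
    }
    return paths
}

// Inside the script block: only create stages for images whose folder changed.
def changed = changedPaths()
dockerImages.each { image ->
    if (changed.any { it.startsWith("dockerfiles/${image}/") }) {
        stage("${image}") {
            utils.doStuff("${image}")
        }
    }
}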
Could also break the dockerfiles out to their own repos. It might seem "wasteful" having a repo for a single file, but it totally removes situations like this.
Or have separate Jenkinsfiles and Jenkins jobs for each container build, each of which just checks whether its own Dockerfile has changed. That would need a lot of executors, though, if you had lots of containers.
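That per-image variant could look roughly like this in each image's own Jenkinsfile (a sketch; the stage name, the path, and the reuse of utils.doStuff from the question are assumptions):
// Hypothetical per-image Jenkinsfile: the job only rebuilds
// when files under its own dockerfiles/myimage1/ folder change.
pipeline {
    agent any
    stages {
        stage("build myimage1") {
            when {
                changeset "dockerfiles/myimage1/**"
            }
            steps {
                script {
                    utils.doStuff("myimage1")
                }
            }
        }
    }
}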
I am creating a Jenkins pipeline and want a certain stage to be triggered only when a particular log file's last-modified date is newer than the start of the pipeline job (the log file is located on the server node where all the stages run). I understand we need to use a "when" condition, but I'm not really sure how to implement it.
I tried referring to some pipeline-related portals but was not able to find an answer.
Can someone please help me through this?
Thanks in advance!
Getting data about a file is quite tricky in a Jenkins pipeline when using the Groovy sandbox, since you're not allowed to call new File(...).lastModified. However, there is the findFiles step, which basically returns a list of wrapped File objects with a getter for the last-modified time in millis, so we can use findFiles(glob: "...")[0].lastModified.
The returned array may be empty, so we should check for that (see the full example below).
The current build's start time in millis is accessible via currentBuild.startTimeInMillis.
Now that we have both, we can use them in an expression:
pipeline {
    agent any
    stages {
        stage("create file") {
            steps {
                touch "testfile.log"
            }
        }
        stage("when file") {
            when {
                expression {
                    def files = findFiles(glob: "testfile.log")
                    // run only if the file exists and was modified after the build started
                    files && files[0].lastModified > currentBuild.startTimeInMillis
                }
            }
            steps {
                echo "i ran"
            }
        }
    }
}
I've put all my Jenkins logic in a structured pipeline script (aka Jenkinsfile).
If something goes wrong, I'm sending mails. For the subject I want to use the displayName of the job and not the job's id env.JOB_NAME (as the ids are driven by access-control patterns, not readability).
With a normal pipeline job I could use currentBuild.rawBuild.project.displayName, but for multibranch pipelines this is just the branch name.
Or is there an even better way to get the user-friendly name than traversing the rawBuild?
For now I have found no convenient public API, so this seems to be the way to go:
import org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject

String getDisplayName(currentBuild) {
    def project = currentBuild.rawBuild.project
    if (project.parent instanceof WorkflowMultiBranchProject) {
        // for multibranch pipelines: combine the parent's name with the branch job's name
        return "${project.parent.displayName} (${project.displayName})"
    } else {
        // for all other projects
        return project.displayName
    }
}
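Used in a notification it might look like this (a sketch; the recipient and wording are assumptions):
// Hypothetical failure notification using the helper above.
mail to: 'team@example.com',
     subject: "Build failed: ${getDisplayName(currentBuild)}",
     body: "See ${env.BUILD_URL} for details."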
I use currentBuild.fullProjectName, which is set to multibranch_pipeline_name/branch_name or pipeline_name, depending on whether you are using a multibranch pipeline or a normal pipeline.
I am trying to do a PoC of Jenkins pipeline as code. I am using the GitHub Organization Folder plugin to scan GitHub orgs and create jobs per branch. Is there a way to explicitly define the names for the pipeline jobs that get created from the Jenkinsfile? I also want to add some descriptions for the jobs.
You need to use currentBuild like below; the node part is important:
node {
    currentBuild.displayName = "$yournamevariable-$another"
    currentBuild.description = "$yourdescriptionvariable-$another"
}
Edit: the above renames the build, whereas the original question is about renaming jobs. The following script in a pipeline will do that (this requires appropriate permissions):
item = Jenkins.instance.getItemByFullName("originalJobName")
item.setDescription("This description was changed by script")
item.save()
item.renameTo("newJobName")
I'm late to the party on this one, but this question forced me into the #jenkins chat, where I spent most of my day today. I would like to thank #tang^ from that chat for helping solve this in a graceful way for my situation.
To set the JOB description and JOB display name for a child in a multi-branch DECLARATIVE pipeline, use the following steps block in a stage:
steps {
    script {
        if (currentBuild.rawBuild.project.displayName != 'jobName') {
            currentBuild.rawBuild.project.description = 'NEW JOB DESCRIPTION'
            currentBuild.rawBuild.project.setDisplayName('NEW JOB DISPLAY NAME')
        } else {
            echo 'Name change not required'
        }
    }
}
This will require that you approve the individual script calls through the Jenkins sandbox approval method, but it was far simpler than anything else I'd found across the web about renaming the actual children of the parent pipeline. The last thing to note is that this should work in a Jenkinsfile where you can use the environment variables to manipulate the job items being set.
I tried to use the code snippet from the accepted answer to describe my Jenkins pipeline in a Jenkinsfile. I had to wrap the snippet in a function with the @NonCPS annotation and use def for the item variable. I placed the snippet at the root of the Jenkinsfile, not in the node section.
import jenkins.model.Jenkins

@NonCPS
def setDescription() {
    def item = Jenkins.instance.getItemByFullName(env.JOB_NAME)
    item.setDescription("Some description.")
    item.save()
}

setDescription()