I am building Jenkins with a Dockerfile, and during the Docker build I would like Jenkins to come pre-configured with a set of jobs. This works well with the Job DSL, where jobs are seeded, but I have not yet managed to pre-configure jobs that use the "Pipeline" DSL. Given the direction of Jenkins and the use of Jenkinsfiles, Pipeline, etc., I think there must be some way to have Jenkins start automatically with a set of jobs that were built using the Pipeline approach.
Example Pipeline:
pipeline {
    agent {
        label 'cft'
    }
    parameters {
        string(name: 'StackName', defaultValue: 'cft-stack', description: 'The name to give the CFT stack.')
        string(name: 'KeyName', defaultValue: 'ACCOUNT', description: 'The account key to use for encryption.')
        string(name: 'VpcId', defaultValue: 'vpc-1234', description: 'The VPC to assign to the cluster resources.')
        string(name: 'SubnetID', defaultValue: 'subnet-1234, subnet-6789', description: 'The subnet(s) to assign to the cluster resources.')
    }
    stages {
        stage('Build') {
            steps {
                s3Download(file: 'cft.yaml',
                    bucket: 'cft-resources',
                    path: 'cft.yaml',
                    force: true)
                cfnUpdate(stack: "${params.StackName}",
                    file: "cft.yaml",
                    params: [
                        "SnapshotId=${params.SnapshotId}",
                        "KeyName=${params.KeyName}",
                        "VpcId=${params.VpcId}"
                    ],
                    timeoutInMinutes: 20
                )
            }
        }
    }
    post {
        failure {
            echo 'FAILURE'
            cfnDelete(stack: "${params.StackName}")
        }
    }
}
Dockerfile:
COPY ./groovy/*.groovy /usr/share/jenkins/ref/init.groovy.d/
Pipeline Groovy files are different from the Groovy code that can be executed (e.g. in init.groovy.d) to configure Jenkins, so you can't add pipelines the way you're trying to.
Your options include:
copy the XML file for the job definition (pointing to your repo, as the pipeline should be in the Jenkinsfile in the repo)
create a job using Groovy and configure it (not really practicable IMHO)
use JobDSL (again, with XML as starting point) to specify your Jenkins jobs. An example for automatically adding this can be found in tknerr/jenkins-pipes-infra.
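For the JobDSL route, a minimal seed sketch could declare a pipeline job that simply points at the Jenkinsfile in your repository (the job name and repository URL below are hypothetical placeholders):

// Job DSL seed script; job name and repository URL are placeholders
pipelineJob('cft-stack-build') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://github.com/your-org/your-repo.git')
                    }
                    branch('*/master')
                }
            }
            // the pipeline itself lives in the Jenkinsfile in the repo
            scriptPath('Jenkinsfile')
        }
    }
}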
Related
I'm fairly new to Jenkins. I'd like to set up a Jenkins trigger for the following case: a successful build of either projA or projB should trigger a build of projC. I'm using declarative Jenkins syntax, and projA through projC are multi-branch projects.
projA --> projC
projB --> projC
I followed example #2 from "Jenkins: Trigger Multi-branch pipeline on upstream change" and set up projC to be triggered by projA (or projB), but I'm not sure of the syntax for projC to be triggered by either projA or projB.
In addition, is it possible to pass values from projA and projB to projC as part of the triggering mechanism? If so, what's the syntax?
Any help is appreciated.
This is the code:
pipeline {
    agent any
    parameters {
        string(name: 'MY_BRANCH_NAME', defaultValue: '${env.BRANCH_NAME}', description: 'pass branch value')
        string(name: 'MY_VERSION', defaultValue: '1.23', description: 'My version')
    }
    stages {
        stage('Build in dev') {
            steps {
                echo 'Building dev..'
            }
        }
    }
}
I think you need to look at this the other way around. Don't think of C looking for A or B to complete; think of A or B triggering C when they succeed.
In your A and B projects, when the build is successful by whatever your criteria are, use the build step to trigger C.
If you want to pass simple values to C, make C a parameterized build and pass the parameters in the build step.
post {
    success {
        build job: 'C', parameters: [booleanParam(name: 'bool1', value: true), string(name: 'foo', value: 'bar')], quietPeriod: 10
    }
}
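On the receiving side, a sketch of what C's Jenkinsfile could declare so the passed values show up as params (the parameter names match the build step above; the defaults and descriptions are assumptions):

pipeline {
    agent any
    parameters {
        booleanParam(name: 'bool1', defaultValue: false, description: 'flag passed from A or B')
        string(name: 'foo', defaultValue: '', description: 'value passed from A or B')
    }
    stages {
        stage('Use upstream values') {
            steps {
                echo "bool1=${params.bool1}, foo=${params.foo}"
            }
        }
    }
}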
My job parameters defined in job-dsl.groovy are overwritten by those defined in pipeline.
I am using the job-dsl-plugin and Jenkins pipeline to generate a Jenkins job for each git branch. Since my code is stored in GitLab, the jobs require GitLab integration. I am providing that using the gitlab-plugin. The problem is with 'gitLabConnection': it looks like it can only be applied from inside the Jenkins pipeline.
So if in job-dsl I would do:
branches.each { branch ->
    String safeBranchName = branch.name.replaceAll('/', '-')
    if (safeBranchName ==~ "^release.*") {
        return
    }
    def branch_folder = "${basePath}/${safeBranchName}"
    folder branch_folder
    pipelineJob("$branch_folder/build") {
        logRotator {
            numToKeep 20
        }
        parameters {
            stringParam("BRANCH_NAME", "${safeBranchName}", "")
            stringParam("PROJECT_NAME", "${basePath}", "")
        }
    }
}
And then in my Jenkins pipeline I would add the 'gitLabConnection'
node('node_A') {
    properties([
        gitLabConnection('gitlab.internal')
    ])
    stage('clean up') {
        deleteDir()
    }
    ///(...)
I have to do it like this:
node('node_A') {
    properties([
        gitLabConnection('gitlab.internal'),
        parameters([
            string(name: 'BRANCH_NAME', defaultValue: BRANCH_NAME, description: ''),
            string(name: 'PROJECT_NAME', defaultValue: PROJECT_NAME, description: '')
        ])
    ])
    stage('clean up') {
        deleteDir()
    }
    ///(...)
So that my BRANCH_NAME and PROJECT_NAME are not overwritten.
Is there another way to tackle this?
Is it possible to append 'gitLabConnection('gitlab.internal')' to the properties in the Jenkins pipeline?
Unfortunately it doesn't seem like there is a way to do this yet. There's some discussion about this at https://issues.jenkins-ci.org/browse/JENKINS-43758 and I may end up opening a feature request to allow people to "append to properties"
There are two ways of solving this. The first uses only Jenkins pipeline code, but if you choose this path the initial job run will most likely fail. This initial failure happens because the pipeline only creates the Jenkins job parameters during the first run; once the parameters exist, the job will work.
Option 1 - using Jenkins pipeline only.
In 'Pipeline Syntax'/'Snippet Generator' check:
'This project is parameterised'.
Add the parameter(s) you need and hit 'Generate Pipeline Script'. In my case I get:
properties([
    gitLabConnection(gitLabConnection: 'my_gitlab_connection', jobCredentialId: '', useAlternativeCredential: false),
    [$class: 'JobRestrictionProperty'],
    parameters([
        string(defaultValue: 'test', description: 'test', name: 'test', trim: false)
    ]),
    throttleJobProperty(categories: [], limitOneJobWithMatchingParams: false, maxConcurrentPerNode: 0, maxConcurrentTotal: 0, paramsToUseForLimit: '', throttleEnabled: false, throttleOption: 'project')
])
Option 2 - more complicated, but also far more powerful. This is the one I finally took, because of the issues described above.
Use Jenkins job DSL plugin - https://github.com/jenkinsci/job-dsl-plugin
The GitLab plugin works quite well with it: https://github.com/jenkinsci/gitlab-plugin#declarative-pipeline-jobs
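A hedged sketch of how the two pieces could fit together under this approach, reusing the names from the question (the repository URL is a hypothetical placeholder): the Job DSL seed owns the parameters, and the declarative Jenkinsfile only declares the GitLab connection in its options block, so nothing gets overwritten at runtime.

// Job DSL seed (inside the branches.each loop shown in the question)
pipelineJob("$branch_folder/build") {
    parameters {
        stringParam('BRANCH_NAME', safeBranchName, '')
        stringParam('PROJECT_NAME', basePath, '')
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('git@gitlab.internal:group/project.git') } // hypothetical repo URL
                    branch(branch.name)
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}

// Jenkinsfile (declarative): only the GitLab connection lives here, not the parameters
pipeline {
    agent { label 'node_A' }
    options {
        gitLabConnection('gitlab.internal')
    }
    stages {
        stage('clean up') {
            steps {
                deleteDir()
            }
        }
    }
}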
I want to set some Jenkins environment variables at runtime, based on my computation. How can I set them at runtime in my Jenkinsfile's steps section?
For example: based on my calculation I get abc=1. How can I set this at runtime in my Jenkinsfile's steps section so that I can use it later by calling $abc?
I am declaring my pipeline and environment variables as explained here:
https://jenkins.io/doc/pipeline/tour/environment/
I'm using Jenkins ver. 2.41.
Here is an example of how to set a variable and use it in the same Jenkinsfile.
The variable versionToDeploy will be used by the build job step.
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'build the artifacts'
                script {
                    versionToDeploy = '2.3.0'
                }
            }
        }
    }
    post {
        success {
            echo 'start deploy job'
            build job: 'pipeline-declarative-multi-job-deploy', parameters: [[$class: 'StringParameterValue', name: 'version', value: versionToDeploy]]
        }
    }
}
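If the value specifically needs to be an environment variable (so later steps can read $abc as in the question), a minimal sketch, assuming the variable name abc and a placeholder value, is to assign to env inside a script block:

pipeline {
    agent any
    stages {
        stage('Compute') {
            steps {
                script {
                    // assumption: the value comes from your computation; '1' is a placeholder
                    env.abc = '1'
                }
            }
        }
        stage('Use') {
            steps {
                // later steps can read the value as an environment variable
                sh 'echo "abc is $abc"'
            }
        }
    }
}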
I have project A and project B. I would like to pass parameters (like the BranchName and ArtifactoryID) from project A to project B. Both are multi-branch pipelines using a Declarative Script Jenkinsfile.
When I use the Snippet Generator it tells me the project "is not parameterized". When looking at the config of the multi-branch pipeline, I don't see a way to parameterize it. What am I missing? (see attached)
A google result shows this, but I'm not sure how it's supposed to pass params between multi-branch pipelines: https://issues.jenkins-ci.org/browse/JENKINS-32780
I figured this out. I leveraged an answer from a comment here: Pipeline pass parameters to downstream jobs
For a detailed explanation using my example shown above, my Project A jenkinsfile would have the following before the stages:
parameters {
    string(name: 'BRANCH_PASSED_OVER', defaultValue: '${env.BRANCH_NAME}', description: 'pass branch value')
    string(name: 'PERSON2', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
}
...and the following for the build step phase
stage('Build downstream') {
    steps {
        build job: 'BUILD/CMTest2/' + env.BRANCH_NAME.replaceAll("/", "%2F"), wait: false, parameters: [string(name: 'PERSON2', value: params.PERSON2), string(name: 'PASS_BRANCH_NAME', value: env.BRANCH_NAME)]
    }
}
In Project B then in my jenkinsfile I could call the param like so:
stage('Collect Info') {
    steps {
        echo "Hello ${params.PERSON2}"
        echo "PASS_BRANCH_NAME: ${params.PASS_BRANCH_NAME}"
    }
}
How can I trigger build of another job from inside the Jenkinsfile?
I assume that this job is another repository under the same github organization, one that already has its own Jenkins file.
I also want to do this only if the branch name is master, as it doesn't make sense to trigger downstream builds of any local branches.
Update:
stage 'test-downstream'
node {
    def job = build job: 'some-downtream-job-name'
}
Still, when executed I get an error
No parameterized job named some-downtream-job-name found
I am sure that this job exists in jenkins and is under the same organization folder as the current one. It is another job that has its own Jenkinsfile.
Please note that this question is specific to the GitHub Organization Plugin which auto-creates and maintains jobs for each repository and branch from your GitHub Organization.
In addition to the above-mentioned answers: I wanted to start a job with a simple parameter passed to a second pipeline and found the answer at http://web.archive.org/web/20160209062101/https://dzone.com/refcardz/continuous-delivery-with-jenkins-workflow
So I used:
stage('Starting ART job') {
    build job: 'RunArtInTest', parameters: [[$class: 'StringParameterValue', name: 'systemname', value: systemname]]
}
First of all, it is a waste of an executor slot to wrap the build step in node. Your upstream executor will just be sitting idle for no reason.
Second, from a multibranch project, you can use the environment variable BRANCH_NAME to make logic conditional on the current branch.
Third, the job parameter takes an absolute or relative job name. If you give a name without any path qualification, that would refer to another job in the same folder, which in the case of a multibranch project would mean another branch of the same repository.
Thus what you meant to write is probably
if (env.BRANCH_NAME == 'master') {
    build '../other-repo/master'
}
You can use the build job step from Jenkins Pipeline (Minimum Jenkins requirement: 2.130).
Here's the full API for the build step: https://jenkins.io/doc/pipeline/steps/pipeline-build-step/
How to use build:
job: Name of a downstream job to build. May be another Pipeline job, but more commonly a freestyle or other project.
Use a simple name if the job is in the same folder as this upstream Pipeline job;
You can instead use relative paths like ../sister-folder/downstream
Or you can use absolute paths like /top-level-folder/nested-folder/downstream
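For illustration, the three addressing forms look like this (the job and folder names are taken from the list above and are placeholders):

// same folder as the current (upstream) Pipeline job
build job: 'downstream'
// relative path to a job in a sibling folder
build job: '../sister-folder/downstream'
// absolute path from the Jenkins root
build job: '/top-level-folder/nested-folder/downstream'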
Trigger another job using a branch as a param
At my company many of our branches include "/". You must replace any instances of "/" with "%2F" (as it appears in the URL of the job).
In this example we're using relative paths
stage('Trigger Branch Build') {
    steps {
        script {
            echo "Triggering job for branch ${env.BRANCH_NAME}"
            BRANCH_TO_TAG = env.BRANCH_NAME.replace("/", "%2F")
            build job: "../my-relative-job/${BRANCH_TO_TAG}", wait: false
        }
    }
}
Trigger another job using build number as a param
build job: 'your-job-name',
    parameters: [
        string(name: 'passed_build_number_param', value: String.valueOf(BUILD_NUMBER)),
        string(name: 'complex_param', value: 'prefix-' + String.valueOf(BUILD_NUMBER))
    ]
Trigger many jobs in parallel
Source: https://jenkins.io/blog/2017/01/19/converting-conditional-to-pipeline/
More info on Parallel here: https://jenkins.io/doc/book/pipeline/syntax/#parallel
stage('Trigger Builds In Parallel') {
    steps {
        // Freestyle build trigger calls a list of jobs
        // Pipeline build() step only calls one job
        // To run all three jobs in parallel, we use the "parallel" step
        // https://jenkins.io/doc/pipeline/examples/#jobs-in-parallel
        parallel(
            linux: {
                build job: 'full-build-linux', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            mac: {
                build job: 'full-build-mac', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            windows: {
                build job: 'full-build-windows', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            failFast: false)
    }
}
Or alternatively:
stage('Build A and B') {
    failFast true
    parallel {
        stage('Build A') {
            steps {
                build job: "/project/A/${env.BRANCH}", wait: true
            }
        }
        stage('Build B') {
            steps {
                build job: "/project/B/${env.BRANCH}", wait: true
            }
        }
    }
}
The build command in Pipeline is there to trigger other jobs in Jenkins.
Example on GitHub.
The job must exist in Jenkins and can be parameterized.
As for the branch, I guess you can read it from git.
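A minimal sketch of that idea, assuming a hypothetical downstream job named 'other-job' with a BRANCH string parameter; the current branch is read from git with the sh step:

node {
    checkout scm
    // read the current branch from git (assumption: the workspace is not on a detached HEAD)
    def branch = sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
    // 'other-job' and its BRANCH parameter are hypothetical names for illustration
    build job: 'other-job', parameters: [string(name: 'BRANCH', value: branch)]
}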
Use the build step (from the Pipeline Build Step plugin) for that task in order to trigger other jobs from the Jenkinsfile.
You can add a variety of logic to your execution, such as parallel, node, and agent options, and steps for triggering external jobs. Below are some easy-to-read cookbook examples.
1. Example of triggering an external job from a Jenkinsfile with a conditional:
if (env.BRANCH_NAME == 'master') {
    build job: 'exactJobName', parameters: [
        string(name: 'keyNameOfParam1', value: 'valueOfParam1'),
        booleanParam(name: 'keyNameOfParam2', value: 'valueOfParam2')
    ]
}
2. Example of triggering multiple jobs from a Jenkinsfile with conditionals:
def jobs = [
    'job1Title': {
        if (env.BRANCH_NAME == 'master') {
            build job: 'exactJobName', parameters: [
                string(name: 'keyNameOfParam1', value: 'valueNameOfParam1'),
                booleanParam(name: 'keyNameOfParam2', value: 'valueNameOfParam2')
            ]
        }
    },
    'job2Title': {
        if (env.GIT_COMMIT == 'someCommitHashToPerformAdditionalTest') {
            build job: 'exactJobName', parameters: [
                string(name: 'keyNameOfParam3', value: 'valueOfParam3'),
                booleanParam(name: 'keyNameOfParam4', value: 'valueNameOfParam4'),
                booleanParam(name: 'keyNameOfParam5', value: 'valueNameOfParam5')
            ]
        }
    }
]
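Presumably the map of closures is then handed to the parallel step to actually run both branches (an assumption, since the snippet stops at the map definition):

// run both conditional triggers concurrently
parallel jobs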