Jenkins Job-DSL job to pass parameters to a Jenkins pipeline during job creation

I have a CI seed Job DSL job in Jenkins which creates a Jenkins pipeline job. Is there any possibility to take parameters in the CI seed job and pass the input parameters given in the seed job on to the Jenkins pipeline script?
In the example below, how can the schema name given as a parameter be filled in to the Jenkins pipeline job when creating it?
CI Seed Job Setup:
1. Configured one string parameter `SchemaName`
2. DSL script for Jenkins pipeline job creation:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage 1') {
                            steps {
                                echo $SchemaName
                            }
                        }
                        stage('Stage 2') {
                            steps {
                                echo 'logic'
                            }
                        }
                    }
                }
            '''.stripIndent())
            sandbox()
        }
    }
}
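A sketch of one way to achieve this (assuming SchemaName is the string parameter configured on the seed job in step 1): the script above is wrapped in single triple quotes, so $SchemaName is never interpolated. Job DSL exposes the seed job's build parameters to the DSL script, so switching to a double-quoted GString bakes the parameter's value into the generated pipeline at creation time:
pipelineJob('job-name') {
    definition {
        cps {
            // Double quotes: ${SchemaName} is resolved by the seed job,
            // so the generated pipeline script contains the literal value.
            script("""
                pipeline {
                    agent any
                    stages {
                        stage('Stage 1') {
                            steps {
                                echo '${SchemaName}'
                            }
                        }
                    }
                }
            """.stripIndent())
            sandbox()
        }
    }
}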

Related

How to invoke a job from one slave (server) to another slave (server) in Jenkins

Please, can you advise, as I am planning to invoke a job that is on a different server? For example:
Slave1 server1: deploy job
Slave2 server2: build job
I want the build job to trigger the deploy job. Any suggestion, please.
If you are trying to run two jobs on the same Jenkins server, your pipeline should look something like the one below. Here, the build job calls the deploy job.
Build Job
pipeline {
    agent { label 'server2' }
    stages {
        stage('Build') {
            steps {
                build job: 'DeployJobName'
            }
        }
    }
}
Deploy Job
pipeline {
    agent { label 'server1' }
    stages {
        stage('Deploy') {
            steps {
                // Deploy something
            }
        }
    }
}
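If the deploy job needs input from the build job, the build step can also pass parameters. A sketch, assuming DeployJobName declares a string parameter named VERSION (name and value are illustrative):
build job: 'DeployJobName', parameters: [
    string(name: 'VERSION', value: '1.0.42') // hypothetical parameter and value
]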
Update
If you want to trigger a job on a different Jenkins server, you can use a plugin like RemoteTriggerPlugin, or simply use the Jenkins API:
curl http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN
pipeline {
    agent { label 'server2' }
    stages {
        stage('Build') {
            steps {
                sh 'curl http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN'
            }
        }
    }
}
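Most Jenkins instances will also require authentication for this call. A sketch using the Credentials Binding plugin, assuming a username/API-token credential with the hypothetical id remote-jenkins-user exists and the remote job has "Trigger builds remotely" enabled:
withCredentials([usernameColonPassword(credentialsId: 'remote-jenkins-user', variable: 'JENKINS_AUTH')]) {
    // $JENKINS_AUTH expands to user:apiToken inside the shell step
    sh 'curl --user "$JENKINS_AUTH" "http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN"'
}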

How to run a Jenkins scripted pipeline job on a free slave?

I want to run a Jenkins scripted pipeline job on a specified slave at the moment when no other job is running on it.
After my job has started, no other jobs should be executed on this slave; they will have to wait until my job finishes.
All the tutorials I found let me run the job on the node when it is free, but did not protect me from other jobs launching on this node.
Could you tell me how I can do this?
Because Pipeline has two syntaxes, there are two ways to achieve that. For a scripted pipeline, please check the second one.
Declarative
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { label 'slave-node' }
            steps {
                echo 'Building..'
                sh '''
                '''
            }
        }
    }
    post {
        success {
            echo 'This will run only if successful'
        }
    }
}
Scripted
node('your-node') {
    try {
        stage 'Build'
        node('build-run-on-this-node') {
            sh ""
        }
    } catch (Exception e) {
        throw e
    }
}
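Note that neither snippet by itself keeps other jobs off the node while yours runs. One common way to get exclusive access (a sketch, assuming the Lockable Resources plugin is installed and a resource with the hypothetical name your-node-exclusive has been defined) is to wrap the work in a lock step; alternatively, configure the node with a single executor:
node('your-node') {
    // Blocks until the resource is free, then holds it for the duration.
    lock('your-node-exclusive') {
        stage('Build') {
            sh 'make build' // placeholder build command
        }
    }
}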

Create new Jenkins jobs using Pipeline Job and Groovy script

I have a Jenkins pipeline job with parameters (name, group, taskNumber).
I need to write a pipeline script which will call a groovy script (this one?: https://github.com/peterjenkins1/jenkins-scripts/blob/master/add-job.groovy).
I want to create a new job (with the name name_group_taskNumber) every time I build the main pipeline job.
I don't understand:
Where do I need to put my groovy script?
What should the pipeline script look like? :
node {
    stage('Build') {
        def pipeline = load "CreateJob.groovy"
        pipeline.run()
    }
}
You can use and configure a shared library like this one (a git repo): https://github.com/lvthillo/shared-library . You need to configure it in your Jenkins global configuration.
It contains a folder vars/. Here you can manage pipelines and groovy scripts, like my slackNotifier.groovy. That script is just a groovy script which posts the build result to Slack.
In the Jenkins pipeline job we import our shared library:
@Library('name-of-shared-pipeline-library') _
mavenPipeline {
    //define parameters
}
In the case above the pipeline itself also lives in the shared library, but this isn't necessary.
You can just write your pipeline in the job itself and call only the function from the shared library, like this:
This is the script in the shared library:
// vars/sayHello.groovy
def call(String name = 'human') {
    echo "Hello, ${name}."
}
And in your pipeline:
library 'my-shared-library'
...
stage('stage name') {
    echo "output"
    sayHello('Peter')
}
...
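Since the question is specifically about creating a new job on every build, it is worth noting that the Job DSL plugin also provides a jobDsl pipeline step, so the job can be generated straight from the pipeline without a shared library. A sketch, assuming the Job DSL plugin is installed and name, group and taskNumber are the build parameters from the question:
node {
    stage('Create job') {
        // The pipeline interpolates ${params...} before Job DSL runs,
        // so the generated job gets the literal name_group_taskNumber name.
        jobDsl scriptText: """
            pipelineJob('${params.name}_${params.group}_${params.taskNumber}') {
                definition {
                    cps {
                        script('node { echo "generated job" }')
                        sandbox()
                    }
                }
            }
        """
    }
}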
EDIT:
In new declarative pipelines you can use:
pipeline {
    agent { node { label 'xxx' } }
    options {
        buildDiscarder(logRotator(numToKeepStr: '3', artifactNumToKeepStr: '1'))
    }
    stages {
        stage('test') {
            steps {
                sh 'echo "execute say hello script:"'
                sayHello("Peter")
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}

def sayHello(String name = 'human') {
    echo "Hello, ${name}."
}
output:
[test] Running shell script
+ echo 'execute say hello script:'
execute say hello script:
[Pipeline] echo
Hello, Peter.
[Pipeline] }
[Pipeline] // stage
We do it by using the Jobcopy Builder plugin (https://wiki.jenkins.io/display/JENKINS/Jobcopy+Builder+plugin): try adding another build step in the pipeline script and pass the parameters which are to be considered.

Create a Build Pipeline View based on Pipeline job using Job DSL plugin in Jenkins

Since there's a limitation in Jenkins Pipeline that you cannot add a manual build step without hanging the build (see for example this stackoverflow question), I'm experimenting with a combination of Jenkins Pipeline and the Build Pipeline Plugin using the Job DSL plugin.
My plan was to create a Job DSL script that first executes the Jenkins Pipeline (defined in a Jenkinsfile) and then creates a downstream job that deploys to production (this is the manual step). I've created this Job DSL script as a test:
pipelineJob("${REPO_NAME} jobs") {
logRotator(-1, 10)
def repo = "https://path-to-repo/${REPO_NAME}.git"
triggers {
scm('* * * * *')
}
description("Pipeline for $repo")
definition {
cpsScm {
scm {
git {
remote { url(repo) }
branches('master')
scriptPath('Jenkinsfile')
extensions { } // required as otherwise it may try to tag the repo, which you may not want
}
}
}
}
publishers {
buildPipelineTrigger("${REPO_NAME} deploy to prod") {
parameters {
currentBuild()
}
}
}
}
freeStyleJob("${REPO_NAME} deploy to prod") {
}
buildPipelineView("$REPO_NAME Build Pipeline") {
selectedJob("${REPO_NAME} jobs")
}
where REPO_NAME is defined as an environment variable. The Jenkinsfile looks like this:
node {
    stage('build') {
        echo "building"
    }
    stage('run tests') {
        echo "running tests"
    }
    stage('package Docker') {
        echo "packaging"
    }
    stage('Deploy to Test') {
        echo "Deploying to Test"
    }
}
The problem is that the selectedJob points to "${REPO_NAME} jobs" which doesn't seem to be a valid option as "Initial Job" in the Build Pipeline Plugin view (you can't select it manually either).
Is there a workaround for this? I.e. how can I use a Jenkins Pipeline as the "Initial Job" for the Build Pipeline Plugin?
From the documentation at yourDomain.com/plugin/job-dsl/api-viewer/index.html#method/javaposse.jobdsl.dsl.views.NestedViewsContext.envDashboardView, it appears that buildPipelineView can only be used within a View block which is inside a Folder block:
Folder {
    View {
        buildPipelineView {
        }
    }
}
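In actual Job DSL syntax that nesting would look roughly like the sketch below; the view names are illustrative, and the nestedView method assumes the Nested View plugin is installed:
nestedView('ci') {
    views {
        buildPipelineView("$REPO_NAME Build Pipeline") {
            selectedJob("${REPO_NAME} jobs")
        }
    }
}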

Limiting Jenkins pipeline to running only on specific nodes

I'm building jobs that will be using Jenkins pipelines extensively. Our nodes are designated per project by their tags, but unlike regular jobs, the pipeline build does not seem to have the "Restrict where this project can be run" checkbox. How can I specify which node the pipeline will run on, the way I do for regular jobs?
You specify the desired node or tag when you do the node step:
node('specialSlave') {
    // Will run on the slave with name or tag specialSlave
}
See https://jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#node-allocate-node for an extended explanation of the arguments to node.
Edit 2019: This answer (and question) was made back in 2017; back then there was only one flavor of Jenkins pipeline, scripted pipeline. Since then declarative pipeline has been added. So the answer above is true for scripted pipeline; for answers regarding declarative pipeline, please see the other answers below.
Choosing a node which has the label X in a declarative pipeline:
pipeline {
    agent { label 'X' }
    ...
    ...
}
You can also combine multiple labels with the or (||) or the and (&&) operator.
Running the job on any node which has label X or label Y:
agent { label 'X || Y' }
Running the job only on nodes which have both labels:
agent { label 'X && Y' }
More in the Jenkins Pipeline reference guide.
ps: if you are reading this you have probably just started using Jenkins pipeline and you are not sure whether to use declarative or scripted pipeline. Short answer: it's better to start with declarative. From jenkins.io:
Declarative and Scripted Pipelines are constructed fundamentally differently. Declarative Pipeline is a more recent feature of Jenkins Pipeline which:
provides richer syntactical features over Scripted Pipeline syntax, and
is designed to make writing and reading Pipeline code easier.
To be clear: because Pipeline has two syntaxes, there are two ways to achieve that.
Declarative
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { label 'slave-node' }
            steps {
                echo 'Building..'
                sh '''
                '''
            }
        }
    }
    post {
        success {
            echo 'This will run only if successful'
        }
    }
}
Scripted
node('your-node') {
    try {
        stage 'Build'
        node('build-run-on-this-node') {
            sh ""
        }
    } catch (Exception e) {
        throw e
    }
}
Agent or node where we should not execute the Jenkins job:
This is the negation of the problem statement, i.e. a node where the job should not run.
It seemed the weirdest solution to me, but the issue has already been raised with the Jenkins community:
agent { label '!build-agent-name' }
If you need the entire Jenkins pipeline to run on a single node, use the following format:
pipeline {
    agent {
        label 'test1'
    }
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
If you need to execute each stage on a different node, use the format below:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                node("test1") {
                    echo 'Testing..'
                }
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
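The same per-stage targeting can also be written declaratively, with a stage-level agent instead of a scripted node step inside steps. A minimal sketch (label name illustrative):
pipeline {
    agent none
    stages {
        stage('Test') {
            agent { label 'test1' } // only this stage runs on the test1 node
            steps {
                echo 'Testing..'
            }
        }
    }
}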
