Multiple Jenkinsfiles, One Agent Label

I have a project which has multiple build pipelines to allow for different types of builds against it (no, I don't have the ability to make one build out of it; that is outside my control).
Each of these pipelines is represented by a Jenkinsfile in the project repo, and each one must use the same build agent label (they need to share other pieces of configuration as well, but the build agent label is the current problem). I'm trying to put the label into some sort of configuration file in the project repo, so that all the Jenkinsfiles can read it.
I expected this to be simple, since you don't need this config data until you have already checked out a copy of the sources to read the Jenkinsfile. As far as I can tell, it is impossible.
It seems that a Jenkinsfile cannot read files from SCM until the project has run its SCM step. However, that's too late: the argument to agent { label } is read before any stages get run.
Here's a minimal case:
final def config
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                checkout scm // we don't need all the submodules here
                echo "Reading configuration JSON"
                script { config = readJSON file: 'buildjobs/buildjob-config.json' }
                echo "Read configuration JSON"
            }
        }
        stage('Build and Deploy') {
            agent {
                label config.agent_label
            }
            steps {
                echo 'Got into Stage 2'
            }
        }
    }
}
When I run this, I get:
java.lang.NullPointerException: Cannot get property 'agent_label' on null object
I don't get either of the echoes from the 'Configure' stage.
If I change the label for the 'Build and Deploy' stage to 'master', the build succeeds and prints out all three echo statements.
Is there any way to read a file from the Git workspace before the agent labels need to be set?

Please see https://stackoverflow.com/a/52807254/7983309. I think you are running into this issue: label is unable to resolve config.agent_label to its updated value, so whatever is set in the first line is what gets sent to your second stage.
EDIT1:
env.agentName = ''
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                script {
                    env.agentName = 'slave'
                    echo env.agentName
                }
            }
        }
        stage('Finish') {
            steps {
                node(agentName as String) { println env.agentName }
                script {
                    echo agentName
                }
            }
        }
    }
}
Source - In a declarative Jenkins pipeline - can I set the agent label dynamically?
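
A related pattern that often gets suggested for exactly this problem is to resolve the label in a small scripted block before the pipeline block, since everything above pipeline { } is plain scripted Groovy and may allocate a node of its own. Below is a minimal sketch, assuming the same buildjobs/buildjob-config.json with an agent_label key (readJSON comes from the Pipeline Utility Steps plugin, which the question already uses); the trade-off is that the checkout happens twice, once in the preamble and once on the build agent:

def agentLabel

// Scripted preamble: runs before the declarative pipeline is evaluated,
// so the label is already known when agent { label ... } is read.
node('master') {
    checkout scm // fetch the repo so the config file is readable
    agentLabel = readJSON(file: 'buildjobs/buildjob-config.json').agent_label
}

pipeline {
    agent { label agentLabel } // resolved above, before any stage runs
    stages {
        stage('Build and Deploy') {
            steps {
                echo "Running on an agent labelled ${agentLabel}"
            }
        }
    }
}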

Related

How to return to the initial agent of a Jenkins matrix after a stage executed on another agent?

Is it possible, in a matrix cell of a Jenkins declarative pipeline, to execute a later stage on the pipeline's 'initial/main' agent after a previous stage of the same cell has been executed on another agent (identified by label)?
To put this into context, I want to build native binaries for different platforms in a Jenkins declarative pipeline using a matrix stage where each cell is responsible to
collect the native sources for that platform
build the native binaries from the sources for that platform
collect the just-built native binaries and distribute them to the platform-specific artefacts
Step two has to be performed on special agents, which are prepared to build the binaries for a particular platform and are identified by their label. Steps one and three have to be performed on the initial agent, the pipeline's 'main' agent, where the sources are checked out from SCM. In the end the native binaries are bundled together and distributed from the pipeline's initial/main agent. stash/unstash is used to transfer the sources and binaries between agents.
An exemplary, simplified pseudo-pipeline would look like:
pipeline {
    agent { label 'basic' }
    // Declarative SCM checkout configured in the multi-branch pipeline job-config
    stages {
        stage('Build binaries') {
            matrix {
                axes {
                    axis {
                        name 'PLATFORM'
                        values 'linux', 'windows'
                    }
                }
                stages {
                    stage('Collect sources') {
                        steps {
                            <Collect native's sources for ${PLATFORM}> in "${WORKSPACE}/native.sources.${PLATFORM}"
                            dir("native.sources.${PLATFORM}") {
                                stash "sources.${PLATFORM}"
                            }
                        }
                    }
                    stage('Build binaries') {
                        options { skipDefaultCheckout() }
                        agent { label "natives-${PLATFORM}" }
                        steps {
                            unstash "sources.${PLATFORM}"
                            <Build native binaries from unstashed sources into 'libs' folder>
                            dir('libs') {
                                stash "binaries.${PLATFORM}"
                            }
                        }
                    }
                    stage('Collect and distribute binaries') {
                        agent {
                            <initial/pipeline-agent>
                        }
                        steps {
                            dir("libs.${PLATFORM}") {
                                unstash "binaries.${PLATFORM}"
                            }
                        }
                    }
                }
            }
        }
        stage('Bundle and distribute') {
            ...
        }
    }
}
But the question is, how do I tell Jenkins to execute the third stage of the matrix on the initial/pipeline agent again?
If I simply don't specify an agent for the third stage, the execution is:
Stage on Pipeline-Agent
Stage on Native-Build-Agent
Stage on Native-Build-Agent
but I want:
Stage on Pipeline-Agent
Stage on Native-Build-Agent
Stage on Pipeline-Agent
In the syntax reference I didn't find an agent parameter like agent { <initial/pipeline-agent> }:
https://www.jenkins.io/doc/book/pipeline/syntax/#agent
https://www.jenkins.io/doc/book/pipeline/syntax/#matrix-cell-directives
The agent section describes a boolean option reuseNode, but it is only "valid for docker and dockerfile".
The only workaround I found so far was to define a second matrix and move the execution of the third stage to it. This works as expected and the stage is executed on the pipeline agent, but it has the drawback that the matrix stage has to be specified twice, as do its when conditions.
Appendix
The problem probably also exists when using per-stage agents in an ordinary linear pipeline.
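
One alternative to the double-matrix workaround (a sketch, not a documented agent option) is to record the main agent's node name before the matrix starts and allocate that node explicitly in the third cell stage with a scripted node step, since Jenkins accepts a node's name as a label expression. You are only guaranteed the same machine, not the same workspace, so the unstash stays necessary:

pipeline {
    agent { label 'basic' }
    stages {
        stage('Record main agent') {
            steps {
                script {
                    env.MAIN_NODE = env.NODE_NAME // node names double as label expressions
                }
            }
        }
        stage('Build binaries') {
            matrix {
                // ... axes and the first two cell stages as above ...
                stages {
                    stage('Collect and distribute binaries') {
                        steps {
                            // no agent directive; allocate the recorded node instead
                            node(env.MAIN_NODE) {
                                dir("libs.${PLATFORM}") {
                                    unstash "binaries.${PLATFORM}"
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}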

passing Jenkins env variables between stages on different agents

I've looked at this Pass Artifact or String to upstream job in Jenkins Pipeline and this Pass variables between Jenkins stages and this How do I pass variables between stages in a declarative Jenkins pipeline?, but none of these questions seem to deal with my specific problem.
Basically I have a pipeline consisting of multiple stages, each running on its own agent.
In the first stage I run a shell script in which two variables are generated. I would like to use these variables in the next stage. The methods I've seen so far only seem to work when passing variables within the same agent.
pipeline {
    agent none
    stages {
        stage("stage 1") {
            agent {
                docker {
                    image 'my_image:latest'
                }
            }
            steps {
                sh '''
                    export VAR1=foo
                    export VAR2=bar
                '''
            }
        }
        stage("stage 2") {
            agent {
                docker {
                    image 'my_other_image:latest'
                }
            }
            steps {
                sh 'echo "$VAR1 $VAR2"'
                // expecting to see "foo bar" printed here
            }
        }
    }
}
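
A pattern that usually solves this (a sketch, not part of the original question) is to capture the values into env on the Groovy side: entries set on env in a script block persist across stages and agent switches, while a shell export dies with the shell that ran it:

pipeline {
    agent none
    stages {
        stage('stage 1') {
            agent { docker { image 'my_image:latest' } }
            steps {
                script {
                    // capture the shell output into env; unlike `export`,
                    // env.* survives into later stages on other agents
                    env.VAR1 = sh(returnStdout: true, script: 'echo foo').trim()
                    env.VAR2 = sh(returnStdout: true, script: 'echo bar').trim()
                }
            }
        }
        stage('stage 2') {
            agent { docker { image 'my_other_image:latest' } }
            steps {
                sh 'echo "$VAR1 $VAR2"' // prints "foo bar"
            }
        }
    }
}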

How to make Jenkins execute pipeline steps from the remote root directory?

I created a simple pipeline in Jenkins. The remote root directory of my agent is set to my project root path. But when I check where I am during the build (e.g. with a step like sh 'pwd'), I see that my steps are executed from the $WORKSPACE directory (/path_to_remote_root_directory_of_the_agent/workspace/jenkins_project_title). That means I can start neither my unit tests, like sh 'vendor/bin/phpunit ./test/Unit', nor other tasks that I usually run from the project root folder.
I'm pretty sure that I simply configured something incorrectly and that normally scripts like this
pipeline {
    agent {
        label 'devvm-slave-01'
    }
    stages {
        stage('Prepare') {
            steps {
                sh 'composer install'
                ...
            }
        }
        ...
        stage('Checkstyle') {
            steps {
                sh 'vendor/bin/phpcs --report=checkstyle --report-file=`pwd`/build/logs/checkstyle.xml --standard=PSR2 --extensions=php --ignore=autoload.php --ignore=vendor/ . || exit 0'
                checkstyle pattern: 'build/logs/checkstyle.xml'
            }
        }
    }
}
work as expected without any crude workarounds for paths.
What am I doing wrong, and how do I get it working correctly?
From the "agent" section of the "Pipeline Syntax" chapter of the Jenkins Handbook:
Parameters
node
agent { node { label 'labelName' } } behaves the same as agent { label 'labelName' }, but node allows for additional options (such as customWorkspace).
So the solution is to use node with its customWorkspace option:
pipeline {
    agent {
        node {
            label 'devvm-slave-01'
            customWorkspace '/path/to/my/project'
        }
    }
    ...
}

Create a Build Pipeline View based on Pipeline job using Job DSL plugin in Jenkins

Since there's a limitation in Jenkins Pipeline that you cannot add a manual build step without hanging the build (see for example this Stack Overflow question), I'm experimenting with a combination of Jenkins Pipeline and the Build Pipeline Plugin using the Job DSL plugin.
My plan was to create a Job DSL script that first executes the Jenkins Pipeline (defined in a Jenkinsfile) and then creates a downstream job that deploys to production (this is the manual step). I've created this Job DSL script as a test:
pipelineJob("${REPO_NAME} jobs") {
    logRotator(-1, 10)
    def repo = "https://path-to-repo/${REPO_NAME}.git"
    triggers {
        scm('* * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master')
                    scriptPath('Jenkinsfile')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
            }
        }
    }
    publishers {
        buildPipelineTrigger("${REPO_NAME} deploy to prod") {
            parameters {
                currentBuild()
            }
        }
    }
}

freeStyleJob("${REPO_NAME} deploy to prod") {
}

buildPipelineView("$REPO_NAME Build Pipeline") {
    selectedJob("${REPO_NAME} jobs")
}
where REPO_NAME is defined as an environment variable. The Jenkinsfile looks like this:
node {
    stage('build') {
        echo "building"
    }
    stage('run tests') {
        echo "running tests"
    }
    stage('package Docker') {
        echo "packaging"
    }
    stage('Deploy to Test') {
        echo "Deploying to Test"
    }
}
The problem is that selectedJob points to "${REPO_NAME} jobs", which doesn't seem to be a valid option as "Initial Job" in the Build Pipeline Plugin view (you can't select it manually either).
Is there a workaround for this? I.e. how can I use a Jenkins Pipeline as the "Initial Job" for the Build Pipeline Plugin?
From the documentation at yourDomain.com/plugin/job-dsl/api-viewer/index.html#method/javaposse.jobdsl.dsl.views.NestedViewsContext.envDashboardView, it appears that buildPipelineView can only be used within a View block that is inside a Folder block:
Folder {
    View {
        buildPipelineView {
        }
    }
}
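
Applied to the script above, that would give something like the following sketch. The folder name ci and the views nesting are my assumptions based on the Job DSL API viewer, so verify them against your Job DSL version:

// Hypothetical folder 'ci' holding both the jobs and the view
folder('ci') {
    views {
        buildPipelineView("$REPO_NAME Build Pipeline") {
            selectedJob("${REPO_NAME} jobs") // resolved relative to the folder
        }
    }
}

pipelineJob("ci/${REPO_NAME} jobs") {
    // ... same definition and publishers blocks as in the original script ...
}

freeStyleJob("ci/${REPO_NAME} deploy to prod") {
}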

Can I "import" the stages in a Jenkins Declarative pipeline

I have several pipeline jobs, which are configured very similarly.
They all have the same stages (of which there are about 10).
I am now thinking about moving to the declarative pipeline (https://jenkins.io/blog/2016/09/19/blueocean-beta-declarative-pipeline-pipeline-editor/).
But I do not want to define the ~10 stages in every pipeline. I want to define them at one place, and "import" them somehow.
Is this possible with declarative pipelines at all? I see that there are Libraries, but it does not seem like I could include the stage definition using them.
You will have to create a shared library to implement what I am about to suggest. For the shared-library implementation, you may check the following posts:
Using Building Blocks in Jenkins Declarative Pipeline
Upload file in Jenkins input step to workspace (Mainly for images so one can easily figure out things)
Now if you want to use a Jenkinsfile (kind of a template) which can be reused across multiple projects (jobs), then that is indeed possible.
Once you have created a shared-library repository with vars directory in it, then you just have to create a Groovy file (let's say, commonPipeline.groovy) inside vars directory.
Here's an example that works; I have used it in multiple jobs.
$ cat shared-lib/vars/commonPipeline.groovy
// You can create function(s) as shown below, if required
def someFunctionA() {
    // Your code
}

// This is where you will define all the stages that you want
// to run as a whole in multiple projects (jobs)
def call(Map config) {
    pipeline {
        agent {
            node { label 'slaveA || slaveB' }
        }
        environment {
            myvar_Y = 'apple'
            myvar_Z = 'orange'
        }
        stages {
            stage('Checkout') {
                steps {
                    deleteDir()
                    checkout scm
                }
            }
            stage('Build') {
                steps {
                    script {
                        check_something = someFunctionA()
                        if (check_something) {
                            echo "Build!"
                            // your_build_code
                        } else {
                            error "Something bad happened! Exiting..."
                        }
                    }
                }
            }
            stage('Test') {
                steps {
                    echo "Running tests..."
                    // your_test_code
                }
            }
            stage('Deploy') {
                steps {
                    script {
                        sh '''
                            # your_deploy_code
                        '''
                    }
                }
            }
        }
        post {
            failure {
                sh '''
                    # anything_you_need_to_perform_in_failure_step
                '''
            }
            success {
                sh '''
                    # anything_you_need_to_perform_in_success_step
                '''
            }
        }
    }
}
With the above Groovy file in place, all you have to do now is call it from your various Jenkins projects. Since you might already have an existing Jenkinsfile (if not, create one) in your Jenkins project, you just have to replace its content with the following:
$ cat Jenkinsfile
// Assuming you have named your shared library `my-shared-lib` and set `Default version` to the `master` branch in the
// `Manage Jenkins` » `Configure System` » `Global Pipeline Libraries` section
@Library('my-shared-lib@master') _

def params = [:]
params = [
    jenkins_var: "${env.JOB_BASE_NAME}",
]
commonPipeline params
Note: As you can see above, I am calling the commonPipeline.groovy file. So all your bulky Jenkinsfiles get reduced to just five or six lines of code, and those few lines are common across all those projects. Also note that I have used jenkins_var above. It can be any name; it's not actually used here, but a parameter map is required for the pipeline to run. Some Groovy expert can clarify that part.
Ref: https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/
