I want to be able to override default values per build, via "Run with parameters".
Currently, to enable someone to override environment variables on the agent, I have this in my Jenkinsfile...
pipeline {
    parameters {
        string(name: 'build_tsc', defaultValue: '', description: 'Override the path to the tsc executable')
    }
    stages {
        stage('Compiling') {
            steps {
                script {
                    if (params.build_tsc) {
                        echo "Compiling with tsc override: ${params.build_tsc}"
                        bat "${params.build_tsc}"
                    } else if (!env.JENKINS_TSC) {
                        error("tsc not set on agent")
                    } else {
                        echo "Compiling with agent tsc: ${env.JENKINS_TSC}"
                        bat "${env.JENKINS_TSC}"
                    }
                }
            }
        }
    }
}
Is there a better way?
You can override the values on the Jenkins UI node configuration page:
${JENKINS_URL}/computer/{{NODE}}/configure
We do this where specific nodes have different configurations, and we have a Groovy job that updates them based on the labels we have assigned (e.g. WL1035 vs WL1036) whenever we add nodes or change the configuration.
You can also install the Slave Setup plugin to perform node configuration at launch time.
The above assumes you wish to override a global environment variable or tool configuration for all jobs on the node.
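For reference, a minimal sketch of what such a Groovy job might do (run from the Script Console or a system Groovy step). The label 'WL1036' is taken from the example above, the variable name JENKINS_TSC from the question's Jenkinsfile; the path is a placeholder, not a value from the original setup:
import jenkins.model.Jenkins
import hudson.slaves.EnvironmentVariablesNodeProperty

// Update JENKINS_TSC on every node carrying a given label (placeholder values).
Jenkins.instance.nodes.findAll { it.labelString.contains('WL1036') }.each { node ->
    def existing = node.nodeProperties.getAll(EnvironmentVariablesNodeProperty)
    if (existing.isEmpty()) {
        def entry = new EnvironmentVariablesNodeProperty.Entry('JENKINS_TSC', 'C:\\tools\\tsc.cmd')
        node.nodeProperties.add(new EnvironmentVariablesNodeProperty(entry))
    } else {
        existing.first().envVars.put('JENKINS_TSC', 'C:\\tools\\tsc.cmd')
    }
}
Jenkins.instance.save()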
Related
I need to pass environment variables to my executable and my unit tests. This works locally but not on Jenkins, where my environment variable gets reset between Gradle tasks:
task setupEnv(type: Exec) {
    commandLine 'export', "ABC=def"
}

test {
    dependsOn 'setupEnv'
    scanForTestClasses = false
    include '**/*Test.*'
}
Note: I'm simplifying here for SO (I'm aware of the environment command in Gradle), but even this simple example works locally and not on Jenkins, meaning the *Test.java files see nothing for System.getenv("ABC"). I'm looking for a way to stop Jenkins from resetting environment variables.
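For reference, a minimal sketch of the Gradle-side alternative the note alludes to: setting the variable on the test task itself, so it reaches the forked test JVM regardless of what the surrounding shell or a separate Exec task does (the value 'def' just mirrors the example above):
// build.gradle (Groovy DSL)
test {
    environment 'ABC', 'def'    // set on the forked test JVM itself
    scanForTestClasses = false
    include '**/*Test.*'
}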
There are two options for setting a dynamic environment variable in a pipeline. Either way, the variable will be available globally (to all stages in the pipeline):
Set it in a stage:
stage('set date') {
    steps {
        script {
            env.ANY_NAME_OF_THE_SCRIPT = sh(returnStdout: true, script: "date +%Y-%m-%d").trim()
        }
    }
}
Set it in the environment block:
pipeline {
    agent {
        label 'some_label'
    }
    environment {
        HELLO = """${sh(
            returnStdout: true,
            script: 'echo hello'
        ).trim()}"""
    }
}
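Either way, later stages can read the value back through env. A minimal sketch combining the environment block above with a stage that uses it (the label and stage name are just placeholders):
pipeline {
    agent { label 'some_label' }
    environment {
        HELLO = """${sh(returnStdout: true, script: 'echo hello').trim()}"""
    }
    stages {
        stage('use it') {
            steps {
                echo "HELLO is ${env.HELLO}"   // prints: HELLO is hello
            }
        }
    }
}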
I created a simple pipeline in Jenkins. The remote root directory of my agent is set to my project root path. But when I check where I am during the build (e.g. with a step like sh 'pwd'), I see that my steps are executed from the $WORKSPACE directory (/path_to_remote_root_directory_of_the_agent/workspace/jenkins_project_title). That means I cannot simply start my unit tests with sh 'vendor/bin/phpunit ./test/Unit', or any other task that I usually run from the project root folder.
I'm pretty sure that I simply configured something incorrectly and that normally scripts like this
pipeline {
    agent {
        label 'devvm-slave-01'
    }
    stages {
        stage('Prepare') {
            steps {
                sh 'composer install'
                ...
            }
        }
        ...
        stage('Checkstyle') {
            steps {
                sh 'vendor/bin/phpcs --report=checkstyle --report-file=`pwd`/build/logs/checkstyle.xml --standard=PSR2 --extensions=php --ignore=autoload.php --ignore=vendor/ . || exit 0'
                checkstyle pattern: 'build/logs/checkstyle.xml'
            }
        }
    }
}
work as expected, without any crude workarounds for paths.
What am I doing wrong, and how do I get it working correctly?
From the "agent" section of the "Pipeline Syntax" chapter of the Jenkins Handbook:
Parameters
node
agent { node { label 'labelName' } } behaves the same as agent { label 'labelName' }, but node allows for additional options (such as customWorkspace).
So the solution is to use node with its customWorkspace option:
pipeline {
    agent {
        node {
            label 'devvm-slave-01'
            customWorkspace '/path/to/my/project'
        }
    }
    ...
}
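Combined with the stages from the question, that might look roughly like this (same label and commands as above; the workspace path is still a placeholder):
pipeline {
    agent {
        node {
            label 'devvm-slave-01'
            customWorkspace '/path/to/my/project'   // steps below now run from the project root
        }
    }
    stages {
        stage('Prepare') {
            steps {
                sh 'composer install'
            }
        }
        stage('Unit tests') {
            steps {
                sh 'vendor/bin/phpunit ./test/Unit'  // no path workaround needed
            }
        }
    }
}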
In a Jenkins pipeline master-slave environment with multiple slave types, how do you force a post-build task to run on the same slave, and in the same workspace, where a previous stage was executed?
For example, in the following pipeline snippet, three different slaves are used. The "DynamicServer Creation" stage is executed on a "miscslave" agent, where we run a "vagrant up" command. The next two stages are executed on different slaves, 'performance_slave' and 'seleniumslave'. Once the tests have run against the dynamic servers, we need to destroy them by running "vagrant destroy" as a post-stage task. However, it needs to run from the same "miscslave" workspace where "vagrant up" was executed, because that is where the ".vagrant" directory with the dynamic server machine info was created, and that info is required to run the destroy.
How can we force the pipeline to execute this post-build task in the same workspace where "DynamicServer Creation" was executed?
pipeline {
    agent { label 'miscslave' }
    stages {
        stage('Stage 1') { }
        stage("DynamicServer Creation") {
            agent {
                label 'miscslave'
            }
            stages {
                stage('DynamicServer Creation') {
                    // Create the dynamic server using vagrant up; this creates a .vagrant dir holding the created machine info
                }
                stage('DynamicServer Test') {
                    parallel {
                        stage("Execute Performance Tests") {
                            agent { label 'performance_slave' }
                        }
                        stage("Execute UI Tests") {
                            agent { label 'seleniumslave' }
                        }
                    }
                }
            }
            post {
                always {
                    // Delete the dynamic server using vagrant destroy. It has to run in the same workspace where "vagrant up" was executed in the "DynamicServer Creation" stage
                }
            }
        }
    }
}
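One common approach (a sketch, not from the original post) is to record the node name and workspace path while "DynamicServer Creation" runs, then reuse them with the node and dir steps in the post block. The parallel test stages are omitted for brevity, and the variable names are illustrative:
def creationNode
def creationWorkspace

pipeline {
    agent none
    stages {
        stage('DynamicServer Creation') {
            agent { label 'miscslave' }
            steps {
                script {
                    creationNode = env.NODE_NAME        // remember where vagrant up ran
                    creationWorkspace = env.WORKSPACE   // ...and in which workspace
                }
                sh 'vagrant up'
            }
        }
    }
    post {
        always {
            script {
                if (creationNode) {
                    node(creationNode) {                // go back to the same agent
                        dir(creationWorkspace) {        // ...and the workspace holding the .vagrant dir
                            sh 'vagrant destroy -f'
                        }
                    }
                }
            }
        }
    }
}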
I have a project which has multiple build pipelines to allow for different types of builds against it (no, I don't have the ability to make one build out of it; that is outside my control).
Each of these pipelines is represented by a Jenkinsfile in the project repo, and each one must use the same build agent label (they need to share other pieces of configuration as well, but it's the build agent label which is the current problem). I'm trying to put the label into some sort of a configuration file in the project repo, so that all the Jenkinsfiles can read it.
I expected this to be simple, as you don't need this config data until you have already checked out a copy of the sources to read the Jenkinsfile. As far as I can tell, it is impossible.
It seems to me that a Jenkinsfile cannot read files from SCM until the project has done its SCM step. However, that's too late: the argument to agent{label} is read before any stages get run.
Here's a minimal case:
final def config

pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                checkout scm // we don't need all the submodules here
                echo "Reading configuration JSON"
                script { config = readJSON file: 'buildjobs/buildjob-config.json' }
                echo "Read configuration JSON"
            }
        }
        stage('Build and Deploy') {
            agent {
                label config.agent_label
            }
            steps {
                echo 'Got into Stage 2'
            }
        }
    }
}
When I run this, I get:
java.lang.NullPointerException: Cannot get property 'agent_label' on null object
I don't get either of the echoes from the 'Configure' stage.
If I change the label for the 'Build and Deploy' stage to 'master', the build succeeds and prints out all three echo statements.
Is there any way to read a file from the Git workspace before the agent labels need to be set?
Please see https://stackoverflow.com/a/52807254/7983309. I think you are running into this issue: label is unable to resolve config.agent_label to its updated value, so whatever is set on the first line is what gets passed to your second stage.
EDIT1:
env.agentName = ''

pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                script {
                    env.agentName = 'slave'
                    echo env.agentName
                }
            }
        }
        stage('Finish') {
            steps {
                node(agentName as String) { println env.agentName }
                script {
                    echo agentName
                }
            }
        }
    }
}
Source - In a declarative jenkins pipeline - can I set the agent label dynamically?
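Applied to the question above, the pattern from the linked answer would look roughly like this (a sketch: it reads the same buildjobs/buildjob-config.json and agent_label key, and assumes the Pipeline Utility Steps plugin for readJSON):
def agentLabel

// Scripted preamble: resolve the label before the declarative block is evaluated.
node('master') {
    checkout scm
    def config = readJSON file: 'buildjobs/buildjob-config.json'
    agentLabel = config.agent_label
}

pipeline {
    agent { label "${agentLabel}" }
    stages {
        stage('Build and Deploy') {
            steps {
                echo "Running on a node labelled ${agentLabel}"
            }
        }
    }
}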
I want to set some Jenkins environment variables at run time, based on my own computation. How can I set them at run time in my Jenkinsfile's steps section?
For example: based on my calculation I get abc=1. How can I set this at run time in my Jenkinsfile's steps section so that I can use it later by referencing $abc?
I am declaring my pipeline and environment variables as explained here:
https://jenkins.io/doc/pipeline/tour/environment/
I'm using Jenkins ver. 2.41.
Here is an example of how to set a variable and use it in the same Jenkinsfile.
The variable versionToDeploy will be used by the build job step.
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'build the artifacts'
                script {
                    versionToDeploy = '2.3.0'
                }
            }
        }
    }
    post {
        success {
            echo 'start deploy job'
            build job: 'pipeline-declarative-multi-job-deploy', parameters: [[$class: 'StringParameterValue', name: 'version', value: versionToDeploy]]
        }
    }
}
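For the $abc case from the question, an alternative is to assign to env.* in a script block; later stages (and shell steps) then see it as a normal environment variable. A minimal sketch, where the value is a placeholder for whatever you compute:
pipeline {
    agent any
    stages {
        stage('Compute') {
            steps {
                script {
                    env.abc = '1'            // result of the run-time computation
                }
            }
        }
        stage('Use') {
            steps {
                sh 'echo "abc is $abc"'      // the shell expands $abc from the environment
            }
        }
    }
}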