I'm trying to pass environment variables from the environment section to the agent section in a Jenkinsfile; please note the agent will be a Docker container. Please find the code snippet below:
pipeline {
    environment {
        DOCKER_REGISTRY = "my-docker-registry"
        DOCKER_IMAGE = "my-docker-image-build"
        BUILD_LABEL = "my-server-to-run-container"
    }
    agent {
        docker {
            label "${env.BUILD_LABEL}"
            image "${env.DOCKER_IMAGE}"
        }
    }
    stages {
        stage('sample-print-env-variables') {
            steps {
                echo "This is the BUILD Label ${env.BUILD_LABEL}"
                echo "This is the Docker Image ${env.DOCKER_IMAGE}"
                echo "This is the Artifactory URL ${env.ARTIFACTORY_URL}"
            }
        }
    }
}
The above snippet, when tried on a Jenkins instance (running version 2.319), yields the result below:
[Pipeline] isUnix
[Pipeline] withEnv
[Pipeline] {
[Pipeline] sh
+ docker inspect -f . null
Error: No such object: null
[Pipeline] isUnix
[Pipeline] withEnv
[Pipeline] {
[Pipeline] sh
+ docker pull null
Using default tag: latest
Based on the Jenkins documentation on declarative syntax, the semantics look correct; I also used the Jenkins linter to confirm there are no errors in the Jenkinsfile.
Any suggestions or inputs to resolve the error are greatly appreciated.
Thanks
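For context: the literal null in docker pull null suggests the agent block is evaluated before the environment block has been applied, so env.DOCKER_IMAGE does not exist yet at that point. A minimal workaround sketch, assuming plain Groovy variables declared above the pipeline block (my assumption, not a confirmed fix for this exact Jenkins version):

// Hypothetical workaround: define the values as Groovy variables outside the
// pipeline block so they already exist when the agent block is evaluated.
def buildLabel = "my-server-to-run-container"
def dockerImage = "my-docker-image-build"

pipeline {
    agent {
        docker {
            label buildLabel
            image dockerImage
        }
    }
    stages {
        stage('sample-print-env-variables') {
            steps {
                echo "This is the BUILD Label ${buildLabel}"
                echo "This is the Docker Image ${dockerImage}"
            }
        }
    }
}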
I'm trying to use the "JOB_BASE_NAME" Jenkins environment variable in a parameter's path in a pipeline script; the parameter gets set while building the project.
example: string(defaultValue: "/abc/test/workspace/test_${JOB_BASE_NAME}/sample", description: 'test', name: 'HOME')
but while executing, the ${JOB_BASE_NAME} is not getting replaced by its value (the Jenkins job name). I'm unsure if I'm setting the Jenkins environment variable in the parameter's path correctly.
thank you!
I have replicated your use case and it works for me. This is the section of code:
node {
    stage('test') {
        sh "echo ${HOME}"
    }
}
and this is the output (my job name was stackoverflow):
[Pipeline] {
[Pipeline] stage
[Pipeline] { (test)
[Pipeline] sh
+ echo /abc/test/workspace/test_stackoverflow/sample
/abc/test/workspace/test_stackoverflow/sample
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
See the picture of how I set the String parameter.
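For completeness, here is a sketch of the same idea written entirely in a scripted Jenkinsfile, declaring the parameter with the properties step instead of through the job's UI (my assumption; the original answer configured the parameter in the UI, as the picture showed):

// Sketch: declare the string parameter in the script via the properties step,
// interpolating env.JOB_BASE_NAME in a double-quoted Groovy string, then read
// the parameter back explicitly through params.
properties([
    parameters([
        string(defaultValue: "/abc/test/workspace/test_${env.JOB_BASE_NAME}/sample",
               description: 'test',
               name: 'HOME')
    ])
])

node {
    stage('test') {
        echo "Parameter value: ${params.HOME}"
    }
}

Note that properties() rewrites the job configuration at run time, so the interpolated default takes effect from the next build onward.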
I'm trying to set up a Jenkins pipeline (using the declarative syntax) that runs unit and feature tests on two separate, on-demand AWS EC2 instances. The pipeline works perfectly when run on a single instance and without the parallel stages. As soon as I switch to parallel stages, any shell script fails with this cryptic message:
process apparently never started in
/home/admin/workspace/GSWebRuby_Test@tmp/durable-b0d8c4b4 (running
Jenkins temporarily with
-Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true
might make the problem clearer)
I've googled extensively and came across several bug reports for the Durable Task plugin, which appears to be responsible for this message. I'm using the latest version of the plugin, v1.33, and none of the reported problems seem to apply to my case, e.g. failures on unusual architectures or when running Docker containers. I've also downgraded and re-upgraded the plugin (toggling between versions 1.30 and 1.33). Also, to reiterate, sh commands work without issue when I don't use the parallel stages.
I've created a simplified pipeline to debug the problem. Note that the shell commands are also simple, e.g. "env | sort", or "pwd".
pipeline {
    agent none
    environment {
        DB_USER = credentials('db-user')
        DB_PASS = credentials('db-pass')
    }
    stages {
        stage('Setup') {
            failFast false
            parallel {
                stage('foo') {
                    agent {
                        label 'jenkins-slave-ondemand'
                    }
                    steps {
                        echo 'In stage foo'
                        sh 'env|sort'
                    }
                }
                stage('bar') {
                    agent {
                        label 'jenkins-slave-ondemand'
                    }
                    steps {
                        echo 'In stage bar'
                        sh 'pwd'
                    }
                }
            }
        }
    }
}
This is the console output:
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] withCredentials
Masking supported pattern matches of $DB_PASS or $DB_USER
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Setup)
[Pipeline] parallel
[Pipeline] { (Branch: foo)
[Pipeline] { (Branch: bar)
[Pipeline] stage
[Pipeline] { (foo)
[Pipeline] stage
[Pipeline] { (bar)
[Pipeline] node
[Pipeline] node
Still waiting to schedule task
All nodes of label ‘jenkins-slave-ondemand’ are offline
Still waiting to schedule task
All nodes of label ‘jenkins-slave-ondemand’ are offline
Running on EC2 (Jenkins AWS EC2) - Jenkins slave (i-0982299c572100c71) in /home/admin/workspace/GSWebRuby_Test
[Pipeline] {
[Pipeline] echo
In stage foo
[Pipeline] sh
Running on EC2 (Jenkins AWS EC2) - Jenkins slave (i-092ecac8e6c257270) in /home/admin/workspace/GSWebRuby_Test
[Pipeline] {
[Pipeline] echo
In stage bar
[Pipeline] sh
process apparently never started in /home/admin/workspace/GSWebRuby_Test@tmp/durable-b0d8c4b4
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch foo
process apparently never started in /home/admin/workspace/GSWebRuby_Test@tmp/durable-b6cfcff9
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch bar
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] End of Pipeline
ERROR: script returned exit code -2
Finished: FAILURE
Am I doing something wrong in the way I've set up the pipeline? Any pointers would be greatly appreciated.
Edit:
After setting -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true in JENKINS_JAVA_OPTIONS, I see this additional output:
In stage bar
[Pipeline] sh
nohup: failed to run command 'sh': No such file or directory
process apparently never started in /home/admin/workspace/GSWebRuby_Test@tmp/durable-099a2e56
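That nohup diagnostic points at the shell not being found on the agent's PATH. A minimal diagnostic sketch (my addition, assuming the same label as the question): print the PATH the agent reports without spawning a shell, since the sh step itself is what fails to launch.

pipeline {
    agent { label 'jenkins-slave-ondemand' }
    stages {
        stage('Debug PATH') {
            steps {
                // The echo step does not go through the durable-task shell
                // wrapper, so it still works when sh cannot be found.
                echo "Agent PATH: ${env.PATH}"
            }
        }
    }
}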
I am trying to build a job with a pipeline on my other slave connected to the master.
The pipeline is like this:
pipeline {
    agent {
        label "virtual"
    }
    stages {
        stage("test one") {
            steps {
                echo " test test test"
            }
        }
        stage("test two") {
            steps {
                echo " testttttttttt "
            }
        }
    }
}
The syntax doesn't give an error, but it doesn't build on my slave server.
However, when I run a freestyle job with "Restrict where this project can be run" set to that label and an "Execute shell" step of echo "test test", it is built on my slave server.
What is wrong with my pipeline? Am I missing something?
After the build:
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on virtual in /home/virtual/jenkins/workspace/demoo
[Pipeline] {
[Pipeline] stage
[Pipeline] { (test one)
[Pipeline] echo
test test test
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (test two)
[Pipeline] echo
testttttttttt
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Add the path you want in the Remote root directory (yellow column) as shown below:
The build works as you already have it; the steps will be executed on the slave. If you add something like cloning a repository to your steps, your workspace directory will be created.
Pipeline and freestyle jobs work differently here. A freestyle job creates its directory in the workspace as soon as it runs for the first time. A pipeline job creates the directory as soon as it needs it.
I created a simple Pipeline:
pipeline {
    agent {
        label "linux"
    }
    stages {
        stage("test one") {
            steps {
                sh "echo 'test test test' > text.txt"
            }
        }
    }
}
I converted your echo to a sh command because my slave is a Linux slave. The sh step creates a text.txt file. As soon as I run this job, the directory is created:
[<user>@<server> test-pipeline]$ pwd
/var/lib/jenkins/workspace/test-pipeline
[<user>@<server> test-pipeline]$ ls -l
total 4
-rw-r----- 1 <user> <group> 15 Oct 7 16:49 text.txt
I'm trying to set up a Jenkinsfile to run our CI pipeline. One of the steps will involve collecting files from across our directory tree and copying them into a single directory, for zipping up.
I'm attempting to do this using the Jenkins sh step with glob patterns, but I can't seem to get it to work.
A simple example Jenkinsfile would be:
pipeline {
    agent any
    stages {
        stage('List with Glob') {
            steps {
                sh 'ls **/*.xml'
            }
        }
    }
}
I would expect that to list any .xml files in the workspace, but instead I receive:
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (List with Glob)
[Pipeline] sh
[jenkinsfile-pipeline] Running shell script
+ ls '**/*.xml'
ls: cannot access **/*.xml: No such file or directory
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE
I think I'm missing something with Groovy string interpolation, but I need some help for this specific case (running in a Jenkins pipeline via a Jenkinsfile).
Any help much appreciated!
As far as I can tell, **/*.xml isn't a valid glob pattern (see this). Instead, what you have there is an Ant naming pattern, which, as far as I know, isn't supported by bash (or sh). What you want to do instead is use find:
pipeline {
    agent any
    stages {
        stage('List with find') {
            steps {
                sh "find . -type f -name '*.xml'"
            }
        }
    }
}
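Alternatively, if the Pipeline Utility Steps plugin is installed (an assumption; the question doesn't mention it), its findFiles step does accept Ant-style patterns like the one you tried:

pipeline {
    agent any
    stages {
        stage('List with findFiles') {
            steps {
                script {
                    // findFiles comes from the Pipeline Utility Steps plugin
                    // and understands Ant-style glob patterns.
                    def files = findFiles(glob: '**/*.xml')
                    for (f in files) {
                        echo f.path
                    }
                }
            }
        }
    }
}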
I'm not a Jenkins guru so please be patient. :-)
I have a pipeline, something nearly as simple as this:
def hash = ''

node {
    stage('Checkout') {
        …
    }
    stage('Build') {
        …
    }
    stage('Tests') {
        …
    }
}

stage('Ask deploy') {
    input 'Deploy?'
}

node {
    stage('Deploy') {
    }
}
I want to set the value of the hash variable in the first node and read it in the next if the manual input is positive. Is this possible and safe? Is this the correct approach?
Note that there are multiple executors and manual input involved. In the Jenkins docs it is hinted for a node that:
As soon as an executor is free on a node, the steps will run.
This means that the two nodes may run in different executors, correct? Do they still share the same global variables? Thanks in advance for any clarifications!
If you have multiple slaves in Jenkins, the pipeline will be launched on one of those slaves, and every slave is different.
Every stage in your pipeline will be launched on the same slave, so if you set the hash variable at the first line of your pipeline you won't have a problem reading it anywhere in the pipeline. However, if you need to access this variable's value from a different build, you cannot.
If you need a global variable that can be read in different builds, you can define one using the Global Variables String Parameter Plugin.
The hash variable is global and its value is available in the different executors, which seems logical to me. So it looks like what I do is OK and it will work this way unless I'm missing something.
Here is how I've verified that (details skipped for brevity):
I've created a similar pipeline and killed the executor which ran the first node:
def gitHash

node {
    withCredentials(...) {
        // Step 1:
        // Check out from the SCM
        stage('Prepare') {
            echo "Checking out the project from source control.."
            scmInfo = checkout scm
            gitHash = scmInfo.GIT_COMMIT
            echo "Project checked out, the GIT hash of the last commit is: ${gitHash}"
        }
    }
}

stage('Ask deploy') {
    input 'Deploy?'
}

node {
    withCredentials(...) {
        stage('Deploy') {
            echo "TODO, hash ${gitHash}"
        }
    }
}
The output from Jenkins is the following (details skipped):
Obtained Jenkinsfile from 7adc4bb98524b31de93e0c1ae16bf967ca3df47c
Running on jnlp-13775fa128a47 in /root/workspace/...
[Pipeline] {
[Pipeline] withCredentials
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Prepare)
[Pipeline] echo
Project checked out, the GIT hash of the last commit is: 7adc4bb98524b31de93e0c1ae16bf967ca3df47c
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Ask deploy)
[Pipeline] input
Deploy?
Proceed or Abort
Approved by admin
[Pipeline] }
[Pipeline] // stage
[Pipeline] node
Running on jnlp-1383bdf520c9d in /root/workspace/...
[Pipeline] {
[Pipeline] withCredentials
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Deploy)
[Pipeline] echo
TODO, hash 7adc4bb98524b31de93e0c1ae16bf967ca3df47c
[Pipeline] End of Pipeline
Finished: SUCCESS
As seen, the first node runs on executor jnlp-13775fa128a47 and the second on jnlp-1383bdf520c9d, but the value of the globally scoped variable can be read in both.
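One closing note (my addition, not from the original answers): script-level variables such as gitHash survive across node blocks because the pipeline's Groovy state is kept on the controller, but workspace files do not follow the build from one node to another. For files, the stash/unstash steps are the analogous mechanism; a minimal sketch with a hypothetical build/output.txt artifact:

node {
    stage('Build') {
        // Hypothetical artifact; the workspace itself is not shared with
        // whatever node the next block lands on.
        sh 'mkdir -p build && echo artifact > build/output.txt'
        stash name: 'build-output', includes: 'build/**'
    }
}

node {
    stage('Deploy') {
        unstash 'build-output'  // restores the stashed files into this node's workspace
        sh 'cat build/output.txt'
    }
}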