I have a very simple script to test running inside a Docker container. The container starts and I can connect to it.
node('docker') {
    docker.image('python:3').inside() {
        sh "python --version"
    }
}
In the end the job fails. Any ideas what is wrong?
Update 1:
I have added the environment variable to Jenkins and now see the following. It looks like some strange variables are being passed to docker.
Any idea how I can examine the command that the sh step actually runs?
[Pipeline] stage
[Pipeline] { (test)
[Pipeline] echo
I'm here
[Pipeline] sh
invalid argument "=" for "-e, --env" flag: invalid environment variable: =
See 'docker exec --help'.
process apparently never started in /var/lib/jenkins-slave/workspace/SYSTEM/clean-artifactory#tmp/durable-4d51de81
[Pipeline] }
[Pipeline] // stage
This was a bug in the Durable Task plugin and has been fixed by the latest release (1.33).
See JENKINS-59903
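If you want to confirm which version of the plugin your controller is actually running, a quick check from the Script Console (Manage Jenkins > Script Console) could look like the sketch below; it assumes the plugin's short name is durable-task, which is its usual plugin ID.
import jenkins.model.Jenkins

// Sketch: print the installed Durable Task plugin version (null if missing).
def plugin = Jenkins.instance.pluginManager.getPlugin('durable-task')
println(plugin ? "durable-task ${plugin.version}" : 'durable-task is not installed')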
I had the same problem, and after a long wait this error message was logged in the console:
Cannot contact : java.io.FileNotFoundException: File '/var/lib/jenkins/workspace/myproject#2#tmp/durable-1a2d497f/output.txt' does not exist
The problem is the Durable Task plugin. In my case I downgraded it from the latest release (1.31) to 1.30 and that solved the problem.
I'm using Docker Pipeline version 1.21
Related
I have installed Jenkins and Docker inside a VM. I am using a Jenkins pipeline project, and my declarative pipeline looks like this.
pipeline {
    agent {
        docker { image 'node:7-alpine' }
    }
    stages {
        stage('Test') {
            steps {
                echo 'Hello Nodejs'
                sh 'node --version'
            }
        }
    }
}
It is a very basic pipeline, following this guide: https://jenkins.io/doc/book/pipeline/docker/
When I try to build my Jenkins job, it prints Hello Nodejs but gets stuck at the next instruction, i.e. the execution of the shell command. After 5 minutes, the job fails with this error:
process apparently never started in /var/lib/jenkins/workspace/MyProject#tmp/durable-c118923c
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
ERROR: script returned exit code -2
I don't understand why it is not executing the sh command.
If I change the agent to agent any, the sh command executes fine.
I am not sure that it will help, but I remember that the node image runs under the root account by default, while Jenkins uses its own user ID when launching a container. So it's probably a permissions issue. Try adding the -u 0 argument:
agent {
    docker {
        image 'node:7-alpine'
        args '-u 0'
    }
}
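The same idea in scripted syntax would be something along these lines; a minimal sketch, assuming the Docker Pipeline plugin's docker.image(...).inside(args) form, which accepts extra docker run arguments:
node {
    // Run the container as root (UID 0), mirroring 'args' in the declarative block above.
    docker.image('node:7-alpine').inside('-u 0') {
        sh 'node --version'
    }
}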
My pipeline had been working fine until today.
Jenkins dynamically spins up a slave container (Docker cloud) that all my steps run in. The error is below; I'm just wondering why Jenkins creates a tmp dir in the workspace dir.
[Pipeline] sh
[xxx_root_proj] Running shell script
+ cd ./xxx_root_proj
/home/jenkins/workspace/xxx_root_proj#tmp/durable-b532c37c/script.sh: 3: cd: can't cd to ./xxx_root_proj
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE
Just wondering if anyone has come across this before.
I think "/home/jenkins/workspace/xxx_root_proj#tmp" is the problem, but I'm not sure how Jenkins uses it.
Thanks in advance
The #tmp folder is created by Jenkins in the workspace for shared library components, durable task scripts, etc. It is basically a temporary working directory for the pipeline.
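As for the cd failure itself, it is usually more robust to let the pipeline change directories instead of doing cd inside the shell script. A minimal sketch using the dir step, assuming xxx_root_proj actually exists as a sub-directory of the workspace:
node {
    // dir() switches the working directory (relative to the workspace)
    // for every step inside the block.
    dir('xxx_root_proj') {
        sh 'pwd && ls -l'
    }
}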
Here's a simple Jenkins pipeline job that highlights the different working directories seen by sh vs script.
pipeline {
    agent any
    stages {
        stage('Stage1') {
            steps {
                sh """
                    pwd
                """
                script {
                    echo "cwd--pwd: " + "pwd".execute().text
                }
            }
        }
    }
}
Here's how the Jenkins instance was launched
/Users/MyJenkinsUser/dirJenkinsLaunched$ java -jar /Applications/Jenkins/jenkins.war --httpPort=8080
Here's the console output of the job...
Started by user MyJenkinsUser
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] node
Running on Jenkins in /Users/MyJenkinsUser/.jenkins/jobs/TestPipeline/workspace
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Stage1)
[Pipeline] sh
[workspace] Running shell script
+ pwd
/Users/MyJenkinsUser/.jenkins/jobs/TestPipeline/workspace
[Pipeline] script
[Pipeline] {
[Pipeline] echo
cwd--pwd: /Users/MyJenkinsUser/dirJenkinsLaunched
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
I find it curious that they have different working directories: the sh step uses the workspace as its working directory, while the Groovy code in the script step uses the directory the Jenkins process was launched from.
Question: how can I make my Jenkins scripted pipeline steps (script) use the workspace as the working directory by default?
I guess it makes sense once you realize that Groovy runs on the JVM: we launch the Jenkins war file from Java, and that launch fixes the working directory. I do wonder about the origins of this design. It tripped me up with a bunch of file-not-found errors as I ported some sh commands into Groovy, which I did to avoid all the nested escaping craziness one can fall into in the shell, especially when paths contain spaces.
You should not use execute() in Jenkins pipelines; use the pipeline DSL's steps instead of arbitrary Groovy code.
As you noticed, such "native" code is executed on the Jenkins master, without any relation to the current job.
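If the goal is just to get the current directory, or to run a command and capture its output from inside script, the standard pipeline steps already cover it. A sketch:
pipeline {
    agent any
    stages {
        stage('Stage1') {
            steps {
                script {
                    // pwd() is a pipeline step and returns the job's workspace path,
                    // unlike "pwd".execute(), which runs in the master JVM's directory.
                    def ws = pwd()
                    echo "workspace via pwd(): ${ws}"
                    echo "workspace via env: ${env.WORKSPACE}"
                    // Capture a shell command's output instead of using Groovy's execute().
                    def out = sh(script: 'pwd', returnStdout: true).trim()
                    echo "workspace via sh: ${out}"
                }
            }
        }
    }
}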
Unfortunately, changing the JVM's working directory may not be possible. I'll have to redesign the script code to explicitly use the workspace variable instead of relying on the current working directory of the Java process.
Changing the current working directory in Java?
When running jobs from a Jenkinsfile with Pipeline syntax and a Docker agent, the pipeline fails with "docker: command not found". I understand this to mean that either (1) Docker is not installed, or (2) Jenkins is not pointing to the correct Docker installation path. My situation is very similar to this issue: Docker command not found in local Jenkins multi branch pipeline. Jenkins is installed on macOS and running off of localhost:8080. Docker is also installed (v18.06.0-ce-mac70).
That user's solution involved switching from declarative pipeline syntax to node scripted syntax. However, I want to resolve the issue while retaining the declarative syntax.
Jenkinsfile
#!groovy
pipeline {
    agent {
        docker {
            image 'node:7-alpine'
        }
    }
    stages {
        stage('Unit') {
            steps {
                sh 'node -v'
                sh 'npm -v'
            }
        }
    }
}
Error message
docker inspect -f . node:7-alpine
docker: command not found
docker pull node:7-alpine
docker: command not found
In Jenkins Global Tool Configuration, for Docker installations I tried both (1) install automatically (from docker.com) and (2) a local installation with installation root /usr/local/.
All of the relevant plugins appear to be installed as well.
I solved this problem here: https://stackoverflow.com/a/58688536/8160903
(Add Docker's path to Homebrew Jenkins plist /usr/local/Cellar/jenkins-lts/2.176.3/homebrew.mxcl.jenkins-lts.plist)
I would check the user who is running the Jenkins process and make sure they are part of the docker group.
You can try adding the full path of the docker executable on your machine to Jenkins under Manage Jenkins > Global Tool Configuration.
I've sometimes seen that the user who started Jenkins doesn't have the executable's location on $PATH.
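A quick way to see which PATH the job actually gets, and whether docker resolves on it, is a throwaway pipeline like this sketch:
pipeline {
    agent any
    stages {
        stage('Check docker') {
            steps {
                // Print the PATH the job sees and check whether docker is on it.
                sh 'echo $PATH'
                sh 'which docker || echo "docker is not on PATH for this user"'
            }
        }
    }
}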
I've configured my Jenkins master to use Docker and I can connect to Docker. I've got a simple pipeline to test this:
node('docker-build-slave') {
    stage('On slave') {
        sh 'ls -l'
        sh 'uname -a'
    }
}
When I trigger a build and look at what's being written to the console, I get:
Started by user chris adkin
[Pipeline] node
Still waiting to schedule task
All nodes of label ‘docker-build-slave’ are offline
and it just hangs. I'm wondering if there is something really obvious I have missed; do I need to create a node for my Docker build slaves?
If I go onto the machine hosting Jenkins, I can see that the build slave container has been started.
The docker-build-slave that you supply is a label that filters the available Jenkins agents (master/slaves). If you do not have this label assigned either to the master or to any of the (available) slaves, this job cannot be built. Read more about labels.
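To see which labels your controller and agents currently expose, a Script Console sketch like the following can help (it assumes you have admin access; agent here is just a loop variable name):
import jenkins.model.Jenkins

// List every configured agent with its label string.
Jenkins.instance.nodes.each { agent ->
    println "${agent.nodeName}: '${agent.labelString}'"
}
// The built-in (master) node is not in 'nodes'; print its labels separately.
println "built-in: '${Jenkins.instance.labelString}'"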
To let a Jenkins pipeline use Docker, use the docker global variable, e.g. as described in this example:
node {
    checkout scm
    /*
     * In order to communicate with the MySQL server, this Pipeline explicitly
     * maps the port (`3306`) to a known port on the host machine.
     */
    docker.image('mysql:5').withRun('-e "MYSQL_ROOT_PASSWORD=my-secret-pw" -p 3306:3306') { c ->
        /* Run some tests which require MySQL */
        sh 'make check'
    }
}
So after some digging around, and being a bit shamefaced to answer my own question, I came across this Jenkins issue:
https://issues.jenkins-ci.org/browse/JENKINS-44859
I had built my image using JDK 7. The issue states, quoting the comment added by Vinson Lee:
Jenkins 2.54+ requires Java 8.
I modified the Dockerfile for my image to install OpenJDK 8 and everything is now working.
If you use node {} with a specific label and don't have any nodes with that label set up, the build will be stuck forever, as mentioned in StephenKing's answer. You also need to make sure you have at least 2 executors set up when using a single node (like 'master'); otherwise pipeline builds will usually get stuck, as they consist of a root build and several sub-builds for the steps.
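If you are running everything on a single built-in node, the executor count can be raised under Manage Jenkins > Nodes, or with a one-off Script Console snippet like this sketch:
import jenkins.model.Jenkins

// Give the built-in node two executors, as suggested in the answer above.
Jenkins.instance.numExecutors = 2
Jenkins.instance.save()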
The Java 8 fix worked; this is the console output from a successful run of the build job:
Started by user chris adkin
[Pipeline] node
Still waiting to schedule task
All nodes of label ‘docker-build-slave’ are offline
Running on docker-13b5a18eb067 in /home/jenkins/workspace/Pipeline With Docker Slave
[Pipeline] {
[Pipeline] stage
[Pipeline] { (On slave)
[Pipeline] sh
[Pipeline With Docker Slave] Running shell script
+ ls -l
total 0
[Pipeline] sh
[Pipeline With Docker Slave] Running shell script
+ uname -a
Linux localhost 4.9.49-moby #1 SMP Wed Sep 27 00:36:29 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS