I am new to Jenkins. Sorry if my question is basic.
I am trying to use Jenkins inside a container using the following command:
docker run --name jenkins --privileged -u root -d -p 8080:8080 -p 50000:50000 -v /var/run/docker.sock:/var/run/docker.sock:Z -v $(which docker):/usr/bin/docker -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts
After initializing Jenkins through its web page on localhost:8080, I installed the Docker Pipeline and Docker plugins and restarted the container. Then I created a multibranch pipeline pointed at my repository. The Jenkinsfile in my repo:
pipeline {
    agent {
        docker {
            image 'python'
        }
    }
    stages {
        stage('build') {
            steps {
                sh 'python --version'
            }
        }
    }
}
However, at first it didn't work due to a permission problem with Docker: the docker socket inside the container was owned by uid/gid nobody. I then changed the socket's permissions on my host machine to 666, which got rid of the permission error. But it still doesn't run the python --version command inside the container. I have also tried adding --entrypoint= as args in the Jenkinsfile, as well as --entrypoint=/bin/bash with -it, and -u root, but none of them worked. Here are the Jenkins logs:
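For reference, this is what I actually ran on the host to get past the permission error (I know mode 666 opens the socket to every local user, so I treat it as a quick workaround; the group-based alternative in the comment uses GID 999, which is just the docker GID on my machine):

```shell
# Check who owns the Docker socket and its current mode.
stat -c '%U:%G %a' /var/run/docker.sock

# The blunt fix I applied: make the socket world-readable/writable.
sudo chmod 666 /var/run/docker.sock

# A less open alternative would be to give the Jenkins container the
# socket owner's group instead of loosening the socket mode, e.g.:
# docker run ... --group-add 999 -v /var/run/docker.sock:/var/run/docker.sock ... jenkins/jenkins:lts
```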
Started by user sobhan
07:20:29 Connecting to https://api.github.com using sobhansaf/******
Obtained Jenkinsfile from 2bd22bad5bfca8114cd9d98cc4f56b3915dd012e
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in /var/jenkins_home/workspace/mypipe_main
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Checkout SCM)
[Pipeline] checkout
The recommended git tool is: NONE
using credential cd26ef85-7736-498b-9629-813934e651fb
> git rev-parse --resolve-git-dir /var/jenkins_home/workspace/mypipe_main/.git # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/sobhansaf/circlecitest.git # timeout=10
Fetching without tags
Fetching upstream changes from https://github.com/sobhansaf/circlecitest.git
> git --version # timeout=10
> git --version # 'git version 2.30.2'
using GIT_ASKPASS to set credentials
> git fetch --no-tags --force --progress -- https://github.com/sobhansaf/circlecitest.git +refs/heads/main:refs/remotes/origin/main # timeout=10
Checking out Revision 2bd22bad5bfca8114cd9d98cc4f56b3915dd012e (main)
> git config core.sparsecheckout # timeout=10
> git checkout -f 2bd22bad5bfca8114cd9d98cc4f56b3915dd012e # timeout=10
Commit message: "Update Jenkinsfile"
> git rev-list --no-walk eb8007c2dd9171cf306a04b75cdc48b9cb0da32e # timeout=10
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] isUnix
[Pipeline] withEnv
[Pipeline] {
[Pipeline] sh
+ docker inspect -f . python
.
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] withDockerContainer
Jenkins seems to be running inside container ccd4d26ae72e22da420697318327033c77e8ff720034a8573a384d6ba442ca4b
but /var/jenkins_home/workspace/mypipe_main could not be found among []
but /var/jenkins_home/workspace/mypipe_main#tmp could not be found among []
$ docker run -t -d -u 0:0 -w /var/jenkins_home/workspace/mypipe_main -v /var/jenkins_home/workspace/mypipe_main:/var/jenkins_home/workspace/mypipe_main:rw,z -v /var/jenkins_home/workspace/mypipe_main#tmp:/var/jenkins_home/workspace/mypipe_main#tmp:rw,z -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** python cat
$ docker top 453918fd912510c869adad2cd3f9fb422108484c7af3ca02a91c42eea9b77b66 -eo pid,comm
[Pipeline] {
[Pipeline] stage
[Pipeline] { (build)
[Pipeline] sh
process apparently never started in /var/jenkins_home/workspace/mypipe_main#tmp/durable-ef6dc772
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
$ docker stop --time=1 453918fd912510c869adad2cd3f9fb422108484c7af3ca02a91c42eea9b77b66
$ docker rm -f 453918fd912510c869adad2cd3f9fb422108484c7af3ca02a91c42eea9b77b66
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code -2
GitHub has been notified of this commit’s build result
Finished: FAILURE
It is also worth mentioning that after Jenkins creates the container, when I inspect it, its command is always cat. Could you please help me figure out what is wrong?
Related
I'm working on a project that uses Espressif, and to build it on my machine with Docker I run the following:
docker run --rm -v $PWD:/project -w /project espressif/idf:v4.2.2 idf.py build
I would like to write a declarative pipeline that executes the equivalent of the command above. I implemented it based on other examples that worked; the resulting log is below.
I don't understand why passing the 'idf.py build' arguments in the 'steps' block this way is not working. Does anyone have any ideas?
Reading the log and doing some Google searches, I believe the Jenkins plugin can't handle the command because the image uses an entrypoint.
My pipeline:
pipeline {
    agent any
    environment {
        PROJ_NAME = 'test'
    }
    stages {
        stage('Checkout') {
            steps {
                git url: 'ssh://git@bitbucket.org/john/iot-project.git'
            }
        }
        stage('Build') {
            agent {
                docker {
                    image 'espressif/idf:v4.2.2'
                    args '--rm -v $PWD:/project -w /project'
                    reuseNode true
                }
            }
            steps {
                sh 'idf.py build'
            }
        }
    }
}
Error snippet:
[Pipeline] withDockerContainer
Jenkins does not seem to be running inside a container
$ docker run -t -d -u 1000:1000 --rm -v $PWD:/project -w /project -w /var/lib/jenkins/workspace/iot-project-TEST -v /var/lib/jenkins/workspace/iot-project-TEST:/var/lib/jenkins/workspace/iot-project-TEST:rw,z -v /var/lib/jenkins/workspace/iot-project-TEST#tmp:/var/lib/jenkins/workspace/iot-project-TEST#tmp:rw,z -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** espressif/idf:v4.2.2 cat
$ docker top 81920a1146eabe9bf5c08339a682d81ac23777de0421895e1184d2a8ef27fc8c -eo pid,comm
ERROR: The container started but didn't run the expected command. Please double check your ENTRYPOINT does execute the command passed as docker run argument, as required by official docker images (see https://github.com/docker-library/official-images#consistency for entrypoint consistency requirements).
Alternatively you can force image entrypoint to be disabled by adding option `--entrypoint=''`.
[Pipeline] {
[Pipeline] sh
+ idf.py build
/var/lib/jenkins/workspace/iot-project-TEST#tmp/durable-b8bf6ce0/script.sh: 1: /var/lib/jenkins/workspace/iot-project-TEST#tmp/durable-b8bf6ce0/script.sh: idf.py: not found
[Pipeline] }
$ docker stop --time=1 81920a1146eabe9bf5c08339a682d81ac23777de0421895e1184d2a8ef27fc8c
$ docker rm -f 81920a1146eabe9bf5c08339a682d81ac23777de0421895e1184d2a8ef27fc8c
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 127
Finished: FAILURE
UPDATE1:
The project builds with the official Espressif image when I run the command directly, for example:
pipeline {
    agent any
    environment {
        PROJ_NAME = 'test'
    }
    stages {
        stage('Checkout') {
            steps {
                git url: 'ssh://git@bitbucket.org/john/iot-project.git'
            }
        }
        stage('Build') {
            steps {
                sh 'docker run --rm -v $WORKSPACE/ESPComm:/project -w /project espressif/idf:v4.2.2 idf.py build'
            }
        }
    }
}
UPDATE2:
Without the --entrypoint='' argument an error is always thrown, so I keep that argument. Below is the log of the ls and pwd commands after running docker. Note: cat and top are Jenkins' own tricks to keep the container alive so that the commands inside the steps block can be executed.
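The cat/top behaviour mentioned in the note can be sketched like this (an approximation of what the Docker Pipeline plugin does, not its actual code):

```shell
# The plugin starts the agent container with `cat` as the main process on a
# TTY; cat blocks on stdin forever, so the container simply stays alive.
CID=$(docker run -t -d --entrypoint='' espressif/idf:v4.2.2 cat)

# `docker top` is only a liveness check on that cat process.
docker top "$CID" -eo pid,comm

# Each sh step is then exec'd into the already-running container, which is
# why the image's entrypoint logic never wraps the step commands.
docker exec "$CID" sh -xe -c 'pwd && ls'

docker stop --time=1 "$CID"
```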
pipeline {
    agent any
    environment {
        PROJ_NAME = 'test'
    }
    stages {
        stage('Checkout') {
            steps {
                git url: 'ssh://git@bitbucket.org/john/iot-project.git'
            }
        }
        stage('Build') {
            agent {
                docker {
                    image 'espressif/idf:v4.2.2'
                    args '''--rm -v $PWD:/project -w /project --entrypoint='' '''
                    reuseNode true
                }
            }
            steps {
                /*sh '''
                source /opt/esp/idf/export.sh
                idf.py build
                '''*/
                sh 'ls'
                sh 'pwd'
            }
        }
    }
}
[Pipeline] withDockerContainer
Jenkins does not seem to be running inside a container
$ docker run -t -d -u 1000:1000 --rm -v $PWD:/project -w /project --entrypoint= -w /var/lib/jenkins/workspace/iot-project-TEST -v /var/lib/jenkins/workspace/iot-project-TEST:/var/lib/jenkins/workspace/iot-project-TEST:rw,z -v /var/lib/jenkins/workspace/iot-project-TEST#tmp:/var/lib/jenkins/workspace/iot-project-TEST#tmp:rw,z -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** espressif/idf:v4.2.2 cat
$ docker top 69d5a4450c2463a6d8153582248796a15fa94ed04ef3d45c76c9a2358b8740cd -eo pid,comm
[Pipeline] {
[Pipeline] sh
+ ls
ESPComm
ESPComm#tmp
Grafana
README.md
xctu_template.xml
[Pipeline] sh
+ pwd
/var/lib/jenkins/workspace/iot-project-TEST
[Pipeline] }
$ docker stop --time=1 69d5a4450c2463a6d8153582248796a15fa94ed04ef3d45c76c9a2358b8740cd
$ docker rm -f 69d5a4450c2463a6d8153582248796a15fa94ed04ef3d45c76c9a2358b8740cd
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
UPDATE3:
Now running without the --entrypoint argument. Note the error in the log:
ERROR: The container started but didn't run the expected command. Please double check your ENTRYPOINT does execute the command passed as docker run argument, as required by official docker images (see https://github.com/docker-library/official-images#consistency for entrypoint consistency requirements).
Alternatively you can force image entrypoint to be disabled by adding option `--entrypoint=''`.
pipeline {
    agent any
    environment {
        PROJ_NAME = 'test'
    }
    stages {
        stage('Checkout') {
            steps {
                git url: 'ssh://git@bitbucket.org/john/iot-project.git'
            }
        }
        stage('Build') {
            agent {
                docker {
                    image 'espressif/idf:v4.2.2'
                    args '''--rm -v $PWD:/project -w /project '''
                    reuseNode true
                }
            }
            steps {
                sh '''
                    pwd
                    ls
                    #source /opt/esp/idf/export.sh
                    . $IDF_PATH/export.sh
                    idf.py build
                '''
            }
        }
    }
}
[Pipeline] withDockerContainer
Jenkins does not seem to be running inside a container
$ docker run -t -d -u 1000:1000 --rm -v $PWD:/project -w /project -w /var/lib/jenkins/workspace/iot-project-TEST -v /var/lib/jenkins/workspace/iot-project-TEST:/var/lib/jenkins/workspace/iot-project-TEST:rw,z -v /var/lib/jenkins/workspace/iot-project-TEST#tmp:/var/lib/jenkins/workspace/iot-project-TEST#tmp:rw,z -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** espressif/idf:v4.2.2 cat
$ docker top 217167975bdf63d861215e12f5e3b2ef35d21681fe77bb7608a6c1cc0d03c237 -eo pid,comm
ERROR: The container started but didn't run the expected command. Please double check your ENTRYPOINT does execute the command passed as docker run argument, as required by official docker images (see https://github.com/docker-library/official-images#consistency for entrypoint consistency requirements).
Alternatively you can force image entrypoint to be disabled by adding option `--entrypoint=''`.
[Pipeline] {
[Pipeline] sh
+ pwd
/var/lib/jenkins/workspace/iot-project-TEST
+ ls
ESPComm
ESPComm#tmp
Grafana
README.md
xctu_template.xml
+ . /opt/esp/idf/export.sh
+ idf_export_main
+ [ -n ]
+ [ -z /opt/esp/idf ]
+ [ ! -d /opt/esp/idf ]
+ [ ! -f /opt/esp/idf/tools/idf.py ]
+ [ ! -f /opt/esp/idf/tools/idf_tools.py ]
+ export IDF_PATH=/opt/esp/idf
+ old_path=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ echo Detecting the Python interpreter
Detecting the Python interpreter
+ . /opt/esp/idf/tools/detect_python.sh
+ ESP_PYTHON=python
+ echo Checking "python" ...
Checking "python" ...
+ python -c import sys; print(sys.version_info.major)
+ [ 3 = 3 ]
+ ESP_PYTHON=python
+ break
+ python --version
Python 3.6.9
+ echo "python" has been detected
"python" has been detected
+ echo Adding ESP-IDF tools to PATH...
Adding ESP-IDF tools to PATH...
+ export IDF_TOOLS_EXPORT_CMD=/opt/esp/idf/export.sh
+ export IDF_TOOLS_INSTALL_CMD=/opt/esp/idf/install.sh
+ python /opt/esp/idf/tools/idf_tools.py export
+ idf_exports=export OPENOCD_SCRIPTS="/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/share/openocd/scripts";export IDF_PYTHON_ENV_PATH="/opt/esp/python_env/idf4.2_py3.6_env";export PATH="/opt/esp/tools/xtensa-esp32-elf/esp-2020r3-8.4.0/xtensa-esp32-elf/bin:/opt/esp/tools/xtensa-esp32s2-elf/esp-2020r3-8.4.0/xtensa-esp32s2-elf/bin:/opt/esp/tools/esp32ulp-elf/2.28.51-esp-20191205/esp32ulp-elf-binutils/bin:/opt/esp/tools/esp32s2ulp-elf/2.28.51-esp-20191205/esp32s2ulp-elf-binutils/bin:/opt/esp/tools/cmake/3.16.4/bin:/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/bin:/opt/esp/python_env/idf4.2_py3.6_env/bin:/opt/esp/idf/tools:$PATH"
+ eval export OPENOCD_SCRIPTS="/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/share/openocd/scripts";export IDF_PYTHON_ENV_PATH="/opt/esp/python_env/idf4.2_py3.6_env";export PATH="/opt/esp/tools/xtensa-esp32-elf/esp-2020r3-8.4.0/xtensa-esp32-elf/bin:/opt/esp/tools/xtensa-esp32s2-elf/esp-2020r3-8.4.0/xtensa-esp32s2-elf/bin:/opt/esp/tools/esp32ulp-elf/2.28.51-esp-20191205/esp32ulp-elf-binutils/bin:/opt/esp/tools/esp32s2ulp-elf/2.28.51-esp-20191205/esp32s2ulp-elf-binutils/bin:/opt/esp/tools/cmake/3.16.4/bin:/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/bin:/opt/esp/python_env/idf4.2_py3.6_env/bin:/opt/esp/idf/tools:$PATH"
+ export OPENOCD_SCRIPTS=/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/share/openocd/scripts
+ export IDF_PYTHON_ENV_PATH=/opt/esp/python_env/idf4.2_py3.6_env
+ export PATH=/opt/esp/tools/xtensa-esp32-elf/esp-2020r3-8.4.0/xtensa-esp32-elf/bin:/opt/esp/tools/xtensa-esp32s2-elf/esp-2020r3-8.4.0/xtensa-esp32s2-elf/bin:/opt/esp/tools/esp32ulp-elf/2.28.51-esp-20191205/esp32ulp-elf-binutils/bin:/opt/esp/tools/esp32s2ulp-elf/2.28.51-esp-20191205/esp32s2ulp-elf-binutils/bin:/opt/esp/tools/cmake/3.16.4/bin:/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/bin:/opt/esp/python_env/idf4.2_py3.6_env/bin:/opt/esp/idf/tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ which python
+ echo Using Python interpreter in /opt/esp/python_env/idf4.2_py3.6_env/bin/python
Using Python interpreter in /opt/esp/python_env/idf4.2_py3.6_env/bin/python
+ echo Checking if Python packages are up to date...
Checking if Python packages are up to date...
+ python /opt/esp/idf/tools/check_python_dependencies.py
Python requirements from /opt/esp/idf/requirements.txt are satisfied.
+ IDF_ADD_PATHS_EXTRAS=/opt/esp/idf/components/esptool_py/esptool
+ IDF_ADD_PATHS_EXTRAS=/opt/esp/idf/components/esptool_py/esptool:/opt/esp/idf/components/espcoredump
+ IDF_ADD_PATHS_EXTRAS=/opt/esp/idf/components/esptool_py/esptool:/opt/esp/idf/components/espcoredump:/opt/esp/idf/components/partition_table
+ IDF_ADD_PATHS_EXTRAS=/opt/esp/idf/components/esptool_py/esptool:/opt/esp/idf/components/espcoredump:/opt/esp/idf/components/partition_table:/opt/esp/idf/components/app_update
+ export PATH=/opt/esp/idf/components/esptool_py/esptool:/opt/esp/idf/components/espcoredump:/opt/esp/idf/components/partition_table:/opt/esp/idf/components/app_update:/opt/esp/tools/xtensa-esp32-elf/esp-2020r3-8.4.0/xtensa-esp32-elf/bin:/opt/esp/tools/xtensa-esp32s2-elf/esp-2020r3-8.4.0/xtensa-esp32s2-elf/bin:/opt/esp/tools/esp32ulp-elf/2.28.51-esp-20191205/esp32ulp-elf-binutils/bin:/opt/esp/tools/esp32s2ulp-elf/2.28.51-esp-20191205/esp32s2ulp-elf-binutils/bin:/opt/esp/tools/cmake/3.16.4/bin:/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/bin:/opt/esp/python_env/idf4.2_py3.6_env/bin:/opt/esp/idf/tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ [ -n ]
+ echo Updated PATH variable:
Updated PATH variable:
+ echo /opt/esp/idf/components/esptool_py/esptool:/opt/esp/idf/components/espcoredump:/opt/esp/idf/components/partition_table:/opt/esp/idf/components/app_update:/opt/esp/tools/xtensa-esp32-elf/esp-2020r3-8.4.0/xtensa-esp32-elf/bin:/opt/esp/tools/xtensa-esp32s2-elf/esp-2020r3-8.4.0/xtensa-esp32s2-elf/bin:/opt/esp/tools/esp32ulp-elf/2.28.51-esp-20191205/esp32ulp-elf-binutils/bin:/opt/esp/tools/esp32s2ulp-elf/2.28.51-esp-20191205/esp32s2ulp-elf-binutils/bin:/opt/esp/tools/cmake/3.16.4/bin:/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/bin:/opt/esp/python_env/idf4.2_py3.6_env/bin:/opt/esp/idf/tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
/opt/esp/idf/components/esptool_py/esptool:/opt/esp/idf/components/espcoredump:/opt/esp/idf/components/partition_table:/opt/esp/idf/components/app_update:/opt/esp/tools/xtensa-esp32-elf/esp-2020r3-8.4.0/xtensa-esp32-elf/bin:/opt/esp/tools/xtensa-esp32s2-elf/esp-2020r3-8.4.0/xtensa-esp32s2-elf/bin:/opt/esp/tools/esp32ulp-elf/2.28.51-esp-20191205/esp32ulp-elf-binutils/bin:/opt/esp/tools/esp32s2ulp-elf/2.28.51-esp-20191205/esp32s2ulp-elf-binutils/bin:/opt/esp/tools/cmake/3.16.4/bin:/opt/esp/tools/openocd-esp32/v0.10.0-esp32-20200709/openocd-esp32/bin:/opt/esp/python_env/idf4.2_py3.6_env/bin:/opt/esp/idf/tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ unset old_path
+ unset paths
+ unset path_prefix
+ unset path_entry
+ unset IDF_ADD_PATHS_EXTRAS
+ unset idf_exports
+ unset ESP_PYTHON
+ echo Done! You can now compile ESP-IDF projects.
Done! You can now compile ESP-IDF projects.
+ echo Go to the project directory and run:
Go to the project directory and run:
+ echo
+ echo idf.py build
idf.py build
+ echo
+ unset realpath_int
+ unset idf_export_main
+ idf.py build
Executing action: all (aliases: build)
CMakeLists.txt not found in project directory /var/lib/jenkins/workspace/iot-project-TEST
Your environment is not configured to handle unicode filenames outside of ASCII range. Environment variable LC_ALL is temporary set to C.UTF-8 for unicode support.
[Pipeline] }
$ docker stop --time=1 217167975bdf63d861215e12f5e3b2ef35d21681fe77bb7608a6c1cc0d03c237
$ docker rm -f 217167975bdf63d861215e12f5e3b2ef35d21681fe77bb7608a6c1cc0d03c237
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE
Example with an image I created: it uses Ubuntu as a base but has no entrypoint. I've already managed to run an Eclipse headless build successfully as follows:
stage('Build') {
    agent {
        docker {
            image 'tool/stm32-cubeide-image:1.0'
            reuseNode true
        }
    }
    steps {
        sh '/opt/stm32cubeide/headless-build.sh -importAll $WORKSPACE -data $WORKSPACE -cleanBuild $DIR/$OPT_BUILD'
    }
}
You can do something like the below. Before executing the build command, try sourcing /opt/esp/idf/export.sh, which sets up the environment so you can execute the build command.
sh '''
    source /opt/esp/idf/export.sh
    idf.py build
'''
Here is your full pipeline with the necessary changes.
pipeline {
    agent any
    environment {
        PROJ_NAME = 'test'
    }
    stages {
        stage('Checkout') {
            steps {
                git url: 'ssh://git@bitbucket.org/john/iot-project.git'
            }
        }
        stage('Build') {
            agent {
                docker {
                    image 'espressif/idf:v4.2.2'
                    args '--rm -v $PWD:/project -w /project'
                    reuseNode true
                }
            }
            steps {
                sh '''
                    #source /opt/esp/idf/export.sh
                    . $IDF_PATH/export.sh
                    idf.py build
                '''
            }
        }
    }
}
Update
Following is the content in the entrypoint.
#!/usr/bin/env bash
set -e
. $IDF_PATH/export.sh
exec "$@"
So executing the build in either of the following ways works for me.
sh '''
    . $IDF_PATH/export.sh
    idf.py build
'''
or
sh '''
    sh /opt/esp/entrypoint.sh idf.py build
'''
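The key line in that entrypoint is the final exec: the script prepares the environment and then replaces itself with whatever command was passed. A tiny standalone imitation (the file name and the MY_TOOL_READY variable are made up for illustration):

```shell
# mini-entrypoint.sh imitates /opt/esp/entrypoint.sh: set up, then exec.
cat > /tmp/mini-entrypoint.sh <<'EOF'
#!/bin/sh
set -e
MY_TOOL_READY=yes   # stand-in for the work ". $IDF_PATH/export.sh" does
export MY_TOOL_READY
exec "$@"           # replace this shell with the requested command
EOF
chmod +x /tmp/mini-entrypoint.sh

# The passed command runs with the environment already prepared:
/tmp/mini-entrypoint.sh sh -c 'echo "ready=$MY_TOOL_READY"'   # prints: ready=yes
```

Jenkins' cat trick bypasses this wrapper entirely, which is why sourcing export.sh manually inside the sh step is needed.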
Computer: Mac M1
I created a Docker image from Jenkins 2.355 with Docker Pipeline 1.28.
When I run a simple Docker agent like this:
pipeline {
    agent { docker { image 'node:16.13.1-alpine' } }
    stages {
        stage('build') {
            steps {
                sh 'node --version'
            }
        }
    }
}
The agent image is pulled correctly, but the sh command is never executed. The job freezes, and after many seconds it is stopped.
This is the console log:
Started by user Miguel Salinas Gancedo
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in /var/jenkins_home/workspace/docker-demo
[Pipeline] {
[Pipeline] isUnix
[Pipeline] withEnv
[Pipeline] {
[Pipeline] sh
+ docker inspect -f . node:16.13.1-alpine
.
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] withDockerContainer
Jenkins does not seem to be running inside a container
$ docker run -t -d -u 0:0 -w /var/jenkins_home/workspace/docker-demo -v /var/jenkins_home/workspace/docker-demo:/var/jenkins_home/workspace/docker-demo:rw,z -v /var/jenkins_home/workspace/docker-demo#tmp:/var/jenkins_home/workspace/docker-demo#tmp:rw,z -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** node:16.13.1-alpine cat
$ docker top 71f7d4d760e80490e325446e445050b46558b45e0ca4c3a99ef9ac8b65e2666d -eo pid,comm
[Pipeline] {
[Pipeline] stage
[Pipeline] { (build)
[Pipeline] sh
process apparently never started in /var/jenkins_home/workspace/docker-demo#tmp/durable-f28a3e4d
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
$ docker stop --time=1 71f7d4d760e80490e325446e445050b46558b45e0ca4c3a99ef9ac8b65e2666d
$ docker rm -f 71f7d4d760e80490e325446e445050b46558b45e0ca4c3a99ef9ac8b65e2666d
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code -2
Finished: FAILURE
What is wrong?
Following the instructions at https://www.jenkins.io/doc/book/installing/docker/ totally fixed this problem for me. Just remove everything related to your existing Jenkins image, volumes, etc., and start again from scratch with those instructions.
I guess the newer updates have changed the way DinD works with Jenkins.
I hope it works out for you.
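For reference, the setup on that page boils down to running Docker-in-Docker as a sidecar and pointing Jenkins at it over TLS, roughly like this (condensed from memory; follow the linked guide for the exact, current commands and the custom Jenkins image it has you build):

```shell
docker network create jenkins

# 1. docker:dind provides the Docker daemon, reachable as "docker" on the network.
docker run --name jenkins-docker --detach --privileged \
  --network jenkins --network-alias docker \
  --env DOCKER_TLS_CERTDIR=/certs \
  --volume jenkins-docker-certs:/certs/client \
  --volume jenkins-data:/var/jenkins_home \
  docker:dind

# 2. Jenkins (a custom image from the guide with the docker CLI installed)
#    talks to that daemon instead of mounting the host's docker.sock.
docker run --name jenkins --detach \
  --network jenkins \
  --env DOCKER_HOST=tcp://docker:2376 \
  --env DOCKER_CERT_PATH=/certs/client \
  --env DOCKER_TLS_VERIFY=1 \
  --publish 8080:8080 --publish 50000:50000 \
  --volume jenkins-data:/var/jenkins_home \
  --volume jenkins-docker-certs:/certs/client:ro \
  myjenkins-blueocean:latest
```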
I have been struggling to get a Jenkins declarative pipeline to stash the results of steps that run under a Dockerfile agent. After trying a variety of Dockerfile/stash configurations, it continues to fail. Hopefully someone can identify my error.
Below is a stripped-down version of the Jenkinsfile, which fails in the same way as the original.
Jenkinsfile:
pipeline {
    agent {
        label 'Docker-enabled'
    }
    stages {
        stage('Build') {
            agent {
                dockerfile {
                    filename 'cicd/docker/light.Dockerfile'
                    label 'Docker-enabled'
                    args '--user root'
                }
            }
            steps {
                script {
                    sh """
                        echo "hello" > /hi.txt
                        chmod 666 /hi.txt
                        chown jenkins:jenkins /hi.txt
                    """
                    dir("/") {
                        stash name: "TARGET", includes: "hi.txt"
                    }
                }
            }
        }
        stage('Test') {
            steps {
                script {
                    echo "test"
                }
            }
        }
    }
}
And the Dockerfile it references, based on the dotnet/sdk:5.0-alpine base image:
FROM mcr.microsoft.com/dotnet/sdk:5.0-alpine
RUN adduser -D -g GECOS -u 1341 jenkins jenkins
Output snippet from a run, with the failure message at the bottom: ERROR: No files included in stash ‘TARGET’
> /usr/local/bin/git rev-parse --resolve-git-dir /app/jenkins/workspace/DevOps-Pipeline-Demos/CDMMS-test/.git # timeout=10
> /usr/local/bin/git config remote.origin.url https://github.com/CenturyLink/CDMMS-dotnet-core # timeout=10
Fetching upstream changes from https://github.com/CenturyLink/CDMMS-dotnet-core
> /usr/local/bin/git --version # timeout=10
> git --version # 'git version 2.9.5'
using GIT_ASKPASS to set credentials GitHub Creds superseding SCMAUTO
> /usr/local/bin/git fetch --tags --progress -- https://github.com/CenturyLink/CDMMS-dotnet-core +refs/heads/*:refs/remotes/origin/* # timeout=10
> /usr/local/bin/git rev-parse refs/remotes/origin/jenkins-integration3^{commit} # timeout=10
> /usr/local/bin/git config core.sparsecheckout # timeout=10
> /usr/local/bin/git checkout -f 0a76f8dae13c65b68110443a94491f51c57998ae # timeout=10
+ docker build -t 5eed65047d1b89e50cde2aa993e9999fedc2f078 -f cicd/docker/light.Dockerfile .
Sending build context to Docker daemon 22.94MB
Step 1/2 : FROM mcr.microsoft.com/dotnet/sdk:5.0-alpine
---> ea61adf98d30
Step 2/2 : RUN adduser -D -g GECOS -u 1341 jenkins jenkins
---> Using cache
---> ba336af25d41
Successfully built ba336af25d41
Successfully tagged 5eed65047d1b89e50cde2aa993e9999fedc2f078:latest
[Pipeline] isUnix
[Pipeline] sh
+ docker inspect -f . 5eed65047d1b89e50cde2aa993e9999fedc2f078
.
[Pipeline] withDockerContainer
jenkinsndodc14-prod does not seem to be running inside a container
$ docker run -t -d -u 1341:1341 --user root -w /app/jenkins/workspace/DevOps-Pipeline-Demos/CDMMS-test -v /app/jenkins/workspace/DevOps-Pipeline-Demos/CDMMS-test:/app/jenkins/workspace/DevOps-Pipeline-Demos/CDMMS-test:rw,z -v /app/jenkins/workspace/DevOps-Pipeline-Demos/CDMMS-test#tmp:/app/jenkins/workspace/DevOps-Pipeline-Demos/CDMMS-test#tmp:rw,z -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** 5eed65047d1b89e50cde2aa993e9999fedc2f078 cat
$ docker top 56a906f34d15b532bba7682788eda7778936f344d5a4f355f1641bf5dc160ef1 -eo pid,comm
[Pipeline] {
[Pipeline] script
[Pipeline] {
[Pipeline] sh
+ echo hello
+ chmod 666 /hi.txt
+ chown jenkins:jenkins /hi.txt
+ ls -lsa /hi.txt
4 -rw-rw-rw- 1 jenkins jenkins 6 Apr 21 16:26 /hi.txt
+ cat /hi.txt
hello
[Pipeline] dir
Running in /
[Pipeline] {
[Pipeline] stash
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // script
[Pipeline] }
$ docker stop --time=1 56a906f34d15b532bba7682788eda7778936f344d5a4f355f1641bf5dc160ef1
$ docker rm -f 56a906f34d15b532bba7682788eda7778936f344d5a4f355f1641bf5dc160ef1
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test)
Stage "Test" skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: No files included in stash ‘TARGET’
Finished: FAILURE
Jenkins 2.263.2
I know it is something simple about using a Dockerfile agent, but I cannot figure out what it is, so any help would be appreciated.
Thank you in advance.
I changed the output from the root directory to a subdirectory of root in the Docker container, making it so the output is written to /out/hi.txt. Next, I added a volume mount to the Dockerfile args parameter, args '--user root -v /tmp:/out'. Finally, I modified the stash command to load the file from the /tmp directory, which is shared with the /out directory inside the container.
Once these changes were made, the stash command could find the file in the /tmp directory and save it off for later steps.
...
agent {
    dockerfile {
        filename 'cicd/docker/light.Dockerfile'
        label 'Docker-enabled'
        args '--user root -v /tmp:/out'
    }
}
steps {
    script {
        sh """
            mkdir -p /out
            echo "hello" > /out/hi.txt
            chmod 666 /out/hi.txt
            chown jenkins:jenkins /out/hi.txt
        """
        dir("/tmp") {
            stash name: "TARGET", includes: "**"
        }
    }
}
}
...
I'm using an Alpine Docker image as a Jenkins pipeline agent, but I keep getting a permission denied error while running apk update or apk add. I see a similar error with Ubuntu images while running apt update or apt install.
Here's my Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Initialization') {
            agent any
            steps {
                checkout scm
            }
        }
        stage('Git Clone') {
            agent { docker { image 'alpine:3.12.0' } }
            steps {
                sh '''
                    apk update
                    apk add --no-cache git
                    apk add --no-cache openssh
                    git --version
                '''
            }
        }
    }
}
and here's the Jenkins output:
+ docker inspect -f . alpine:3.12.0
WARNING: Error loading config file: /root/.docker/config.json: stat /root/.docker/config.json: permission denied
.
[Pipeline] withDockerContainer
Jenkins does not seem to be running inside a container
$ docker run -t -d -u 1001:0 -w "/opt/bitnami/jenkins/jenkins_home/workspace/Deploy Glosfy Frontend" -v "/opt/bitnami/jenkins/jenkins_home/workspace/Deploy Glosfy Frontend:/opt/bitnami/jenkins/jenkins_home/workspace/Deploy Glosfy Frontend:rw,z" -v "/opt/bitnami/jenkins/jenkins_home/workspace/Deploy Glosfy Frontend#tmp:/opt/bitnami/jenkins/jenkins_home/workspace/Deploy Glosfy Frontend#tmp:rw,z" -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** alpine:3.12.0 cat
$ docker top 166c9ace17a4eb6aef0af0bbc04902ee4a358212be7f029550fb39a921e305aa -eo pid,comm
[Pipeline] {
[Pipeline] sh
+ apk update
ERROR: Unable to lock database: Permission denied
ERROR: Failed to open apk database: Permission denied
[Pipeline] }
$ docker stop --time=1 166c9ace17a4eb6aef0af0bbc04902ee4a358212be7f029550fb39a921e305aa
$ docker rm -f 166c9ace17a4eb6aef0af0bbc04902ee4a358212be7f029550fb39a921e305aa
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] End of Pipeline
ERROR: script returned exit code 99
Finished: FAILURE
Can someone help me figure out the issue?
Modify the docker block in your Jenkins pipeline like this:
docker {
    image 'alpine:3.12.0'
    args '-u root:root'
}
I believe the problem is that Jenkins is running the container with a non-root user, hence the Permission denied error.
Try changing your pipeline like so:
agent {
    docker {
        image 'alpine:3.12.0'
        args '-u root'
    }
}
See this answer.
When trying to run the following declarative pipeline:
pipeline {
    agent { docker 'alpine' }
    stages {
        stage('Test') {
            steps {
                sh 'printenv'
            }
        }
    }
}
I get the error:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Agent Setup)
[Pipeline] sh
[TmpTest] Running shell script
+ docker pull alpine
Using default tag: latest
latest: Pulling from library/alpine
Digest: sha256:1072e499f3f655a032e88542330cf75b02e7bdf673278f701d7ba61629ee3ebe
Status: Image is up to date for alpine:latest
[Pipeline] }
[Pipeline] // stage
[Pipeline] sh
[TmpTest] Running shell script
+ docker inspect -f . alpine
.
[Pipeline] withDockerContainer
Jenkins does not seem to be running inside a container
$ docker run -t -d -u 107:113 -w /var/lib/jenkins/workspace/TmpTest -v /var/lib/jenkins/workspace/TmpTest:/var/lib/jenkins/workspace/TmpTest:rw,z -v /var/lib/jenkins/workspace/TmpTest#tmp:/var/lib/jenkins/workspace/TmpTest#tmp:rw,z -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** --entrypoint cat alpine
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Provision Server)
[Pipeline] sh
[TmpTest] Running shell script
sh: /var/lib/jenkins/workspace/TmpTest#tmp/durable-1abfbc69/script.sh: not found
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
$ docker stop --time=1 db551b51404ba6305f68f9086320634eeea3d515be134e5e55b51c3c9f1eb568
$ docker rm -f db551b51404ba6305f68f9086320634eeea3d515be134e5e55b51c3c9f1eb568
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 127
Finished: FAILURE
When monitoring the pipeline's #tmp directory while it's running, I can see script.sh created for a short period. I am unable to tell whether it has already been deleted by the time the pipeline tries to execute it in the running container.
Some system details:
Jenkins is running as a single-node system which has Docker installed.
Jenkins v2.60.1
(all plugins fully updated)
docker --version
Docker version 17.06.0-ce, build 02c1d87
I have the same setup (single Jenkins 2.73.1 host on an EC2 instance, not inside a container, with Docker 17.09.0-ce) and the same behavior, with both declarative and scripted pipelines.
It tries to run the script on the host itself if you specify
sh 'sh ./yourscript.sh'
or
sh './yourscript.sh'
instead of sh 'script.sh'
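The difference between the two spellings comes down to PATH lookup versus an explicit path, which you can reproduce outside Jenkins (demo.sh is a made-up script):

```shell
# Create a trivial script in the current directory.
cat > demo.sh <<'EOF'
#!/bin/sh
echo it-ran
EOF
chmod +x demo.sh

# A bare name is resolved through $PATH, which normally does not include
# the current directory, so `sh 'demo.sh'`-style invocations fail:
env PATH=/usr/bin:/bin sh -c 'demo.sh' || echo "not found via PATH"

# An explicit relative path is resolved against the current directory:
./demo.sh    # prints: it-ran
```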