I'm running Docker from Jenkins, and the console output doesn't reflect the actual state of the operation because Docker is using an interactive shell.
How can I make Docker write to stdout in non-interactive mode (similar to Maven's -B flag)?
Thanks!
--progress plain should do the trick.
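For example, on a docker build with BuildKit enabled, plain progress output prints each build step line by line instead of redrawing the terminal (the image name here is just a placeholder):
DOCKER_BUILDKIT=1 docker build --progress=plain -t myimage .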
I'm a beginner with Docker as well as with TeamCity. I set up a pipeline that builds a Docker image and wanted to configure it to run the container after a successful build. I tried to use a Docker build step, but the advice is to use a Command Line step with the executable parameter, or some approach involving the Docker socket. I've scoured the Internet and YouTube but found no clear examples of starting a container after a build. I saw some examples of launching with agents, but again I didn't understand what was written there, and I couldn't find examples on YouTube either. Please give an example of running Docker as a step in a pipeline on Linux.
I solved a similar requirement on Jenkins by applying the following.
Add a shell file (e.g. run.sh) to your project. In it, put the docker run command that you would use from the command line, appending > /dev/null 2>&1 & so that the process runs in the background and its output streams are redirected to /dev/null.
docker run --name some-mysql -e MYSQL_ROOT_PASSWORD=my-secret-pw -d mysql:tag > /dev/null 2>&1 &
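Putting that together, run.sh might look like this (a minimal sketch; the image tag and password are the same placeholders as above):
#!/bin/sh
# Run detached, silence both output streams, and background the command
# so the build step can return immediately.
docker run --name some-mysql -e MYSQL_ROOT_PASSWORD=my-secret-pw -d mysql:tag > /dev/null 2>&1 &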
Then, in your Jenkins (TeamCity) script, add a sh step to run this file:
steps {
    dir('whatever-dir-run.sh-is-in') {
        sh "JENKINS_NODE_COOKIE=dontKillMe sh run.sh"
    }
}
Note: if JENKINS_NODE_COOKIE has an equivalent in TeamCity, use that.
Here's the situation:
I have a Docker container (Jenkins). I've mounted the Docker socket into my container so that I can perform docker commands inside my Jenkins container.
Manually, everything works in the container. However, when Jenkins executes the job, it doesn't "wait" for the docker exec command to run to completion.
Below is an extract from the Jenkinsfile. The short-lived printenv command runs correctly and prints the environment variables. The next command (python) just gets started, and Jenkins moves on immediately without waiting for completion. The Jenkins agent (slave) is running on an Ubuntu image. Running all these commands outside Jenkins works as expected.
echo "Running the app docker container in detached tty mode to keep it up"
docker run --detach --tty --name "${CONTAINER_NAME}" "${IMAGE_NAME}"
echo "Listing environment variables"
docker exec --interactive "${CONTAINER_NAME}" bash -c "printenv"
echo "Running test coverage"
docker exec --interactive "${CONTAINER_NAME}" bash -c "python -m coverage run --source . --branch -m pytest -vs"
It seems it may be related to this question.
Please can anyone explain how to get Jenkins to wait for the docker exec command to complete before proceeding to the next step?
I have considered alternatives, like the Docker Pipeline Plugin, but would much prefer to use something close to what I have above where possible.
OK, as another approach, I've tried using the Docker Pipeline plugin here.
You can mount docker.sock as a volume to orchestrate containers on your host machine, like this in your docker-compose.yml:
volumes:
  - /var/run/docker.sock:/var/run/docker.sock
Depending on your setup you might need to run
chmod 666 /var/run/docker.sock
to get going in the first place.
This works on macOS as well as Linux.
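For context, a minimal docker-compose.yml sketch with that mount in place (the service name and image are just placeholders):
version: "3"
services:
  jenkins:
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"
    volumes:
      # Share the host's Docker daemon socket with the container
      - /var/run/docker.sock:/var/run/docker.sock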
Ugh. This was down to the way I'd set up Docker support on the slave container.
I'd used socat to expose the Docker daemon through a TCP proxy. Instead, I switched that out for a plain old docker.sock volume mount between host and container.
volumes:
  - /var/run/docker.sock:/var/run/docker.sock
The very first time, I had to also sort out a permissions issue by doing (inside the container):
rm -Rf ~/.docker
chmod 666 /var/run/docker.sock
After that, everything "just worked". Very painful experience.
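If you start the Jenkins container with docker run rather than Compose, the equivalent mount looks like this (a sketch; the container name and image are placeholders):
docker run -d --name jenkins \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkins/jenkins:lts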
I'm trying to run a command on a container, like this: docker-compose exec xyz, from a .gitlab-ci.yml file.
The error, which I don't understand, reads the input device is not a TTY, and then it exits.
How can I troubleshoot this?
A TTY is effectively STDIN: you're executing a command (I'm guessing with the -it flags) that expects some input from STDIN after the exec command (like typing a password, or executing bash commands in a running container). As it's a build pipeline, no terminal is attached, so it errors because you haven't provided anything on STDIN. Otherwise, can you please provide some more info about your input?
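In practice, the fix in .gitlab-ci.yml is usually to drop -it, or to pass -T so Compose skips pseudo-TTY allocation entirely. A sketch, with xyz standing in for your service name:
test:
  script:
    # -T disables pseudo-TTY allocation, which CI runners don't provide
    - docker-compose exec -T xyz sh -c "echo hello from xyz"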
I am using EC2 UserData to bootstrap the instance.
Tracking the log of the bootstrap execution in /var/log/cloud-init-output.log, I found that the script stopped at:
+ docker-compose exec web python /var/www/flask/app/db_fixtures.py
the input device is not a TTY
It seems like this command is running in interactive mode, but why? And how can I force non-interactive mode for this command (docker-compose exec)?
Citing from the docker-compose exec docs:
Commands are by default allocating a TTY, so you can use a command such as docker-compose exec web sh to get an interactive prompt.
To disable this behavior, you can either use the -T flag to disable pseudo-tty allocation:
docker-compose exec -T web python /var/www/flask/app/db_fixtures.py
Or set the COMPOSE_INTERACTIVE_NO_CLI environment variable to 1 before running docker-compose exec:
export COMPOSE_INTERACTIVE_NO_CLI=1
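In an EC2 UserData script, that would look something like this (a sketch; the paths mirror the log excerpt above):
#!/bin/bash
# -T disables pseudo-TTY allocation, so the command works without a terminal
docker-compose exec -T web python /var/www/flask/app/db_fixtures.py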
I have a Jenkins job with the following commands under "Execute shell":
ssh jenkins@172.31.12.58
pwd
I want the Jenkins server to connect via SSH to the remote server then run a command on the remote server.
Instead, Jenkins connects to the remote server, disconnects immediately, then runs the pwd command locally as can be seen in the output:
Started by user Johanan Lieberman
Building in workspace /var/lib/jenkins/jobs/Test Github build/workspace
[workspace] $ /bin/sh -xe /tmp/hudson266272646442487328.sh
+ ssh jenkins@172.31.12.58
Pseudo-terminal will not be allocated because stdin is not a terminal.
+ pwd
/var/lib/jenkins/jobs/Test Github build/workspace
Finished: SUCCESS
Edit: Any idea why the subsequent commands after the ssh command aren't run inside the SSH shell, but rather run locally instead?
If you're not running interactively, SSH does not allocate a pseudo-terminal (hence the "Pseudo-terminal will not be allocated" warning you see), so it's not quite the same as executing a sequence of commands in an interactive terminal. As for the edit: each line of the build script is executed by the local shell, so ssh with no remote command just opens a session that exits immediately (there's no terminal and nothing on stdin), and the local shell then carries on and runs pwd locally.
To run a specific command through an SSH session, use:
ssh jenkins@YOUR_IP 'uname -a'
The remote command must be quoted properly as a single argument to the ssh command. Or use the bash here-doc syntax for a simple multi-line script:
ssh jenkins@YOUR_IP <<EOF
pwd
uname -a
EOF
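Note that with an unquoted EOF delimiter, variables inside the here-doc are expanded by the local shell before being sent; quote the delimiter if you want them expanded on the remote host instead:
ssh jenkins@YOUR_IP <<'EOF'
# $HOME is expanded on the remote host, not by the local shell
echo "$HOME"
uname -a
EOF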
I think you can use the Publish Over SSH plugin to execute commands on a slave with SSH.
If the Source files field is mandatory, maybe you can transfer a dummy file.
Update:
Another solution is to use the SSH plugin. Maybe it's a better solution compared to the other plugin :)