Is there any way to write a script in Jenkins so that, when the script is called or triggered, Jenkins automatically takes the latest WAR of my project and deploys it to a specific container?
This is different from the "Deploy to container" step that is called after the build as a post-step.
Yes, it is possible: you can simply cp the artifact from your build directory to your deployment directory.
In my case I start my container with a bind-mounted volume (the -v parameter); when the Jenkins job builds, it generates the package and I move it to the mounted Docker volume.
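A minimal sketch of that idea as a Jenkins shell build step, assuming Jenkins can reach the deployment directory (for example because it runs on the same host), a Tomcat container started with its webapps directory bind-mounted from /srv/deployments, and a Maven build that produces target/*.war (all of these names and paths are hypothetical):

# container started beforehand, e.g.:
#   docker run -d --name app-tomcat -v /srv/deployments:/usr/local/tomcat/webapps tomcat:9
# Jenkins shell build step: copy the freshly built WAR into the mounted directory;
# Tomcat then picks it up and (re)deploys it
cp target/*.war /srv/deployments/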
I'm running a CentOS 8 Linux container and I want to continuously update a .jar file inside that container. To update the .jar I want to set up a job in Jenkins, and Jenkins runs on a separate host (not as a Docker container).
So basically I think I want to create a job that would ssh from Jenkins to the Docker container and apply the changes to the .jar file. Is this possible?
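One way to sketch this, assuming the Docker host is reachable over ssh as docker-host, the container is named app, and the jar lives at /opt/app/app.jar inside it (all of these names are hypothetical and ssh credentials must already be set up), is to copy the new jar to the host and then push it into the container with docker cp:

# Jenkins shell build step (sketch): ship the new jar to the Docker host,
# copy it into the running container, then restart the container so it picks up the new jar
scp target/app.jar jenkins@docker-host:/tmp/app.jar
ssh jenkins@docker-host "docker cp /tmp/app.jar app:/opt/app/app.jar && docker restart app"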
Is there a way I can copy files from a Docker container to the Jenkins workspace while tests are running, i.e. not as a pre- or post-build step?
Currently Docker runs on a server within the organisation, and when I kick off a Jenkins job (a Maven project), it runs the tests inside the container mentioned above.
During the tests, files are downloaded, and I would like to be able to access those files in the Jenkins workspace during execution. So I tried the following as part of my code:
docker cp [containerName]:/home/seluser/Downloads /var/jenkins_home/jobs/[jobName]/workspace
But the files don't get copied over to the workspace. I have also tried doing this locally, i.e. getting the files copied to a directory on my laptop:
docker cp [containerName]:/home/seluser/Downloads /Users/[myUsername]/testDownloads
and it worked. Is there something I'm missing about how to do this for the Jenkins workspace?
Try adding /. to the source path, like this:
docker cp [containerName]:/home/seluser/Downloads/. /var/jenkins_home/jobs/[jobName]/workspace/
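The /. tells docker cp to copy the contents of the Downloads directory rather than the directory itself. As a sketch of a slightly more portable variant, you could also use the WORKSPACE environment variable that Jenkins sets for every build, so the destination does not depend on the job name or the Jenkins home layout:

docker cp [containerName]:/home/seluser/Downloads/. "$WORKSPACE/"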
I have configured my Jenkins to run our build jobs and functional tests in a Docker container. For example, when I click the "Build Now" button, Jenkins builds the Dockerfile that is in Git and runs the container, so the build steps (Jenkinsfile) can be executed in this container.
My question now is: how can I specify a start script for my container in the Jenkins pipeline?
Thanks for any tips.
You can pass a string (i.e. the start script name) as an input parameter in your Jenkins pipeline, so that you can conditionally specify which script to run.
Then you can mount the appropriate script directory at runtime in your Jenkins pipeline configuration:
docker run -v /path/to/scripts:/scripts imageName /scripts/${START_SCRIPT}.sh
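A minimal declarative Jenkinsfile sketch of that idea, assuming a string parameter named START_SCRIPT, a host directory /path/to/scripts holding the candidate scripts, and an image called myImage (all hypothetical):

pipeline {
    agent any
    parameters {
        // name of the start script to run inside the container (without .sh)
        string(name: 'START_SCRIPT', defaultValue: 'start', description: 'Start script to run in the container')
    }
    stages {
        stage('Run container') {
            steps {
                // mount the scripts directory and run the selected script as the container command
                sh "docker run --rm -v /path/to/scripts:/scripts myImage /scripts/${params.START_SCRIPT}.sh"
            }
        }
    }
}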
I have a Jenkins setup in a Docker container on my local computer.
Can I move it to the company's CI server and re-use the job items?
I tried this on my local computer:
docker commit
docker push
On the CI server:
docker pull
docker run
However, when I ran Jenkins on the CI server, it started with a fresh, freshly initialized configuration.
How can I get all the configurations and job items using Docker?
As described in the docs for the commit command:
The commit operation will not include any data contained in volumes
mounted inside the container.
The Jenkins home is mounted as a volume, so when you commit the container the Jenkins home won't be committed. Therefore all the job configuration that is currently on the running local container won't be part of the committed image.
Your problem reduces to migrating the jenkins_home volume from your machine to the other machine. This problem is solved and you can find the solution here.
I would suggest, however, a better and more scalable approach, specifically for Jenkins. The problem with the first approach is that quite some manual intervention is needed whenever you want to start a similar Jenkins instance on a new machine.
The solution is as follows:
Commit the container that is currently running.
Copy the job configuration out of the container with: docker cp <container-name>:/var/jenkins_home/jobs ./jobs. This copies the job config from the running container onto your machine. Remember to clean out the build folders.
Create a Dockerfile that inherits from the committed image and copies the job config under jenkins_home.
Push the image, and you should have an image that you can pull which has all the jobs configured correctly.
The Dockerfile will look something like:
FROM <committed-image>
COPY jobs/ /var/jenkins_home/jobs/
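To finish the last step, a short sketch of the build and push commands (the image name and registry are hypothetical):

# build the image containing the job configuration and push it to your registry
docker build -t my-registry/jenkins-with-jobs:latest .
docker push my-registry/jenkins-with-jobs:latest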
You need to check how the Jenkins image (hub.docker.com/r/jenkins/jenkins/) was launched on your local computer: if it was mounting a local volume, that volume should include the JENKINS_HOME with all the job configurations and plugins.
docker run -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts
You need to export that volume too, not just the image.
See for instance "Docker & Jenkins: Data that Persists", using a data volume container that you can then export/import.
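One common way to export the volume (a sketch, assuming the volume is named jenkins_home as in the run command above) is to archive its contents from a throwaway container, copy the archive to the CI server, and restore it into a volume there:

# on the local machine: archive the jenkins_home volume into the current directory
docker run --rm -v jenkins_home:/var/jenkins_home -v "$(pwd)":/backup alpine \
  tar czf /backup/jenkins_home.tar.gz -C /var/jenkins_home .

# on the CI server: restore the archive into a fresh jenkins_home volume
docker run --rm -v jenkins_home:/var/jenkins_home -v "$(pwd)":/backup alpine \
  tar xzf /backup/jenkins_home.tar.gz -C /var/jenkins_home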
I'm having some difficulty configuring Jenkins to run tests on a dockerized application.
First, here is my setup: the project is on Bitbucket, and I have a docker-compose file that runs my application, which for now is composed of three containers (one for Mongo, one for Redis, one for my Node app).
The Bitbucket webhook works well and Jenkins is triggered when I push.
However, what I would like a build to do is:
check out the repo where my docker-compose file is, run docker-compose so that my cluster is up, then run "npm test" inside the repo (my tests use Mocha), and finally have Jenkins notified of whether the tests passed or not.
If someone could help me get this chain of operations applied by Jenkins, it would be awesome.
The simplest way is to use the Jenkins Pipeline plugin or a shell script.
To build the Docker images and run the cluster you can use the docker-compose command. One important thing is that you need to rebuild the images at the compose level (if you only run docker-compose up/run, Jenkins may reuse a previously built image), so run docker-compose build first.
Your Dockerfile should copy all of your application's files.
Then, once your services are up, you can run a command inside a container with: docker exec {CONTAINER_ID} {COMMAND_TO_RUN_TESTS}.
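Putting the chain together as a Jenkins shell build step, a minimal sketch (assuming the Node app's service in docker-compose.yml is called app; that name is hypothetical):

# rebuild the images so Jenkins doesn't reuse stale ones, then start the cluster
docker-compose build
docker-compose up -d

# run the Mocha tests inside the app container; a non-zero exit code fails the Jenkins build
docker-compose exec -T app npm test

# tear the cluster down afterwards
docker-compose down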