How to update every change in a docker container - docker

I'm new to Docker containers. I have taken the Jenkins base image, I'm running Jenkins inside a container, and I'm executing multiple jobs in Jenkins.
How do I maintain a backup, or is there any way to update the existing jobs in the image? I need to push these to Docker Hub.

Related

How to pull new docker images and restart docker containers after building docker images on gitlab?

There is an ASP.NET Core API project, with sources in GitLab.
I created a GitLab CI/CD pipeline to build a Docker image and put the image into the GitLab Docker registry
(thanks to https://medium.com/faun/building-a-docker-image-with-gitlab-ci-and-net-core-8f59681a86c4).
How do I update the Docker containers on my production system after putting the image into the GitLab Docker registry?
*by update I mean:
docker-compose down && docker pull && docker-compose up
The best way to do this is to use an image puller; plenty of open-source ones are available, or you can write your own in shell. There is one here. We use Swarm, and we use this hook concept, triggered from our CI/CD pipeline: once our build stage is done, we hit the hook URL over HTTP, and the Docker host pulls the updated image. One disadvantage of this is that you need a daemon to watch your hook task, so that it doesn't crash or go down. So my suggestion is to run this hook task as a Docker container with its restart policy set to always.
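For illustration, here is a minimal shell sketch of a polling-style puller (a simpler alternative to the hook-based daemon described above); the image name and compose directory are placeholders, not from the original answer:

#!/bin/sh
# Illustrative only: poll the registry and redeploy when the image changes.
IMAGE="registry.gitlab.com/mygroup/myapi:latest"   # placeholder image
COMPOSE_DIR="/srv/myapi"                           # placeholder compose project dir
while true; do
  before=$(docker image inspect --format '{{.Id}}' "$IMAGE" 2>/dev/null)
  docker pull "$IMAGE" >/dev/null
  after=$(docker image inspect --format '{{.Id}}' "$IMAGE")
  # Restart the stack only when a newer image was actually pulled
  if [ "$before" != "$after" ]; then
    (cd "$COMPOSE_DIR" && docker-compose down && docker-compose up -d)
  fi
  sleep 60
done

Running such a script itself as a container with a restart policy of always covers the "daemon that must not go down" concern mentioned above.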

Build a docker image on Gitlab CI/CD with alpine

I would like to build a Docker image on GitLab CI/CD with Alpine. This container has to download a website (only index.html) and save it as a date-stamped file every hour.
All the dated files should be saved in a Docker volume.
How do I start with this? I am new to Docker.
First you need to run a Docker container using any image you want (Alpine in your case).
Then set up everything in it that you want to run (like downloading the website; a minimal sketch of that step is shown after this list).
Then create a Docker image from it and host it on the GitLab Docker registry.
Then you simply have to write the .gitlab-ci.yml file and push it to your repository.
Then you need to schedule your pipeline as mentioned here:
https://docs.gitlab.com/ee/user/project/pipelines/schedules.html
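A minimal sketch of the hourly download step, assuming the target URL and the volume mount point (/data) are placeholders:

# Illustrative only: fetch index.html and store it under a date-stamped name
# in a directory backed by a Docker volume mounted at /data.
wget -O /data/index-$(date +%Y%m%d-%H%M).html https://example.com/index.html

You would bake a script like this into the image and let the scheduled pipeline (or a cron loop inside the container) run it every hour.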

Can I move docker container that includes Jenkins setups to other server?

I have a Jenkins setup in a Docker container on my local computer.
Can I move it to my company's CI server and re-use the job items?
I tried this on my local computer:
docker commit
docker push
On the CI server:
docker pull
docker run
However, when I ran Jenkins on the CI server, Jenkins came up freshly initialized.
How can I get all the configurations and job items over using Docker?
As described in the docs for the commit command:
The commit operation will not include any data contained in volumes
mounted inside the container.
The Jenkins home is mounted as a volume, thus when you commit the container the Jenkins home won't be committed. Therefore all the job configuration that is currently on the running local container won't be part of the committed image.
Your problem reduces to how you would migrate the jenkins_home volume from your machine to another machine. This problem is solved, and you can find the solution here.
I would suggest, however, a better and more scalable approach, specifically for Jenkins. The problem with the first approach is that there is quite some manual intervention needed whenever you want to start a similar Jenkins instance on a new machine.
The solution is as follows:
Commit the container that is currently running
Copy the job configuration that is inside the container using the command: docker cp <container>:/var/jenkins_home/jobs ./jobs. This will copy the job config from the running container onto your machine. Remember to clean the build folders.
Create a Dockerfile that inherits from the committed image and copies the job config under the Jenkins home.
Push the image and you should have an image that you can pull and that has all the jobs configured correctly (a build-and-push sketch follows the Dockerfile below).
The Dockerfile will look something like:
FROM <committed-image>
COPY jobs/ /var/jenkins_home/jobs/
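For completeness, a rough sketch of building and pushing that image; the repository name is a placeholder, not from the original answer:

# Run from the directory containing the Dockerfile and the copied jobs/ folder
docker build -t myrepo/jenkins-with-jobs:latest .
docker push myrepo/jenkins-with-jobs:latest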
You need to check how the Jenkins image (hub.docker.com/r/jenkins/jenkins/) was launched on your local computer: if it was mounting a local volume, that volume should include the JENKINS_HOME with all the job configurations and plugins.
docker run -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts
You need to export that volume too, not just the image.
See for instance "Docker & Jenkins: Data that Persists", using a data volume container that you can then export/import.
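As a rough illustration of that export/import step (the volume name jenkins_home and the archive name are assumptions based on the run command above):

# On the local machine: archive the contents of the jenkins_home volume
docker run --rm -v jenkins_home:/var/jenkins_home -v "$PWD":/backup alpine \
  tar czf /backup/jenkins_home.tar.gz -C /var/jenkins_home .
# On the CI server: recreate and repopulate the volume from the archive
docker run --rm -v jenkins_home:/var/jenkins_home -v "$PWD":/backup alpine \
  tar xzf /backup/jenkins_home.tar.gz -C /var/jenkins_home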

Docker commit doesn't save the changed state of my container

I am a newbie with Docker, but I have looked at many guides. I am configuring a container that runs a Jenkins base image with the Blue Ocean plugin. I started it with the docker run command, then configured my proxy information and added another plugin, the k8s plugin, through the Jenkins Manage Plugins UI. Then I stopped the container and committed it, to save the state that has the k8s plugin and the proxy information I had already set. But when I run the new Docker image that I made with the docker commit command, I can't see any proxy information or the k8s plugin. It is the same image that I started with. Is there something I missed?
JENKINS_HOME is set to be a volume in the default Jenkins Docker image (which I'm assuming you're using). Volumes live outside of the Docker container's layered filesystem. This means that any changes in those folders will not be persisted in subsequent image commits.
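A quick way to see this for yourself (the container name my_jenkins below is a placeholder):

# Show the volumes declared by the image itself
docker inspect --format '{{ .Config.Volumes }}' jenkins/jenkins:lts
# Show which mounts back your running container, including /var/jenkins_home
docker inspect --format '{{ .Mounts }}' my_jenkins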

Jenkins docker plugin + commit docker slave: how to push it to an external registry? Image saved on the docker host configured in the cloud template

I am able to start up a Jenkins Docker slave. I execute some shell commands on the slave, and after the build completes the image gets saved and tagged with the build ID of the job.
However, the image is getting saved on the Docker host machine, i.e. the host whose hostname is given in the cloud template (Docker URL).
I want the image to be pushed or saved on a different docker registry.
1. Jenkins machine
2. Docker host (hosting the Jenkins slaves)
3. Docker registry
I am using machine 1 to pull the image from 3, making changes to the image, and on a successful build I want to push the image to 3, not to 2.
Take a look at running a Docker Distribution private registry server so you can pull and push to it.
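For instance, a minimal sketch of bringing up the Distribution registry itself (the port, container name, and storage path are illustrative choices):

# Run the open-source Distribution registry, persisting its data on the host
docker run -d -p 5000:5000 --restart=always --name registry \
  -v /srv/registry-data:/var/lib/registry registry:2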
Once you get that up and running, you can:
docker login https://<URL_OF_PRIVATE_REGISTRY>
docker pull <URL_OF_PRIVATE_REGISTRY>/<IMAGE_NAME>:<IMAGE_TAG>
docker tag <LOCAL_IMAGE_NAME> <URL_OF_PRIVATE_REGISTRY>/<IMAGE_NAME>:<IMAGE_TAG>
docker push <URL_OF_PRIVATE_REGISTRY>/<IMAGE_NAME>:<IMAGE_TAG>
