Recover docker image after a gitlab-ci run - docker

Let's say I build a docker image and then run some CI build like this:
stages:
  - create_builder_image
  - test

Create Builder Image:
  stage: create_builder_image
  script:
    - export DOCKER_BRANCH_TAG=$CI_COMMIT_REF_SLUG
    # do stuff to build the image, using cache to speed it up
    - docker push $GITLAB_IMAGE/builder:$DOCKER_BRANCH_TAG

Run Tests:
  image: $GITLAB_IMAGE/builder:$CI_COMMIT_REF_SLUG
  stage: build
  script:
    # build some stuff in the image
Then I want to push the resulting image, with the built stuff inside:
docker-package:
  stage: package
  script:
    - docker commit ?
    - docker push dockerhub:latest
That may not be possible at all.
Similar to In Gitlab CI/CD, how to commit and publish the docker container that is running our stages
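One hedged alternative sketch (not from the question): instead of trying to docker commit the job's own container, the Run Tests job could save its build output as artifacts, and the package job could bake them into a fresh image built FROM the builder image. Dockerfile.package, the make build command, the artifact paths and the app image name are all assumptions:

Run Tests:
  image: $GITLAB_IMAGE/builder:$CI_COMMIT_REF_SLUG
  stage: build
  script:
    - make build                       # assumed build command
  artifacts:
    paths:
      - build/                         # assumed output directory

docker-package:
  stage: package
  image: docker:stable
  services:
    - docker:dind
  dependencies:
    - Run Tests                        # pull in the artifacts produced above
  script:
    # Dockerfile.package is assumed to FROM the builder image and COPY build/ in
    - docker build -f Dockerfile.package -t $GITLAB_IMAGE/app:$CI_COMMIT_REF_SLUG .
    - docker push $GITLAB_IMAGE/app:$CI_COMMIT_REF_SLUG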

Related

Publishing image with docker from gitlab ci

I am trying to create my war artifact with Gradle and push it to my remote image repo. The problem is that I am getting
COPY failed: stat /var/lib/docker/tmp/docker-builder756634785/build/libs/myartifact.war: no such file or directory.
So it cannot reach my artifact. How can I point it to the correct location?
# gitlab-ci.yaml
stages:
  - build

variables:
  GRADLE_OPTS: "-Dorg.gradle.daemon=false -Dorg.gradle.caching=true"

build:
  image: gradle:alpine
  stage: build
  script:
    - ./gradlew clean build -i

docker_build:
  image: docker:latest
  stage: build
  services:
    - docker:dind
  script:
    - docker build --pull -t myrepo.io/myimage:latest .
    - docker login myrepo.io -u username -p pass
    - docker push myrepo.io/myimage:latest
You need to export the artifact generated in the build job; the docker_build job can then download it (using dependencies).
This doc has a lot of examples of how to handle it: https://docs.gitlab.com/ee/ci/yaml/#artifacts
Also look at this example: https://docs.gitlab.com/ee/ci/yaml/#dependencies
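A minimal sketch of what that could look like, assuming the war ends up under build/libs/ and moving the Docker build into a later stage (the stage name and paths are assumptions):

stages:
  - build
  - docker

build:
  image: gradle:alpine
  stage: build
  script:
    - ./gradlew clean build -i
  artifacts:
    paths:
      - build/libs/                    # assumed location of myartifact.war

docker_build:
  image: docker:latest
  stage: docker
  services:
    - docker:dind
  dependencies:
    - build                            # download the build job's artifacts into this workspace
  script:
    - docker build --pull -t myrepo.io/myimage:latest .
    - docker login myrepo.io -u username -p pass
    - docker push myrepo.io/myimage:latest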

Check docker run in Gitlab CICD pipeline

I'm using Gitlab CI/CD to build Docker images of our Node server.
I am wondering if there is a way to test that docker run of the image was ok.
We've had a few occasions where the Docker image builds but is missing some files/env variables, and then it fails to start the server.
Is there any way to run the docker image and test if it is starting up correctly in the CI/CD pipeline?
Cheers.
With GitLab you are able to use a Docker runner.
When you use the Docker runner, and not a shell runner, the image and its services have to start up, and the job should give an error if something fails.
Check the GitLab docs on images and services. This is a classic yml example from that page:
default:
  image:
    name: ruby:2.2
    entrypoint: ["/bin/bash"]
  services:
    - name: my-postgres:9.4
      alias: db-postgres
      entrypoint: ["/usr/local/bin/db-postgres"]
      command: ["start"]
  before_script:
    - bundle install

test:
  script:
    - bundle exec rake spec
As you can see, the test section is executed after the image is brought up, so you should not have to worry about it: GitLab should detect any errors when loading the image.
If you are doing it with a shell gitlab-runner, you should start the Docker image yourself, like this:
stages:
  - dockerStartup
  - build
  - test
  - deploy
  - dockerStop

job 0:
  stage: dockerStartup
  script:
    - docker build -t my-docker-image .
    - docker run my-docker-image /script/to/run/tests

[...] # your jobs here

job 5:
  stage: dockerStop
  script: docker stop whatever
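To answer the original question directly (checking that the image actually starts), a smoke-test job along these lines could be added; the container name, port and health endpoint are assumptions:

smoke_test:
  stage: test
  script:
    - docker run -d --name smoke -p 3000:3000 my-docker-image   # assumed Node server port
    - sleep 10                                                   # give the server time to boot
    - docker inspect -f '{{.State.Running}}' smoke | grep true   # fail if the container already exited
    - curl --fail http://localhost:3000/health                   # assumed health-check endpoint
  after_script:
    - docker rm -f smoke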

Gitlab CI Jib plugin build Docker image

I am using Jib to create a Docker container and push it to the registry. Before pushing the image to the GitLab registry, I would like to build a Docker image that can be used for container scanning. The issue I am facing is that I cannot use the maven image for the build, as it has no Docker daemon running, and I cannot use the docker image, as it doesn't have Maven. Is there any way to address this without creating a custom Docker image?
Here is my .gitlab-ci.yml file related to this part:
Building:
  image: docker:19.03.1 # or maven:3-jdk-8
  stage: build
  only:
    - master
  script:
    - echo "Building the project"
    - mvn compile jib:dockerBuild
In the case of the docker image:
/bin/sh: eval: line 91: mvn: not found
In the case of the maven image:
Build to Docker daemon failed, perhaps you should make sure Docker is installed and you have correct privileges to run it
1. You can build with Jib using mvn compile jib:build and then make the Docker image and push it to the registry in the next steps.
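A hedged sketch of that first option, assuming the GitLab container registry and standard Jib properties (the image name and credential variables are assumptions):

Building:
  image: maven:3-jdk-8
  stage: build
  only:
    - master
  script:
    # jib:build pushes straight to the registry, so no Docker daemon is needed here
    - mvn compile jib:build
        -Dimage=$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA
        -Djib.to.auth.username=$CI_REGISTRY_USER
        -Djib.to.auth.password=$CI_REGISTRY_PASSWORD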
2. Alternatively, try running Docker-in-Docker so that the GitLab runner can use Docker images to support the pipeline, with docker:dind as a service:
image: docker:latest

services:
  - docker:dind

Building:
  image: maven:3-jdk-8
  stage: build
  only:
    - master
  script:
    - echo "Building the project"
    - mvn compile jib:dockerBuild

Gitlab CI - Build Docker Image With Shared Runner (cannot connect to Docker Daemon)

I am currently using GitLab shared runners to build and deploy my project (at least I'm trying to!).
I have the gitlab-ci.yml below:
image: java:8-jdk

stages:
  - build
  - package

before_script:
  - export GRADLE_USER_HOME=`pwd`/.gradle
  - docker info

cache:
  paths:
    - .gradle/wrapper
    - .gradle/caches

build:
  stage: build
  script:
    - ./gradlew build
  artifacts:
    paths:
      - build/libs/*.jar
    expire_in: 1 week
  only:
    - master

docker-build:
  image: docker:stable
  services:
    - docker:dind
  stage: package
  script:
    - docker build -t registry.gitlab.com/my-project .
    - docker push registry.gitlab.com/my-project

after_script:
  - echo "End CI"
First, the build stage works great, but there is a problem in the second stage when I try to build and push my Docker image. I get this log:
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
It seems that GitLab is using a shared runner that can't build a Docker image, but I don't know how I can change that. I cannot change the configuration of my runner, because I'm using shared runners. I also tried adding tags to my second stage, hoping that a more suitable runner would pick up my job, but I'm still getting this error.
Thank you for your help.
I believe you need to set DOCKER_HOST to connect to the DinD running in another container:
docker-build:
  image: docker:stable
  services:
    - docker:dind
  stage: package
  script:
    - export DOCKER_HOST=tcp://docker:2375/
    - docker build -t registry.gitlab.com/my-project .
    - docker push registry.gitlab.com/my-project
If your shared runner executor is of type docker, you may try this setup:
stages:
  - build
  - package

before_script:
  - export GRADLE_USER_HOME=`pwd`/.gradle
  - docker info

cache:
  paths:
    - .gradle/wrapper
    - .gradle/caches

build:
  image: java:8-jdk
  stage: build
  script:
    - ./gradlew build
  artifacts:
    paths:
      - build/libs/*.jar
    expire_in: 1 week
  only:
    - master

docker-build:
  stage: package
  script:
    - docker build -t registry.gitlab.com/my-project .
    - docker push registry.gitlab.com/my-project

after_script:
  - echo "End CI"
We faced the same problem in our org. We found that there is a long-standing issue with Docker-in-Docker on GitLab, which can be tracked in issues #3612, #2408 and #2890.
In our case, Docker socket binding suited our use case better than Docker-in-Docker, so we used the solution from the official page.
I know this has already been answered, but this may help someone who has a similar use case :)
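For reference, a sketch of the socket-binding registration the answer refers to. It requires administering your own runner, so it does not apply to GitLab's shared runners; the executor image is an assumption:

gitlab-runner register \
  --executor docker \
  --docker-image docker:stable \
  --docker-volumes /var/run/docker.sock:/var/run/docker.sock   # jobs talk to the host daemon instead of dind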

How to rebuild docker image on push before CI script jobs

I want to generate a Dockerfile in the GitLab CI script and build it, then use this newly generated image in the build jobs. How can I do this? I tried using a global before_script, but it already starts in the default container; I need to do this outside of any container.
before_script is run before every job, so it's not what you want. But you can have a first job do the image build, taking advantage of the fact that each job can use a different Docker image. Building the image is covered in the manual.
Option A (uhm... sort of OK)
Have 2 runners, one with a shell executor (tagged shell) and one with a Docker executor (tagged docker). You would then have a first stage with a job dedicated to building the docker image. It would use the shell runner.
image_build:
  stage: image_build
  script:
    - # create dockerfile
    - # run docker build
    - # push image to a registry
  tags:
    - shell
The second job would then run using the runner with docker executor and use this created image:
job_1:
  stage: test
  image: [image you created]
  script:
    - # your tasks
  tags:
    - docker
The problem with this is that the runner would need to be part of the docker group, which has security implications.
Option B (better)
The second option does the same but uses only one runner, with the Docker executor. The Docker image is built within a running container (the gitlab/dind:latest image), i.e. the "docker in docker" solution.
stages:
  - image_build
  - test

image_build:
  stage: image_build
  image: gitlab/dind:latest
  script:
    - # create dockerfile
    - # run docker build
    - # push image to a registry

job_1:
  stage: test
  image: [image you created]
  script:
    - # your tasks
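A hedged, concrete version of Option B's placeholders, using the docker:stable image with the docker:dind service rather than the older gitlab/dind image; the generated Dockerfile contents, the image tag and the GitLab registry variables are assumptions:

stages:
  - image_build
  - test

image_build:
  stage: image_build
  image: docker:stable
  services:
    - docker:dind
  script:
    - printf 'FROM alpine:3.18\nRUN apk add --no-cache bash curl\n' > Dockerfile   # generate the Dockerfile in the job
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build -t $CI_REGISTRY_IMAGE/builder:$CI_COMMIT_REF_SLUG .
    - docker push $CI_REGISTRY_IMAGE/builder:$CI_COMMIT_REF_SLUG

job_1:
  stage: test
  image: $CI_REGISTRY_IMAGE/builder:$CI_COMMIT_REF_SLUG
  script:
    - bash --version   # your tasks, run inside the freshly built image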

Resources