Docker stack deploy not deploying the image that is built after code changes in the repo - docker

I have created jobs for build and deploy.
The jobs run without errors and the project is deployed successfully.
The problem is that even though the repository has changed, the deployment does not show the changes when tested in the browser, although the workspace successfully pulls the latest code.
A few solutions I have tried:
Building the Docker image without using the cache
Clearing the workspace before starting the build
Nothing seems to work. Where might things have gone wrong?
My project structure:
Project directory: contains the whole Laravel project
Dockerfile: to build the image
docker-compose.yml: to deploy the service in the docker stack
I am running the service in a docker stack using the docker stack deploy command.
I tried deleting the previously built docker stack and re-creating a new one, naming the stack uniquely with the build ID, but that didn't solve it either.
When trying to remove the stack and create it again, I get these errors:
Failed to remove network 7bkf0kka11s08k3vnpfac94q8: Error response from daemon: rpc error: code = FailedPrecondition desc = network 7bkf0kka11s08k3vnpfac94q8 is in use by task lz2798yiviziufc5otzjv5y0g
Failed to remove some resources from stack: smstake43
Why is the docker stack not picking up the updated image? It looks like it is due to a digest issue; how can that be solved if so? Even though the changes are pushed to the Bitbucket repo, the deployment does not show them. Only when I redo the whole setup from scratch does it pick up the latest code.
This is the bash script for the build step in the Jenkins job:
#!/bin/sh
docker build -t smstake:latest .
docker stack deploy -c docker-compose.yml smstake
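One possible direction (a sketch only, not a confirmed fix): because the service always references smstake:latest, Swarm may treat the service definition as unchanged even though a new image was built. Tagging each build uniquely, for example with the Jenkins build ID, forces the stack to roll out the new image. The BUILD_ID variable and the sed step below are assumptions about this project's setup.
#!/bin/sh
# Illustrative only: tag the image with the Jenkins build number so the
# service definition changes on every deploy (BUILD_ID is assumed to be
# provided by Jenkins).
docker build -t smstake:"${BUILD_ID}" .
# Point the compose file at the freshly built tag before deploying.
sed -i "s|image: smstake:.*|image: smstake:${BUILD_ID}|" docker-compose.yml
docker stack deploy -c docker-compose.yml smstake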

Related

Cloud Build docker build has a different build output to local docker build?

I've run into a very strange issue where a Dockerfile fails at one of its steps when built on GCP Cloud Build.
However, it builds locally just fine.
What might be the cause of the issue? Why would there be any difference?
The actual command that fails is an npm build inside the container.
It turned out to be a .env file that I had locally but that was not present in the repository because of a gitignore rule.
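A quick way to spot this kind of local-versus-CI drift (a hedged aside, not part of the original answer) is to list files that exist in your working tree but are ignored by git, since those will be missing from the clean checkout that Cloud Build uses:
# Show file status including ignored files such as a local .env
git status --short --ignored
# Dry run: list only the ignored files that a clean checkout would not contain
git clean -ndX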

Web development workflow using Github and Docker

I learnt the basics of GitHub and Docker and both work well in my environment. On my server, I have project directories, each with a docker-compose.yml to run the necessary containers. These project directories also contain the actual source files for that particular app, which are mapped to virtual locations inside the containers on startup.
My question now is: how do I create a professional workflow to encapsulate all of this? Should the whole directory (including the docker-compose files) live on GitHub? Then each time changes are made I would push the code to my remote, SSH to the server, pull the latest files, and rebuild the container. This rebuilding of course means pulling the required images from Docker Hub each time.
Should the whole directory (including the docker-compose files) live on github?
It is best practice to keep all source code, including Dockerfiles and configuration, versioned. Thus you should put the source code, the Dockerfile, and the docker-compose file in a git repository. This is very common for projects on GitHub that ship a Docker image; one common layout is sketched below.
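As an illustration only (the answer does not prescribe a specific layout), a single repository holding both the application and its Docker configuration often looks like this:
my-app/
├── Dockerfile
├── docker-compose.yml
└── src/            # the application source code mapped or copied into the container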
Thus each time changes are made I push the code to my remote, SSH to the server, pull the latest files and rebuild the container
Ideally this process should be encapsulated in a CI workflow using a tool like Jenkins. You push the code to the git repository,
which triggers a Jenkins job that compiles the code, builds the image, and pushes the image to a Docker registry, roughly as sketched below.
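As a rough illustration of that build-and-publish step (the image name and registry URL are placeholders, not from the original answer), the Jenkins job could run something along these lines:
#!/bin/sh
# Hypothetical CI build step: build the image, tag it with the commit,
# and push it to a registry so servers can pull it instead of rebuilding.
COMMIT=$(git rev-parse --short HEAD)
docker build -t myregistry.example.com/myapp:"$COMMIT" .
docker push myregistry.example.com/myapp:"$COMMIT"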
This rebuilding of course means pulling the required images from dockerhub each time.
Docker is smart enough to cache the base images that have been previously pulled. Thus it will only pull the base images once on the first build.
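For example (the base image here is just an illustration), repeating a pull makes the caching visible:
# The first run downloads the base image's layers; later runs report that
# the image is up to date and transfer nothing.
docker pull php:8.2-apache
docker pull php:8.2-apache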

How can I structure my docker projects for easy deployment?

Right now I have multiple components of my application in the same folder linked together with a docker-compose
This works really well in development, but when I want to push to production it's kind of fuzzy. If I keep this structure I cannot use only Docker Hub to host my images, because the docker-compose file which links them will be missing. If I use git to pull down my docker-compose file, what would be the point of Docker Hub? Why not just clone my whole repo and run docker-compose up each time?
I could alternatively store each component separately in separate GitHub repos, pushing them up to Docker Hub when pushed to master. Then, simply combine them from the Hub with a docker-compose file. This seems less than ideal too, since one would have to clone and push to several different repos to make a change that affects the system.
How do you do it?
I split my projects into two parts: the source code and the configuration files (Dockerfiles, docker-compose files, ...).
I put the Dockerfile and docker-compose file in a folder structured like yours and push it to a git repository. The source code (and other data) I manage by hand, with separate git repositories that I push to and pull from whenever something needs updating.
Be careful with the production server: update small parts instead of the whole server.
Check out the new (still experimental) docker-app (June 2018)
It will allow you to push your docker-compose to DockerHub, as well as launch your app (through docker-compose) with settings variations between dev and prod.
See example:
You can create an Application Package based on this Compose file:
$ docker-app init --single-file hello
$ ls
docker-compose.yml
hello.dockerapp
The new file hello.dockerapp contains three YAML documents:
metadata
the Compose file
settings for your application
See "Sharing your application on the Hub"
You can push any application to the Hub using docker-app push:
$ docker-app push --namespace myHubUser --tag latest
This command will create an image named myHubUser/hello.dockerapp:latest on your local Docker daemon, and push it to the Hub.

Docker Hub - Automated Build failed, but local build without problems

I have created the repository ldaume/docker-highcharts-server in the Docker Hub Registry, which is connected to a GitHub repository that contains the Dockerfile.
If I build the image locally it works like a charm.
But the automated build fails with the error Unknown Build Error. and no logs. The only content I can see in the build information is the Dockerfile, so Docker had no problems with GitHub ;).
Any ideas?
In my scenario I had a file in my current directory that was not checked in, but my Dockerfile COPYed it into the container, so the build only worked locally. Have you looked into that possibility?
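A quick way to check for this (a hedged suggestion, not from the original answer) is to run the same build from a fresh clone, which mimics the clean checkout the automated build uses; any file that only exists in your working directory will then surface as a failing COPY or build step. The clone URL and image tag below are placeholders:
# Build from a fresh clone to mimic the automated build environment
git clone https://github.com/your-user/docker-highcharts-server /tmp/fresh-clone
cd /tmp/fresh-clone
docker build -t highcharts-server-test .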

Docker command to fetch dockerfile from registry

I'm new to Docker and I wonder why there is no command to fetch an AUTOMATED BUILD repo's Dockerfile and build the image locally from it (it can be convenient sometimes, I guess, instead of opening a browser, looking for the GitHub reference on the repo's page and then using git to clone).
I have created dockerfileview to fetch Dockerfile from Docker Hub.
https://github.com/remore/dockerfileview
An automated build normally has a GitHub repo behind it and links to the original repository in the build details section under the Source Repository heading. Which automated build are you looking for the source file of?
If you would like to search for images from the command line, you can run docker search TERM to find images (but not their Dockerfiles). You can also use docker history to get a rough approximation of the commands that went into the Dockerfile.
e.g.
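(The commands below are an illustrative sketch; the image name is a placeholder, not from the original answer.)
# List images on Docker Hub matching a search term (Dockerfiles are not included)
docker search nginx
# Show an image's layer history as a rough stand-in for its Dockerfile
docker history --no-trunc nginx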
