Automatically deploying docker-compose on DigitalOcean via GitHub

I am a newbie when it comes to Docker.
I have a web app that consists of 4 services, and I managed to create a docker-compose file for it.
I would now like to publish it.
My plan is to:
upload the whole repository, with the compose file and the source code, to a private repository on GitHub,
then create a Droplet on DigitalOcean.
I would like to be able to publish the code through GitHub alone, so that it is automatically uploaded to the server and the required services are restarted.
What would be the best approach?

Yes, there is a way: DigitalOcean's App Platform. Once you use it to deploy your Docker image, your site will be rebuilt whenever you update the image via GitHub (CI/CD).
I hope this helps.
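For the GitHub-only workflow the question describes, another common pattern (not mentioned in the answer above) is a GitHub Actions job that connects to the Droplet over SSH, pulls the repository, and restarts the services with docker compose. This is only a sketch: the action version, the secret names, and the server path are placeholders you would adapt.

```yaml
# .github/workflows/deploy.yml — hypothetical deploy workflow.
# DROPLET_HOST, DROPLET_USER, and DROPLET_SSH_KEY must be created
# as repository secrets first.
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Deploy over SSH
        uses: appleboy/ssh-action@v1.0.3  # example action/version
        with:
          host: ${{ secrets.DROPLET_HOST }}
          username: ${{ secrets.DROPLET_USER }}
          key: ${{ secrets.DROPLET_SSH_KEY }}
          script: |
            cd /srv/myapp              # wherever the repo is cloned
            git pull origin main
            docker compose up -d --build
```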

Related

Azure App Service Docker - Add file after deployment

I'm trying to get as much of my CI/CD process automated as possible. Here is what I've got at this point:
Azure App Service using a Docker container.
Azure DevOps code repository.
Right now I'm using Docker Hub as the registry for my Docker image; I can move to Azure later.
I can push code to the repo, it builds the new image and pushes it to Docker Hub, and once that's done it gets deployed to the Azure App Service just fine.
Where I'm running into issues is that we have a Laravel app being deployed via this container. With Laravel there is an .env file that I don't want to push up to the code repository. How would one go about moving a file into the container once it's been deployed?
All I've been finding is how to do it via SSH or through the startup command, but all the examples assume the file is already on the image.
Thanks for any tips/tricks/links/etc.! I've got a feeling this is one of those "ahh, that was easy" things and what I'm searching for just isn't the right verbiage.
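One direction worth noting (an assumption on my part, not from the original thread) is to skip copying the .env file entirely: Azure App Service application settings are injected into the container as environment variables, which Laravel can read when no .env entry overrides them. A sketch with placeholder resource names:

```sh
# Hypothetical resource group and app name; each setting becomes an
# environment variable inside the running container.
az webapp config appsettings set \
  --resource-group my-rg \
  --name my-laravel-app \
  --settings APP_ENV=production DB_PASSWORD="change-me"
```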

Understanding docker principles

I have made a quite simple Golang server and I need to deploy it to a DigitalOcean droplet.
I know that there can be issues with cross-building Go apps in case they use cgo, so to avoid thinking about it in the future I decided to use Docker, so my app will always be built and run in the same environment.
The first thing I don't get is about developing an app. When I create a Dockerfile, I use commands to add files from my project directory into the newly created Docker image. Then I run the container created from this image. But what if I edit my code? As I understand it, I must stop the container, remove the image, and then build it again. This is a bit tricky for such a common situation, or am I doing things wrong?
Second one: I have created a Docker droplet on DO. What's the way to deploy my app?
Do I have to push my image to a Docker repository and pull it onto the droplet?
Or can I upload it directly?
Or do I have to scp my source code to the droplet and run the same process as on my local machine, building the image and then running a container?
But what if I edit my code? As I understand it, I must stop the container, remove the image, and then build it again. This is a bit tricky for such a common situation, or am I doing things wrong?
Don't delete the image, just rebuild it. Thanks to layer caching it will be much faster than the initial build. Also, why is it tricky? It's just one or two commands, and you can create a bash or .bat script if it gets annoying.
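A minimal sketch of that rebuild loop (the image and container names are made up):

```sh
# Rebuild the image; cached layers make this fast after the first build.
docker build -t myapp .
# Stop and remove the old container, then start one from the new image.
docker rm -f myapp-server
docker run -d --name myapp-server -p 8080:8080 myapp
```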
I have created a Docker droplet on DO. What's the way to deploy my app?
All three options are possible. For the second one you would have to set up your VM as a Docker registry, which might be more than you need. Using Docker Hub isn't bad. You could also just build the image on your server. I recommend using Docker Hub for its ease, and having Watchtower set up on your server to restart your web app on new image pushes (see the sketch below).
Edit: the above advice was for a VM, not a Docker droplet. I'm not familiar with DO, but this article should help:
https://blog.machinebox.io/deploy-machine-box-in-digital-ocean-385265fbeafd
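The Watchtower setup mentioned above is a single container; a minimal sketch (the watched container name is hypothetical, and Watchtower needs the Docker socket to manage other containers):

```sh
docker run -d --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower myapp-server
```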

Docker without internet

I am currently working on a project which needs to be deployed on customer infrastructure (which is not cloud), and it will also not have internet access.
We currently deploy our application manually and install dependencies using tarballs. Can Docker help us here?
Note:
Application stack:
Node.js
MySQL
Elasticsearch
Redis
MongoDB
We will not have internet access.
You can use docker save and docker load to export Docker images in TAR format and load them back in. If you package your application files within these images, this can be used to deliver your project to your customers.
Also note that the destination machines must all have Docker Engine installed and running.
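A sketch of that offline delivery flow (image names and tags are examples):

```sh
# On a machine with registry access:
docker save -o app-images.tar myapp:1.0 mysql:8.0 redis:7
# Copy app-images.tar to the customer host (USB drive, etc.), then:
docker load -i app-images.tar
```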
If you have control over your dev environment, you can also use Nexus or GitLab as your private Docker registry. You can then pull your images from there into production, if that makes sense for your product.
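Publishing to such a private registry is just a tag plus a push; registry.example.com and the image name are placeholders:

```sh
docker tag myapp:1.0 registry.example.com/myteam/myapp:1.0
docker push registry.example.com/myteam/myapp:1.0
```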
I think the most advantage can be had in your local dev setup. Instead of installing, say, MySQL locally, you can run it as a Docker container. I use docker-compose for all such services in my current project; see the sketch below. This helps keep your computer clean, makes it easy to avoid versioning hell (if you use different versions for each release or stage), and you don't have to mess around with configuration for each dev machine.
In my previous job every developer had a local Oracle SQL install, and that was not a happy state of affairs.
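A minimal docker-compose sketch for part of the stack above (versions, credentials, and volume names are placeholders):

```yaml
# docker-compose.yml — example services for local development.
version: "3.8"
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: change-me   # placeholder credential
    volumes:
      - mysql-data:/var/lib/mysql      # persist data across restarts
  redis:
    image: redis:7
volumes:
  mysql-data:
```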

Deploy an application in the Jelastic marketplace

So basically, this is what we want to do: create an application and publish it with our Jelastic provider.
To do that, we had this idea:
create our customized Docker images locally,
create a registry at our Jelastic provider,
push our images into this registry,
create a manifest in YAML that describes how the environment should be created.
Basically, the manifest describes:
which images should be used (our image and a Jelastic storage),
shell scripts to run on the Docker nodes,
a message to pop up when the installation succeeds,
and a mail to send as well.
My question is: is that secure? I mean, if a user goes to the marketplace and chooses to deploy our application, can he get at the registry information?
Do you have any experience with this?
Thank you in advance.
Regards

Docker for setting up Bitbucket

I want to set up Bitbucket on my server.
Bitbucket needs
a web server and
a PostgreSQL DB server.
So can I set up a Docker container with all of these installed?
My goal is also to create a single script which can set up the above environment, without the need to give someone difficult instructions, download binaries individually, set up start-up scripts, and so on.
Please point me in the correct direction.
You can run both servers with official images:
Bitbucket with the image atlassian/bitbucket-server and
PostgreSQL with the image postgres.
Follow the descriptions there. Be aware that you need to use data volumes to persist the Bitbucket repositories and the PostgreSQL database files.
To link both together, an easy way would be to use Docker Compose, as sketched below. Or use
docker run --link postgres-container-name bitbucket.
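Along those lines, a hedged docker-compose sketch (the password and volume names are placeholders; check the image documentation for current options):

```yaml
# docker-compose.yml — Bitbucket Server backed by PostgreSQL.
version: "3.8"
services:
  bitbucket:
    image: atlassian/bitbucket-server
    ports:
      - "7990:7990"   # web UI
      - "7999:7999"   # SSH for Git operations
    volumes:
      - bitbucket-data:/var/atlassian/application-data/bitbucket
    depends_on:
      - postgres
  postgres:
    image: postgres
    environment:
      POSTGRES_USER: bitbucket
      POSTGRES_PASSWORD: change-me   # placeholder
      POSTGRES_DB: bitbucket
    volumes:
      - postgres-data:/var/lib/postgresql/data
volumes:
  bitbucket-data:
  postgres-data:
```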
