I want to know the process of setting up a Docker registry on the Artifactory On-Prem solution; I have failed to install it even after repeated tries.
I followed the official documentation for the on-prem setup but had no luck getting it installed.
So if there were a single document that walked through the entire setup, that would be great.
I tried the steps on an Nginx web server with the subdomain method (with the help of the reverse proxy config generator).
P.S. - I want to try this locally in a LAN-type environment, if not on AWS.
Does anyone have an idea of how to deploy the Prefect UI and backend on an Azure Web App server using Docker/docker-compose?
I need to deploy a Prefect workflow on the Azure Web App service.
Thanks for your help.
The Prefect blog has a post about deploying the server to Google Cloud Compute; the steps should be similar.
Prefect blog: Prefect Server 101
Edit: As the OP points out, the GCP example is not a Docker instance. Adding links I found previously related to Docker.
PrefectHQ GitHub issue: Docker Compose on a VM
A GitHub user's Kubernetes Helm chart configuration might help
I can't currently find the issue on the PrefectHQ GitHub that specifically covered Prefect server configuration when running on Docker, but it's a good place to look.
I'm trying to set up the deployment of Docker images to a Linux server (Debian 10).
I have looked around the internet for an easy solution to deploy images from a Docker repository onto a server automatically.
I know that Docker Hub has webhooks.
Also, there is an option to use Kubernetes, but it seems to be a bit too much for a simple application running on one server.
What I am looking for is a way for the server to detect that a Docker image has been updated, so that it downloads it and runs the newest version.
Currently, I have set up automatic builds of Docker images on Azure DevOps that are pushed to a private repository on Docker Hub (I will most likely move to a privately hosted Nexus repository).
I am looking for suggestions on how to do this with relatively low complexity (e.g., should I use docker-compose for it or some sort of bash script on the server).
The closest thing to what I am looking for is this solution: How to auto deploy Docker Image on own server with GitLab?
I would like to know if this is the recommended way to do it, or whether there are other, possibly easier ways to approach it.
I found this project that looks good as a solution for my case.
https://containrrr.github.io/watchtower/
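Watchtower runs as a container of its own, watches the Docker socket, and when a newer version of a running container's image appears in the registry, it pulls the image and restarts the container. A minimal docker-compose sketch of running it; the poll interval, the credentials path, and the restart policy are assumptions to adjust against the Watchtower documentation:

```yaml
# Minimal Watchtower sketch; interval and credentials path are example values.
version: "3"
services:
  watchtower:
    image: containrrr/watchtower
    command: --interval 300                   # check the registry every 5 minutes
    restart: unless-stopped
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - /home/deploy/.docker/config.json:/config.json   # credentials for a private registry (path is an example)
```

With this running on the Debian server, containers started from your private Docker Hub (or Nexus) images should be updated automatically whenever a new image is pushed.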
I'm using GitLab CI, configured with a docker+machine executor, to build and test my app on spot instances.
My main app requires a few microservices to be available in production as well as in the test step. All of these microservices are built and tested on the same GitLab CI server (each in its own pipeline). The output of each microservice is a Docker image that is pushed to the GitLab Docker Registry.
The test step I'm trying to build:
1. Provision a spot instance (if there's no idle one), installed with the microservice Docker container
2. Test step
2.1. Provision a spot instance (if there's no idle one), installed with the app Docker container
2.2. Run the testing script
2.3. Stop the app container, release the spot instance
3. Stop the microservice container, release the spot instance
I've got 2.1, 2.2, and 2.3 working by following the instructions here, but I'm not sure how to achieve the rest. I can run docker-machine explicitly in the YAML, but I'd like to use GitLab's docker+machine executor, as it's configured with the credentials, limitations, off-peak settings, etc.
Is this possible with GitLab's executor? How?
What's the "correct" way to go about doing something like this? I'm sure I'm not the first one testing with microservices, but I couldn't find any information on how to do so.
You are probably looking for the CI Services functionality. They have a couple of examples of how to use a service (MySQL, PostgreSQL, Redis). If you use another Docker image as a service, it will get a hostname derived from the image name (e.g., tutum/wordpress will be reachable under the DNS hostnames tutum-wordpress and tutum__wordpress; for more information, refer to the details about hostnames).
There are also details about running PostgreSQL under the shell executor, if you were so inclined, and there is a presentation on testing things with GitLab CI and Docker.
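For the microservice-plus-app scenario above, a hedged sketch of what the test job might look like in .gitlab-ci.yml; the image names, alias, port, and test script are placeholders rather than anything taken from the question:

```yaml
# Hypothetical test job; image names, alias, port, and script are placeholders.
test:
  stage: test
  image: registry.gitlab.com/my-group/my-app:latest          # app image under test
  services:
    - name: registry.gitlab.com/my-group/my-microservice:latest
      alias: my-microservice                                  # reachable under this hostname
  script:
    - ./run-tests.sh --service-url http://my-microservice:8080
```

With the docker+machine executor, the runner provisions the (spot) machine, starts the service container next to the job container, and tears everything down when the job finishes, so steps 1 and 3 from the list above should not need to be scripted by hand.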
I am currently working on a project that needs to be deployed on customer infrastructure (which is not in the cloud) and will not have internet access.
We currently deploy our application manually and install dependencies using tarballs; can Docker help us here?
Note:
Application stack:
Node.js
MySQL
Elasticsearch
Redis
MongoDB
We will not have internet.
You can use docker save to export Docker images as TAR archives and docker load to import them again. If you package your application files within these images, this can be used to deliver your project to your customers.
Also note that the destination machines must all have Docker Engine installed and running.
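A rough sketch of that flow (image names and tags below are only examples):

```bash
# On a machine with internet access: build the application image and bundle it
# together with the backing images into one archive.
docker build -t myapp:1.0 .
docker save -o myapp-bundle.tar myapp:1.0 mysql:8.0 redis:7 mongo:7
# Add the remaining backing images (Elasticsearch, etc.) to the same command as needed.

# On the offline customer server, after copying the archive over:
docker load -i myapp-bundle.tar
docker run -d --name myapp myapp:1.0
```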
If you have control over your dev environment, you can also use Nexus or GitLab as your private Docker registry. You can then pull your images from there into production, if it makes sense for your product.
I think the biggest advantage is in your local dev setup. Instead of installing, say, MySQL locally, you can run it as a Docker container. I use docker-compose for all client services in my current project. This keeps your computer clean, makes it easy to avoid versioning hell (if you use different versions for each release or stage), and means you don't have to mess around with configuration on each dev machine.
In my previous job every developer had a local Oracle SQL install, and that was not a happy state of affairs.
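For the stack listed in the question, a local development docker-compose.yml could look roughly like the following; versions, ports, and credentials are placeholders to adapt:

```yaml
# Hypothetical local-dev compose file; image versions and credentials are placeholders.
version: "3.8"
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: devpassword
    ports:
      - "3306:3306"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
  mongo:
    image: mongo:7
    ports:
      - "27017:27017"
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
    ports:
      - "9200:9200"
```

docker-compose up -d starts the backing services, and the Node.js app running on the host can reach them on localhost.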
I want to set up Bitbucket on my server.
Bitbucket needs
a web server and
a PostgreSQL database server.
So can I set up a Docker container with all of these installed?
My goal is also to create a single script that can set up the above environment, without the need to give someone complicated instructions, download binaries individually, configure start-up, etc.
Please point me in the correct direction.
You can run both servers with official images:
Bitbucket with the image atlassian/bitbucket-server and
PostgreSQL with the image postgres.
Follow the descriptions there. Be aware that you need to use data volumes to persist the Bitbucket repositories and the PostgreSQL database files.
To link both together, an easy way would be to use Docker Compose, or use
docker run --link postgres-container-name bitbucket.
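A minimal docker-compose sketch of that setup; the ports, volume paths, and credentials are assumptions to check against the official image documentation:

```yaml
# Hypothetical docker-compose.yml; ports, volume paths, and credentials are example values.
version: "3.8"
services:
  postgres:
    image: postgres
    environment:
      POSTGRES_USER: bitbucket
      POSTGRES_PASSWORD: changeme
      POSTGRES_DB: bitbucket
    volumes:
      - postgres-data:/var/lib/postgresql/data
  bitbucket:
    image: atlassian/bitbucket-server
    depends_on:
      - postgres
    ports:
      - "7990:7990"   # web UI
      - "7999:7999"   # SSH for Git operations
    volumes:
      - bitbucket-data:/var/atlassian/application-data/bitbucket
volumes:
  postgres-data:
  bitbucket-data:
```

A single docker-compose up -d then brings up both containers; in Bitbucket's setup wizard you would point the database connection at the hostname postgres.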