Simple setup: one VPS with Portainer running a container instance of my service.
A GitHub repository holds my code.
Goal: the simplest way to automatically build the project on merge to master via GitHub Actions workflows,
and deploy the newest image to the VPS.
Building the image can happen on the VPS itself.
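A minimal sketch of such a workflow, assuming SSH access to the VPS and the third-party appleboy/ssh-action; the checkout path, container name, port, and secret names are placeholders, not taken from the question:

```yaml
# .github/workflows/deploy.yml -- sketch only; paths and names are placeholders
name: deploy
on:
  push:
    branches: [master]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Build and restart on the VPS
        uses: appleboy/ssh-action@v1.0.3   # third-party action; pin whichever release is current
        with:
          host: ${{ secrets.VPS_HOST }}
          username: ${{ secrets.VPS_USER }}
          key: ${{ secrets.VPS_SSH_KEY }}
          script: |
            cd /opt/my-service              # hypothetical clone of the repo on the VPS
            git pull origin master
            docker build -t my-service:latest .
            docker rm -f my-service || true
            docker run -d --name my-service --restart unless-stopped -p 8080:8080 my-service:latest
```

Since the build runs on the VPS itself, no image registry is needed; Portainer will simply show the recreated container.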
Related
I work for a small startup. We have 3 environments (Production, Development, and Staging) and GitHub is used as VCS.
All environments run on EC2 with Docker.
Can someone suggest a simple CI/CD solution that can trigger builds automatically after certain branches are merged, with a manual trigger option as well?
For example, if anything is merged into dev-merge, build and deploy to development; likewise for staging, pushing the image to ECR and rolling out a Docker update.
We tried Jenkins but we felt it was over-complicated for our small-scale infra.
GitHub Actions was also evaluated (with self-hosted runners), but it needs YAML files to live in the repos.
We are looking for something that lets us modify the pipeline or overall flow without repo-hosted CI/CD config (the way Jenkins lets you either use a Jenkinsfile or configure the job manually via the GUI).
Any opinions about TeamCity?
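For comparison's sake, the branch-to-environment routing described above is only a few lines in a GitHub Actions workflow, though it does have to live in the repo. A hedged sketch with placeholder branch names, runner label, and deploy commands:

```yaml
# Sketch only -- branch names, runner label, and rollout commands are placeholders
name: build-and-deploy
on:
  push:
    branches: [dev-merge, staging]
  workflow_dispatch:          # the manual trigger option
jobs:
  deploy:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4
      - name: Build, push to ECR, roll out
        run: |
          ENV=development
          if [ "$GITHUB_REF_NAME" = "staging" ]; then ENV=staging; fi
          echo "building image, pushing to ECR, updating Docker on the $ENV EC2 host"
          # docker build / aws ecr get-login-password / docker push / ssh rollout here
```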
I have a total of 8 Node.js services in 8 different repositories on Bitbucket. The services share some common code that lives in another repository called Brain. I want to create Bitbucket Pipelines in all of these repositories so that I can do the following (a sketch follows the list):
Build the Docker image for each service and store it in Google Container Registry
Use the ssh-run pipe or similar to SSH into my GCE VM and run docker-compose pull and docker-compose up to deploy the latest versions
Perform zero-downtime updates, i.e., keep the old containers running until the new ones are ready
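A per-repo bitbucket-pipelines.yml along these lines could cover the first two points; every name, repository variable, and the pipe version here is an assumption to verify:

```yaml
# Hypothetical bitbucket-pipelines.yml for one service repo
pipelines:
  branches:
    master:
      - step:
          name: Build and push to GCR
          services: [docker]
          script:
            # GCLOUD_KEY holds a service-account JSON key (repository variable)
            - echo "$GCLOUD_KEY" | docker login -u _json_key --password-stdin https://gcr.io
            - docker build -t "gcr.io/$GCP_PROJECT/my-service:$BITBUCKET_COMMIT" .
            - docker push "gcr.io/$GCP_PROJECT/my-service:$BITBUCKET_COMMIT"
      - step:
          name: Deploy on the GCE VM
          script:
            - pipe: atlassian/ssh-run:0.4.1   # check the current pipe version
              variables:
                SSH_USER: $VM_USER
                SERVER: $VM_HOST
                COMMAND: 'cd /opt/stack && docker-compose pull my-service && docker-compose up -d --no-deps my-service'
```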
What would be the best way of doing this? Currently I'm facing the following problems:
When I push changes to the Brain repository, I need to build images for all the services. Most of the time I'm pushing to both the Brain repository and some other service repository, so the same images end up being built more than once.
When I push changes to the Brain repository and the different service images get built, all of them try to deploy via ssh-run at the same time. I don't know if this is sustainable; could it crash my VM?
Any suggestions would be appreciated. Thanks in advance!
I have a monolithic application that I am hosting on Google Cloud.
I am using Cloud Build, which builds my Docker image when I push to my repository.
Other than using Kubernetes, what options do I have for pushing my latest Docker image to my web instances in a rolling update, so the website doesn't go down?
I can't seem to find any documentation that isn't Kubernetes-related.
I believe I should be building an instance template that has my latest Docker image, but I'm not sure how to make that happen in an automated fashion.
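One non-Kubernetes route is a managed instance group: Cloud Build creates a new container-based instance template and then starts a rolling update against the group. A sketch, assuming an existing managed instance group; the group name, zone, and image path are placeholders:

```yaml
# cloudbuild.yaml sketch -- group name, zone, and image path are placeholders
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/web:$SHORT_SHA', '.']
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/web:$SHORT_SHA']
  # New instance template pointing at the freshly built image
  - name: 'gcr.io/cloud-builders/gcloud'
    args: ['compute', 'instance-templates', 'create-with-container', 'web-$SHORT_SHA',
           '--container-image=gcr.io/$PROJECT_ID/web:$SHORT_SHA']
  # Rolling update replaces instances one by one, keeping the site up
  - name: 'gcr.io/cloud-builders/gcloud'
    args: ['compute', 'instance-groups', 'managed', 'rolling-action', 'start-update',
           'web-group', '--version=template=web-$SHORT_SHA', '--zone=us-central1-a']
```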
I am trying to implement a CI/CD pipeline for my Spring Boot microservice deployment. I plan to use Jenkins and Kubernetes for the pipeline, and I have one SVN repository for version control.
Nature Of Application
One microservice needs to be deployed for multiple tenants. The code is the same, but the database configuration differs from tenant to tenant, and I manage that configuration with Spring Cloud Config Server.
My Requirement
My requirement is that when I commit code to my SVN repository, Jenkins needs to pull the code, build the project (Maven), create Docker images for the multiple tenants, and deploy them.
So a commit to one repository needs to produce multiple Docker images from the same codebase: one repo, multiple image builds. The Dockerfile carries different configuration for each image, i.e., for each tenant. In short, I need Jenkins to build multiple Docker images, each with different tenant configuration added in the Dockerfile, from a single code repo.
My Analysis
I am currently planning to do this by adding multiple Jenkins pipeline jobs connected to the same repo. Within each pipeline job I can add tenant-specific configuration, because the image name needs to be kept different per tenant and each image needs to be pushed to Docker Hub.
My Confusion
Here is my confusion:
Can I add multiple pipeline jobs from the same code repository in Jenkins?
If I can, how do I deploy the image for every tenant to Kubernetes? Do I need separate jobs for deployment, or is one job enough?
You seem to be going about it a bit wrong.
Since your code is the same for all tenants and only the configuration differs, you would be better off building a single Docker image and deploying it along with tenant-specific configuration when deploying to Kubernetes.
So a change in your repository triggers one Jenkins build and produces one Docker image. Then you can have either multiple Jenkins jobs or multiple pipeline steps that deploy that image with tenant-specific config to Kubernetes (see the sketch after the answers below).
If you don't want to heed the above, here are the answers to your questions:
You can create multiple pipelines from the same repository in Jenkins (select New Item > Pipeline multiple times).
You can keep a list of tenants and loop through it, or run all the deployments in parallel, within a single pipeline stage.
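To illustrate the single-image approach: one Deployment per tenant, all pointing at the same image, with the tenant selected purely through configuration. A minimal sketch, assuming a Spring profile per tenant that the config server resolves to the right database settings; all names here are hypothetical:

```yaml
# deployment-tenant-a.yaml -- same image for every tenant, only env differs
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myservice-tenant-a
spec:
  replicas: 1
  selector:
    matchLabels:
      app: myservice
      tenant: tenant-a
  template:
    metadata:
      labels:
        app: myservice
        tenant: tenant-a
    spec:
      containers:
        - name: myservice
          image: myorg/myservice:latest    # the one image Jenkins builds and pushes
          env:
            - name: SPRING_PROFILES_ACTIVE
              value: tenant-a              # config server returns tenant-a's DB settings
```

The Jenkins pipeline then just templates this manifest (or a Helm chart) once per tenant, in a loop or a parallel stage.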
I read "How does Docker compare to OpenShift?"
But I have a question:
This is an extremely simplified description of what devs usually do with OpenShift:
Select a "pod" (let's say a JBoss/Wildfly container)
From within Openshift you point to your github repo
Openshift would clone the repo, build it and deploy it
Openshift present you with a web URL to access this repo port 8080
There's of course a lot more going on but that's as simple as it gets
Is this setup doable in my own linux box, VM or a cloud instance (Docker Container --> clone, build and deploy from git repo)? What would I need without messing too much with networking and domains etc?
from my research I see the following tools:
Kubernetes
Dokku: I see it described as "your own Heroku"
I also keep hearing about CaaS (Containers as a Service)
I understand I would need another tool or process for the build (CI/CD) capability, and for triggering builds with git push.
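For the core clone/build/deploy/URL loop, plain Docker Compose on the box already covers it; the CI trigger is the only extra piece. A sketch, assuming the repo ships a Dockerfile and the app listens on 8080:

```yaml
# docker-compose.yml in the cloned repo -- after `git clone <repo>`,
# `docker compose up -d --build` rebuilds and redeploys; a CI job or a
# git hook re-running that command gives you the push-to-deploy part.
services:
  app:
    build: .                 # builds from the repo's Dockerfile (e.g. a WildFly image)
    ports:
      - "8080:8080"          # reachable at http://<your-box>:8080
    restart: unless-stopped
```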