Jenkins deploy into multiple OpenShift environments and ALM maintenance - jenkins

I'm a bit of a newbie with Jenkins, so I have this question.
I am currently working on a project about CD. We are using Jenkins to build a Docker image, push it to the registry and then deploy it to OpenShift. Although this process works like a charm, there is a tricky problem I'd like to solve: there is not just 1 OpenShift environment/region where I want to deploy these images, but 3 (and increasing).
This is how we are currently doing it:
Setting region tokens as secret text
$region_1 token1
$region_2 token2
$region_3 token3
Then
build $docker_image
push $docker_image to registry
deploy into Region1.ip.to.openshift:port -token $region_1
deploy into Region2.ip.to.openshift:port -token $region_2
deploy into Region3.ip.to.openshift:port -token $region_3
Thus, in case we need to add any new "region" to the Jenkins jobs, we have to edit every job manually...
Since the number of Docker images and the number of OpenShift regions/environments are both increasing, we are looking for a way to "automate" this, or at least make it as easy as possible, when it comes to adding a new OpenShift region, since ALL the jobs (old and new) must deploy their images into the new environments/regions...
I have been reading documentation for a while, but Jenkins is so powerful and has so many features/options that I somehow get lost reading all the docs...
I don't know if a Pipeline or something similar would help...
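For reference, something like the sketch below is roughly what I have in mind: a single list of regions that every job loops over. The region entries, credential IDs and oc commands here are just placeholders, not our real values:

```groovy
// Illustrative sketch only: region entries, credential IDs and the oc
// commands are placeholders. Adding a new region would then just mean
// adding one entry to this list (or loading the list from a shared
// library / config file shared by all jobs).
def regions = [
    [name: 'region1', url: 'https://region1.ip.to.openshift:port', credId: 'region_1-token'],
    [name: 'region2', url: 'https://region2.ip.to.openshift:port', credId: 'region_2-token'],
    [name: 'region3', url: 'https://region3.ip.to.openshift:port', credId: 'region_3-token'],
]

pipeline {
    agent any
    stages {
        stage('Build and push image') {
            steps {
                sh "docker build -t my-registry/my-image:${env.BUILD_NUMBER} ."
                sh "docker push my-registry/my-image:${env.BUILD_NUMBER}"
            }
        }
        stage('Deploy to all regions') {
            steps {
                script {
                    regions.each { region ->
                        withCredentials([string(credentialsId: region.credId, variable: 'TOKEN')]) {
                            // log in with the region's token and roll out the new image
                            sh "oc login ${region.url} --token=\$TOKEN"
                            sh "oc set image dc/my-app my-app=my-registry/my-image:${env.BUILD_NUMBER}"
                        }
                    }
                }
            }
        }
    }
}
```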
Any help is welcome :)

Related

Simple CICD workflow for small-scale deployments?

I work for a small startup. We have 3 environments (Production, Development, and Staging) and GitHub is used as VCS.
All environments run on EC2 with Docker.
Can someone suggest a simple CI/CD solution that can trigger builds automatically after certain branches are merged, plus a manual trigger option?
For example: if anything is merged into dev-merge, build and deploy to Development; the same for Staging, pushing the image to ECR and rolling out the Docker update.
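Concretely, regardless of the tool, the logic we want is roughly the following (sketched in Jenkins pipeline syntax only because it's a familiar notation; branch names, image names and deploy commands are placeholders):

```groovy
// Illustration of the desired flow only; all names are placeholders
// (assumes a multibranch-style setup where each branch is built).
pipeline {
    agent any
    stages {
        stage('Build image') {
            steps {
                sh 'docker build -t my-app:$GIT_COMMIT .'
            }
        }
        stage('Deploy to Development') {
            when { branch 'dev-merge' }
            steps {
                sh './deploy.sh development'   // placeholder deploy script
            }
        }
        stage('Deploy to Staging') {
            when { branch 'staging' }
            steps {
                // push to ECR and roll out the updated container
                sh 'docker tag my-app:$GIT_COMMIT $ECR_REPO:latest'
                sh 'docker push $ECR_REPO:latest'
                sh './deploy.sh staging'       // placeholder deploy script
            }
        }
    }
}
```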
We tried Jenkins but we felt it was over-complicated for our small-scale infra.
GitHub Actions was also evaluated (self-hosted runners), but it needs the workflow YAMLs to live in the repos.
We are looking for something that gives us the option to modify the pipeline or overall flow without repository-hosted CI/CD config (like the way Jenkins gives the option to either use a Jenkinsfile or configure the job manually via the GUI).
Any opinions about TeamCity?

Auto deploy docker images on push

First, I'm a noob with Continuous Deployment. I currently have a VPS running 3 Docker containers (Flask, MongoDB, Nginx) that I'm pulling from Docker Hub with docker-compose. What I want to do is auto-deploy those 3 containers when pushing code to my GitHub repo. I think it's possible with Ansible, but I've never used it.
Can someone explain to me how to do it?
Many thx!
Finally I will use Jenkins :)
That implies a webhook, as explained in "How to Integrate Your GitHub Repository to Your Jenkins Project" by Guy Salton
And that means your Jenkins server is accessible through an internet-facing public URL, which is not always obvious when working in a corporate environment.
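Assuming the webhook is in place and the GitHub plugin is installed, a minimal pipeline that reacts to push events could look like this sketch (the repository URL and the build step are placeholders):

```groovy
// Minimal sketch relying on the GitHub plugin's push-webhook trigger.
// The repository URL and the build step are placeholders.
pipeline {
    agent any
    triggers {
        githubPush()   // fires when GitHub delivers a push webhook for this job's repo
    }
    stages {
        stage('Checkout and build') {
            steps {
                git url: 'https://github.com/your-user/your-repo.git', branch: 'main'
                sh 'docker-compose build'   // or whatever builds/publishes your images
            }
        }
    }
}
```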
GitHub Actions "Publishing Docker images" can help publishing the image to DockerHub, but you still need to listen/detect those events in order for your Jenkins to trigger job pulling said ppublished images.
For that, a regular sheduler Jenkins job using regclient/regclient can help checking the latest published SHA2 image ID has or has not changed.
See more with "Container Registry Management with Brandon Mitchell: DevOps and Docker (Ep 108)".
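As a rough sketch of that scheduled check (the image name, the digest state file and the downstream job name are hypothetical):

```groovy
// Hypothetical polling job: the image name, the digest state file and the
// downstream job name are placeholders. Keeping the last digest in the
// workspace is a simplification (it assumes a stable workspace/agent).
pipeline {
    agent any
    triggers {
        cron('H/15 * * * *')   // check the registry every ~15 minutes
    }
    stages {
        stage('Check published digest') {
            steps {
                script {
                    // regctl is the CLI shipped with regclient/regclient
                    def digest = sh(script: 'regctl image digest docker.io/myorg/myapp:latest',
                                    returnStdout: true).trim()
                    def previous = fileExists('last-digest.txt') ? readFile('last-digest.txt').trim() : ''
                    if (digest != previous) {
                        writeFile file: 'last-digest.txt', text: digest
                        // a new image was published: trigger the job that pulls and deploys it
                        build job: 'deploy-myapp', parameters: [string(name: 'IMAGE_DIGEST', value: digest)]
                    } else {
                        echo 'No new image published.'
                    }
                }
            }
        }
    }
}
```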

Best Practices for Installing Jenkins Instance / Pre-configured Jenkins from scratch

The Jenkins landscape is vast, and new progress is difficult to keep track of, especially if you are not a regular DevOps person.
I am currently in the process of setting up a Jenkins CI system from scratch. I am looking for the best possible way to get the Jenkins instance up and running. I have looked at options such as running it from the JAR, setting it up as a service, Docker, Blue Ocean, etc.
I was wondering if you could please share your experience: is there a pre-configured setup or a scalable Jenkins solution already available on the market which is ready to be configured/deployed?
One of the key tenants of this Jenkins instance would be the test automation folks running their Selenium tests (I am ideally looking at a Windows Server installation, although CentOS is an option), and I would like to make working with it as easy as possible for them.
I'm a Jenkins admin. In my company I've set up Jenkins on our Kubernetes cluster using the Helm chart, with a custom Docker image preloaded with plugins (you don't want to rely on the plugin update site during startup). All configuration is done with the Configuration as Code plugin. We're using the Kubernetes plugin for horizontal scaling. No builds are allowed on the build controller; everything runs in agents, which are custom Docker images inspired by these images. This works very well, and I'm very happy with the setup. There is also a Jenkins Kubernetes Operator which looks promising, but I haven't tried it myself.
If you're not on Kubernetes, you can take a look at the Jenkins Evergreen project.
PS: The Blue Ocean project is dead, but the folks over at CloudBees are currently in the process of overhauling the UX. They just released a weekly version where they got rid of all tables, so the design is slowly becoming more responsive, and a new set of icons is also coming up.
Maybe the nearest you can get to a pre-configured Jenkins instance is using the Docker image (https://hub.docker.com/r/jenkins/jenkins). But even with the Docker image, you have to select plugins and so on. Maybe you want to raise an issue as a proposal in the Jenkins Docker repository to make it possible to pre-configure Jenkins (GitHub repo: https://github.com/jenkinsci/docker/issues)?

Does Jenkins (not Jenkins X) have gitops support?

I am trying to set up Kubernetes for my company. I have looked a good amount into Jenkins X and, while I really like the roadmap, I have come to the realization that it is likely not mature enough for my company to use at this time. (The UI being in preview, a flaky command line, random IP address needs and poor Windows support are a few of the issues that have led me to that conclusion.)
But I understand that the normal Jenkins is very mature and can run on Kubernetes. I also understand that it can have dynamically created build agents run in the cluster.
But I am not sure about GitOps support. When I try to google it ("gitops jenkins") I get a bunch of information that includes Jenkins X.
Is there an easy(ish) way for normal Jenkins to use GitOps? If so, how?
Update:
By GitOps, I mean something similar to what Jenkins X supports. (Meaning changes to the cluster stored in a Git repository. And merging causes a deployment.)
I mean something similar to what Jenkins X supports. (Meaning changes to the cluster stored in a Git repository. And merging causes a deployment.)
Yes, this is what Jenkins (or other CI/CD tools) do. You can declare a deployment pipeline in a Jenkinsfile that is triggered on merge (commit to master) and have other steps for other branches (if you want).
I recommend deploying with kubectl using kustomize and storing the config files in your Git repository. You parameterize different environments, e.g. staging and production, with overlays. You may, for example, deploy with only 2 replicas in staging but with 6 replicas and more memory resources in production.
Using Jenkins for this, I would create a docker agent image with kubectl, so your steps can use the kubectl command line tool.
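A minimal sketch of such a pipeline, assuming a kustomize layout with overlays/staging and overlays/production and a hypothetical agent image that ships kubectl:

```groovy
// Sketch only: the agent image name, overlay paths and branch names are assumptions.
pipeline {
    agent {
        // a custom agent image that has kubectl (and kustomize) installed
        docker { image 'my-registry/kubectl-agent:latest' }
    }
    stages {
        stage('Deploy to staging') {
            when { branch 'develop' }
            steps {
                // kubectl -k applies the kustomize overlay for staging
                sh 'kubectl apply -k overlays/staging'
            }
        }
        stage('Deploy to production') {
            when { branch 'master' }
            steps {
                sh 'kubectl apply -k overlays/production'
            }
        }
    }
}
```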
Jenkins on Kubernetes
But I understand that the normal Jenkins is very mature and can run on Kubernetes. I also understand that it can have dynamically created build agents run in the cluster.
I have not had the best experience with this. It may work - or it may not work so well. I currently host Jenkins outside the Kubernetes cluster. I think that Jenkins X together with Tekton may be an upcoming promising solution for this, but I have not tried that setup.

Continuous deployment with docker

I'm currently working with a stack that allows me to automate parts of my integration/deployment system.
Currently I work as follows:
I push my code to a GitHub repository
Jenkins watches the repo, builds the software and launches the unit tests
If the unit tests (or other kinds of tests, anyway) pass, it notifies Rundeck to deploy to my servers (3 in my case) by connecting via SSH and telling them: "hey guy, you have to pull from GitHub, a new software version is available"; then it restarts the concerned service and my software is now up to date
Okay, tell me if I'm wrong, but it seems to be a good solution, right?
Then I wanted to containerize my applications, and now I have some headaches.
First solution
In fact, I was wondering about something like:
Push to GitHub
Jenkins tests and builds the Docker image
Rundeck pushes to Docker Hub and tells the 3 servers, over SSH, to pull the new image from the hub and run it
Problem: it will run in another container (multiple docker runs of the same image, but with different versions :( )
Second solution
The second solution was to:
Push to GitHub
Jenkins tests and tells Rundeck that the tests succeeded, without creating a "real build" (only one for testing)
Rundeck connects to the running container through SSH and asks it to pull the modifications, then it restarts the Docker container
Problem: I am forced to run SSH in all my containers
I don't know how to get around these problems, or what the best solution is...
Thanks for your help
I don't see any problem with solution 1.
1. Build the production version with Jenkins
2. Push it (via Jenkins) to your private Docker registry
3. Tell Rundeck/Ansible/Chef/Puppet to ask the 3 servers to pull the latest image and restart the container.
However, it's highly recommended to have a deployment strategy that considers blue-green principles and rollbacks in case something crashes.
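For illustration, a minimal Jenkinsfile covering those three steps (the registry address, image name, server and the SSH hand-off are placeholders; in practice step 3 would call Rundeck/Ansible/etc., and tagging each build uniquely keeps older images available for rollback):

```groovy
// Sketch only: registry address, image name, server and the SSH command
// are placeholders for whatever your deployment tooling actually does.
pipeline {
    agent any
    environment {
        // a unique tag per build keeps older images around for rollbacks
        IMAGE = "registry.example.com/myapp:${env.BUILD_NUMBER}"
    }
    stages {
        stage('Build production image') {
            steps {
                sh 'docker build -t $IMAGE .'
            }
        }
        stage('Push to private registry') {
            steps {
                sh 'docker push $IMAGE'
            }
        }
        stage('Trigger deployment') {
            steps {
                // pull the new tag on the target host and replace the running container
                sh 'ssh deploy@server1 "docker pull $IMAGE && (docker rm -f myapp || true) && docker run -d --name myapp $IMAGE"'
            }
        }
    }
}
```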
