OpenShift Jenkins find out current namespace in pipeline

We have Jenkins deployed in OpenShift, and we also run our workers in OpenShift. My question is: is there a way to find out the current namespace where Jenkins is running?
Our situation is that we have many namespaces for different projects, and in each of those namespaces a separate instance of Jenkins is running. We need to know the namespace where Jenkins is running in order to load some project-dependent values and settings.
A simple oc project yields just "Not currently on any project".
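One common approach, assuming the Jenkins pod runs with a service account token mounted (the default in OpenShift), is to read the namespace from the service account secret inside the pod. A minimal scripted-pipeline sketch:

    node {
        stage('Detect namespace') {
            // Kubernetes mounts the pod's namespace here whenever a
            // service account token is mounted (the default).
            def ns = sh(
                script: 'cat /var/run/secrets/kubernetes.io/serviceaccount/namespace',
                returnStdout: true
            ).trim()
            echo "Running in namespace: ${ns}"
            // Hypothetical: load project-dependent settings keyed by namespace,
            // e.g. readYaml file: "config/${ns}.yaml" (Pipeline Utility Steps plugin).
        }
    }

The same file can be read from a plain shell step; oc project only works once a project has been selected in the client config, which is why it reports "Not currently on any project" here.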

Related

Move Jenkins config from one Kubernetes cluster to another

I have inherited a Jenkins installation which is used by multiple remote teams and runs on an Amazon EKS cluster. For reasons that are not really relevant, I need to move this Jenkins workload to a new EKS cluster.
Deploying Jenkins itself is not a major issue; I am doing so using Helm. The persistence of the existing Jenkins deployment is bound to an Amazon EBS volume, and the persistence of the new deployment will be as well. The mount point will be /var/jenkins_home.
I'm trying to find a simple way of migrating everything from the current jenkins installation and configuration to the new one. This includes mainly:
Authorization Strategy (RBAC)
Jobs
Plugins
Cloud and Agent Config
I know that everything required is most likely in Jenkins Home. Could I in theory just dump out the current Jenkins Home folder and import it into the new running Jenkins container using kubectl cp or something like that? Is there an easier way? Is this unsafe in some way?
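In principle, yes. A rough sketch of the copy itself, with hypothetical pod names, namespaces and kubectl contexts (stop or scale down Jenkins first so files are not written to mid-copy; kubectl cp also needs tar available inside the container):

    # Copy the old JENKINS_HOME out of the running pod on the old cluster...
    kubectl --context old-cluster cp jenkins/jenkins-0:/var/jenkins_home ./jenkins_home_backup

    # ...and into the Jenkins pod on the new cluster.
    kubectl --context new-cluster cp ./jenkins_home_backup jenkins/jenkins-0:/var/jenkins_home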

Jenkins Docker image building for Different Tenant from same code repository

I am trying to implement a CI/CD pipeline for my Spring Boot microservice deployment. I plan to use Jenkins and Kubernetes to build the CI/CD pipeline, and I have one SVN code repository for version control.
Nature Of Application
My application is one microservice that needs to be deployed for multiple tenants. The code is the same, but the database configuration is different for each tenant, and I am managing the configuration using Spring Cloud Config Server.
My Requirement
My requirement is that when I commit code to my SVN repository, Jenkins needs to pull the code, build the project (Maven), create Docker images for multiple tenants, and deploy them.
The point is that a commit to this one code repository needs to trigger multiple Docker image builds from the same repo: one code repo, multiple image-build processes. The Dockerfile contains different config for each image, i.e. for each tenant. So my requirement is to build multiple Docker images for the different tenants, with the different configuration added in the Dockerfile, from one code repo using Jenkins.
My Analysis
I am currently planning to do this by adding multiple Jenkins pipeline jobs connected to the same code repo. Within each pipeline job I can add a different configuration, because the image name for each tenant needs to be kept different and the images need to be pushed to Docker Hub.
My Confusion
My confusion is:
Can I add multiple pipeline jobs from the same code repository in Jenkins?
If I can, how do I deploy the image for every tenant to Kubernetes? Do I need to add separate jobs for deployment, or is one single job enough to deploy?
You seem to be going about it a bit wrong.
Since your code is the same for all tenants and the only difference is the config, you would be better off creating a single Docker image and deploying it along with the tenant-specific configuration when deploying to Kubernetes.
So a change in your repository will trigger one Jenkins build and produce one Docker image. Then you can have either multiple Jenkins jobs or multiple steps in the pipeline which deploy the Docker image with tenant-specific config to Kubernetes.
If you don't want to heed the above, here are the answers to your questions:
You can create multiple pipelines from the same repository in Jenkins (select New Item > Pipeline multiple times).
You can keep a list of tenants and just loop through it, or run all deployments in parallel in a single pipeline stage, as sketched below.
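A minimal scripted-pipeline sketch of that stage; the tenant names and manifest layout are made up for illustration:

    node {
        def tenants = ['tenant-a', 'tenant-b', 'tenant-c']  // hypothetical list
        stage('Deploy per tenant') {
            def branches = [:]
            for (t in tenants) {
                def tenant = t  // capture the loop variable for the closure
                branches[tenant] = {
                    // Same image for every tenant; only the manifests/config differ.
                    sh "kubectl apply -f k8s/${tenant}/"
                }
            }
            parallel branches  // or run the deploys sequentially inside the loop
        }
    }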

CI/CD pipeline deployment flow for test and prod environment

I am trying to implement a CI/CD pipeline for my microservice deployment, created in Spring Boot. I am trying to use my SVN repository, Kubernetes and Jenkins to implement the pipeline. While exploring deployment using Kubernetes and Jenkins, I found tutorials and many videos on deploying to both test and prod environments by defining the steps in the Jenkinsfile, and also by adding shell scripts in the Jenkins configuration.
Confusion
My doubt is this: when we deploy into the test environment, how can we deploy the same build into the prod environment after proper testing is finished? Do I need to add a separate shell script for prod? Or are we deploying serially using one script for both test and prod?
It's completely up to you how you want to do this. In general, we create separate k8s clusters for prod and staging (etc.), and your Jenkins needs to deploy to a different cluster depending on the pipeline stage. If you want true CI/CD, then one pipeline is enough; it will deploy to both clusters (or environments).
Most of the time businesses don't want CI on production (for obvious reasons). They want manual testing on QA environments before it's deployed to prod.
As k8s is container-based, deploying the same image to different envs is really easy: you build your Spring Boot app once, and then deploy the resulting image to different envs as needed.
A simple pipeline:
Code pushed and build triggered.
Build with unit tests.
Generate the docker image and push to registry.
Run your kubectl / helm / etc. to deploy the newly built image on STAGING.
Check if the deployment was successful.
If you want to deploy the same to prod, continue the pipeline with the following (you can pause here for QA as well: https://jenkins.io/doc/pipeline/steps/pipeline-input-step/):
Run your kubectl / helm / etc. to deploy the newly built image on PRODUCTION.
Check if the deployment was successful.
If your QA needs more time, then you can also create a different Jenkins job and trigger it manually (even the QA engineers can trigger this).
If your QA and PM are techies, then they can also merge branches or close PRs, which can auto-trigger Jenkins and run prod deployments.
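A minimal scripted Jenkinsfile along those lines; the image coordinates, kubectl contexts and manifest paths are placeholders:

    node {
        stage('Build & test') {
            sh 'mvn -B clean verify'   // unit tests run as part of the build
        }
        stage('Image') {
            // Hypothetical registry/image coordinates.
            sh "docker build -t registry.example.com/myapp:${env.BUILD_NUMBER} ."
            sh "docker push registry.example.com/myapp:${env.BUILD_NUMBER}"
        }
        stage('Deploy to STAGING') {
            sh 'kubectl --context staging-cluster apply -f k8s/'
        }
        stage('Approval') {
            // Pauses the pipeline until someone signs off (pipeline-input-step).
            input message: 'Deploy the same image to production?'
        }
        stage('Deploy to PRODUCTION') {
            sh 'kubectl --context prod-cluster apply -f k8s/'
        }
    }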
EDIT (response to comment):
You are making REST calls to the k8s API; even kubectl apply -f foo.yaml makes such a REST call. It doesn't matter where you make this call from, as long as your kubectl is configured correctly and can communicate with the k8s server. You can have multiple clusters configured for kubectl and use kubectl --context <staging-cluster> apply -f foo.yaml. You can pick the context name from a Jenkins env variable or some other mechanism.
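For instance, the target context could come from a job parameter (the parameter name here is illustrative):

    // KUBE_CONTEXT would be a choice/string parameter on the Jenkins job.
    def ctx = params.KUBE_CONTEXT ?: 'staging-cluster'
    sh "kubectl --context ${ctx} apply -f foo.yaml"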
We're working on an open source project called Jenkins X, a proposed subproject of the Jenkins Foundation, aimed at automating CI/CD on Kubernetes using Jenkins and GitOps for promotion.
When you merge a change to the master branch, Jenkins X creates a new semantically versioned distribution of your app (pom.xml, jar, docker image, helm chart). The pipeline then automates the generation of Pull Requests to promote your application through all of the Environments via GitOps.
Here's a demo of how to automate CI/CD with multiple environments on Kubernetes using GitOps for promotion between environments and Preview Environments on Pull Requests - using Spring Boot and nodejs apps (but we support many languages + frameworks).

Jenkins Pipeline using Openshift

I am trying to create a Jenkins pipeline on OpenShift that automatically runs a Jenkins service when we start the pipeline build. I referred to a few templates online and created a Jenkins pod and a pipeline, but whenever I try to run the pipeline, it shows the build status as "not started".
Later, I created a standalone Jenkins service in OpenShift, created a Jenkinsfile, and tried executing it. I encountered authentication issues while connecting to OpenShift from Jenkins.
Can anyone guide me if I am missing something, or point me to other working templates for a pipeline?
Thanks
It's because of permissions.
Jenkins runs as its own Jenkins user, and OpenShift doesn't know how to authenticate it.
Create a new service account for Jenkins in OpenShift.
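A rough sketch of that, assuming Jenkins needs edit rights in the project it deploys to (the project and account names are examples):

    # Create a service account for Jenkins in its project.
    oc create serviceaccount jenkins -n jenkins-project

    # Grant it edit rights in the project it deploys to.
    oc policy add-role-to-user edit system:serviceaccount:jenkins-project:jenkins -n target-project

    # Fetch the account's token and configure it as a credential in Jenkins.
    # (On newer clusters this is `oc create token jenkins`.)
    oc serviceaccounts get-token jenkins -n jenkins-project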

Continuous Deployment using Jenkins and Docker

We are building a Java-based high-availability service for a financial application. I am part of the team managing continuous integration using Jenkins.
Lately we introduced continuous deployment as well, and we opted for Docker containers.
Here is the infrastructure:
The production cluster will have 3 RHEL machines running the following docker containers on each of them:
3 instances of Wildfly
Cassandra
Nginx
The application IDE is NetBeans and the source code is in Git.
Currently we are doing manual deployment on this infrastructure.
Please suggest some tools which I can use with Jenkins to complete the continuous deployment process.
You might want Jenkins to trigger on each push to your repository. There are plugins that help you do that with a webhook; the Gitlab-plugin is one solution, and similar solutions exist for GitHub and other Git hosts.
Instead of relying heavily on bash and Jenkins configuration, you might want to set up a Jenkins pipeline with the Jenkins Pipeline plugin or even the Pipeline: Multibranch plugin. With those you can automate your build in Groovy code (a Jenkinsfile) kept in the repository, with the possibility to add functionality with other plugins building on them.
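A minimal declarative Jenkinsfile, just to show the shape; the stage contents are placeholders:

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    sh 'mvn -B clean package'   // placeholder build command
                }
            }
            stage('Deploy') {
                steps {
                    sh './deploy.sh'            // hypothetical deploy script
                }
            }
        }
    }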
You can then use the Docker Pipeline plugin to easily build Docker containers, push Docker images and run code inside Docker containers.
I would suggest building your services inside Docker so that your Jenkins machine does not need all the different dependencies installed (and therefore possibly conflicting versions). Use Docker containers with all the dependencies and run your build code in there with the Docker Pipeline plugin from Groovy.
Install a registry solution to push your Docker images to and pull them from.
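With the Docker Pipeline plugin that might look roughly like this; the image names, registry URL and credentials ID are made up:

    node {
        // Run the Maven build inside a container so the agent itself
        // only needs Docker installed, not Maven or a JDK.
        docker.image('maven:3-jdk-8').inside {
            sh 'mvn -B clean package'
        }
        // Build the application image and push it to your registry.
        def img = docker.build("myorg/myservice:${env.BUILD_NUMBER}")
        docker.withRegistry('https://registry.example.com', 'registry-creds') {
            img.push()
        }
    }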
Use the Pipeline: Shared Groovy Libraries plugin to extract common code from your Jenkinsfiles so that it can be reused. Those library files should have their own repository, which your Jenkins knows about and keeps up to date. Possibly you can even have an entire pipeline process shared between multiple projects, which simply add parameters in their Jenkinsfile.
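For example, a custom step kept in the library repository; the names here are illustrative:

    // vars/deployService.groovy in the shared-library repository.
    // A Jenkinsfile loads the library with `@Library('my-shared-lib') _`
    // and then calls: deployService(name: 'myservice', env: 'staging')
    def call(Map cfg) {
        sh "kubectl --context ${cfg.env}-cluster apply -f k8s/${cfg.name}/"
    }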
A lot of text and only rough sketches; if you think something is interesting and you want to see more code, just ask. I am currently setting all this up.
