CI/CD in GitLab through a mirror repository - devops

I want to set up a CI/CD pipeline for a repository via its mirror repository. I have found articles on how to mirror a repo, but I cannot find how to set up CI/CD there and view the results in the original repo. Can anyone suggest how to do this?

how to set up CI/CD
It's the same approach as without mirroring - you just add a .gitlab-ci.yml file to your original Git repository; it gets mirrored to GitLab, which then runs pipelines for you.
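As a minimal sketch of what that file could look like (this assumes a Python project with pytest; the stage and job names are only examples):

```yaml
# .gitlab-ci.yml committed to the original repository; once mirrored to
# GitLab, GitLab runs this pipeline automatically on each push.
stages:
  - test

unit-tests:
  stage: test
  image: python:3.11
  script:
    - pip install -r requirements.txt
    - pytest
```

Any valid pipeline definition works here; mirroring changes nothing about the file's syntax, only where the pipeline is executed.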
how to ... view the results in original repo
You didn't specify where you host your original repo.
If it's GitHub, then you will see GitLab pipeline statuses on your GitHub pull requests.
It's the same as with any other CI tool integrated with GitHub.
Links:
https://docs.gitlab.com/ee/ci/yaml/
https://docs.gitlab.com/ee/ci/ci_cd_for_external_repos/
https://docs.gitlab.com/ee/ci/ci_cd_for_external_repos/github_integration.html

Related

How to check whether a GitHub repository uses Continuous Integration, and which CI platform (e.g. Jenkins, Travis CI)?

Is there a simple way to check whether a repository uses continuous integration, and which CI platform (e.g. Jenkins, Travis CI)?
Example: OpenCV. See https://github.com/opencv/opencv. By skimming through the repo, I have no idea whether CI is used (although I suppose so), or what kind of CI it uses.
Most CI platforms use a configuration file or directory placed at the root of the repository. That is probably the quickest way to identify the tool used by each repo. Here are a few examples:
.travis.yml for Travis
.gitlab-ci.yml for Gitlab
.drone.yml for Drone CI
.circleci/ for CircleCI
.github/workflows/ for GitHub Actions
The Jenkins situation is slightly more complex, since users may or may not have a Jenkinsfile directly in their repository.
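The file-at-the-root heuristic above can be sketched as a small shell function run from inside a clone (the function name and the set of paths checked are just this example's choices, not an exhaustive list):

```shell
# Sketch: guess which CI platform(s) a repository uses by checking for
# each tool's well-known config file or directory at the repo root.
detect_ci() {
  dir="${1:-.}"
  found=""
  [ -f "$dir/.travis.yml" ]       && found="$found Travis"
  [ -f "$dir/.gitlab-ci.yml" ]    && found="$found GitLab"
  [ -f "$dir/.drone.yml" ]        && found="$found Drone"
  [ -d "$dir/.circleci" ]         && found="$found CircleCI"
  [ -d "$dir/.github/workflows" ] && found="$found GitHub-Actions"
  [ -f "$dir/Jenkinsfile" ]       && found="$found Jenkins"
  found="${found# }"              # trim the leading space
  echo "${found:-none}"
}

# Example: a throwaway "clone" containing only a GitLab CI file
demo=$(mktemp -d)
touch "$demo/.gitlab-ci.yml"
detect_ci "$demo"   # prints: GitLab
```

Note the Jenkins caveat from above: a missing Jenkinsfile does not prove Jenkins is unused, since the job definition may live on the Jenkins server instead.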

How to checkout Artifactory repo in jenkins workspace

I am new to Artifactory and just wanted to know if there is a way to check out an Artifactory repo similar to a Git repository, so that the content of the Artifactory repo is visible in the Jenkins workspace.
For a Git repo we get options like "checkout as a subfolder"; I want to know if we can do something similar for Artifactory in Jenkins.
Any suggestions?
Git is a VCS and Artifactory is a repository manager, so I doubt this can be done as a "checkout". Artifactory in Jenkins is used for retrieving/downloading packages, resolving dependencies, and deploying builds to Artifactory. If my understanding is incorrect, kindly elaborate on the use case. Refer to the wikis below for a better understanding:
https://www.jfrog.com/confluence/display/JFROG/Jenkins+Artifactory+Plug-in
https://www.jfrog.com/confluence/display/JFROG/Declarative+Pipeline+Syntax
https://www.jfrog.com/confluence/display/JFROG/Scripted+Pipeline+Syntax
https://github.com/jfrog/project-examples/tree/master/jenkins-examples/pipeline-examples/declarative-examples
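What you can do instead of a checkout is download the repo's contents into the workspace. A hypothetical declarative pipeline using the Artifactory plugin's download step might look like this (the server ID, repo name, and paths are placeholders you would replace with your own):

```groovy
// Sketch: pull files from an Artifactory repo into the Jenkins workspace
// using the Artifactory plugin's rtDownload step and a file spec.
pipeline {
  agent any
  stages {
    stage('Fetch from Artifactory') {
      steps {
        rtDownload(
          serverId: 'my-artifactory',   // configured under Manage Jenkins
          spec: '''{
            "files": [{
              "pattern": "my-generic-repo/some/path/*",
              "target": "artifactory-content/"
            }]
          }'''
        )
      }
    }
  }
}
```

After this stage runs, the matched files appear under artifactory-content/ in the workspace, which is the closest practical equivalent to "checking out" an Artifactory repo.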

Cloud build CI/CD & k8s files

I am using Cloud Build and a GKE Kubernetes cluster, and I have set up CI/CD from GitHub to Cloud Build.
I want to know: is it better to add the CI build file and Dockerfile to the repository, or to manage the config files separately in another repository?
Is it good to keep the CI and k8s config files together with the business-logic repository?
What is the best way to implement CI/CD from Cloud Build to GKE while managing the CI/k8s YAML files?
Yes, you can add deployment directives, typically in a dedicated folder of your project, which can in turn use a CI/CD repository.
See "kelseyhightower/pipeline-application" as an example, where:
Changes pushed to any branch except master should trigger the following actions:
build a container image tagged with the build ID suitable for deploying to a staging cluster
clone the pipeline-infrastructure-staging repo
patch the pipeline deployment configuration file with the staging container image and commit the changes to the pipeline-infrastructure-staging repo
The pipeline-infrastructure-staging repo will deploy any updates committed to the master branch.
Please keep in mind that:
The best place to store the Dockerfile and the build config file is the source repository itself. For Dockerfiles this is supported natively by Cloud Build ("Native Docker support" in gcloud).
You can use different host repositories like:
Cloud Source Repository
Bitbucket
GitHub
For an example structure of the build config YAML file, see the Cloud Build documentation on build concepts and tutorials.
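To make the "CI config lives next to the code" layout concrete, here is a minimal cloudbuild.yaml sketch kept at the repo root; the image name, deployment name, zone, and cluster are placeholders for this example:

```yaml
# cloudbuild.yaml: build the image, push it, then update the GKE deployment.
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA', '.']
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA']
- name: 'gcr.io/cloud-builders/kubectl'
  args: ['set', 'image', 'deployment/my-app',
         'my-app=gcr.io/$PROJECT_ID/my-app:$SHORT_SHA']
  env:
  - 'CLOUDSDK_COMPUTE_ZONE=us-central1-b'
  - 'CLOUDSDK_CONTAINER_CLUSTER=my-cluster'
```

With this layout the business-logic repo stays self-describing, while the environment-specific Kubernetes manifests can still live in a separate infrastructure repo, as in the kelseyhightower example above.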

Usage of SVN private repository with Jenkins instead of GIT

I keep all my code in an SVN repository on my on-premise server, and I am trying to implement a CI/CD pipeline for deploying my application using Kubernetes and Jenkins. When exploring example implementations of CI/CD pipelines with Jenkins and Kubernetes, I only see examples with Git repositories, where code commits are handled via webhooks.
My confusion is that I am using an SVN code repository. How can I use my SVN repository with a Jenkins pipeline job? Do I need to install an additional plugin for SVN? My requirement is that when I commit to my SVN repository, Jenkins should pull the code, build the project, and deploy it to the test environment.
Hooks to trigger Jenkins from SVN are also possible, or you can poll the repository for changes - the Jenkins SVN plugin supports both methods (https://wiki.jenkins.io/display/JENKINS/Subversion+Plugin). The examples you are looking at will have a step that builds from the source code of a particular repo. You should be fine to swap Git for SVN and still follow the examples, as where and how the source is hosted is not normally related to how you use Jenkins to build and deploy it.
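A hypothetical declarative Jenkinsfile using the Subversion plugin with polling might look like this (the repo URL, credentials ID, and build script are placeholders for your setup):

```groovy
// Sketch: poll an SVN repo and build on change, using the Subversion plugin.
pipeline {
  agent any
  triggers { pollSCM('H/5 * * * *') }   // poll roughly every 5 minutes
  stages {
    stage('Checkout') {
      steps {
        checkout([$class: 'SubversionSCM',
                  locations: [[remote: 'https://svn.example.com/myapp/trunk',
                               credentialsId: 'svn-creds',
                               local: '.']]])
      }
    }
    stage('Build') {
      steps { sh './build.sh' }
    }
  }
}
```

Everything downstream of the checkout stage (building the image, deploying to Kubernetes) is identical to the Git-based examples you have found.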

Docker: updating image and registry

What is the right workflow for updating and storing images?
For example:
I download source code from GitHub (project with Docker files, docker-compose.yml)
I run "docker build"
And I push new image to Docker Hub (or AWS ECR)
I make some changes in source code
Push changes to GitHub
What should I do now to update the registry (Docker Hub)?
A) Should I run "docker build" again and push the new image (with a new tag) to the registry?
B) Or should I somehow commit changes to the existing image and update the existing image on Docker Hub?
This depends on what you will use your Docker image for and what releasing policy you adopt.
My recommendation is to keep the tags on Docker Hub in sync with the releases/tags you have on GitHub, and to automate as much of this as you can with a continuous integration tool like Jenkins and GitHub webhooks.
Then your flow becomes :
You make your code modifications and integrate them into GitHub, ideally using a pull-request scheme. This means your code gets merged into your master branch.
Your Jenkins is configured so that when master changes, it builds from your Dockerfile and pushes the image to Docker Hub. This overwrites the "latest" tag and makes sure the latest tag on Docker Hub is always in sync with your master branch on GitHub.
If you need to keep additional tags, typically because of different branches or releases of your software, you do the same as above, with each tag hooked up through Jenkins and GitHub webhooks to a non-master branch. For this, take a look at how the official libraries are organized on GitHub (for example the Postgres or MySQL images).
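So the answer to the original question is option A: rebuild and push under a fresh tag. The per-change flow can be sketched as a small shell function; it prints the docker commands it would run (a dry run, so it works without a Docker daemon - drop the leading "echo" on each line to execute them), and the image name and tag are placeholders:

```shell
# Sketch of option A: each source change gets a fresh build pushed under a
# new tag (e.g. the short commit SHA) plus a moving "latest" tag.
release() {
  image="$1"; tag="$2"
  echo docker build -t "$image:$tag" .
  echo docker tag "$image:$tag" "$image:latest"
  echo docker push "$image:$tag"
  echo docker push "$image:latest"
}

release myuser/myapp abc1234
```

Images on a registry are immutable snapshots: you never "commit into" an existing image (option B); you build a new one and move the tags.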