How to track deployment and commits of multiple repositories in a single Bitbucket pipeline?

We host a project's source code on Bitbucket, in multiple repositories: one for the backend, one for the frontend, and one for server configuration and deployment.
The deployment is done with a Bitbucket custom pipeline hosted in the latter repository (where "custom" means triggered manually or by a scheduler, not by pushing to a branch). In the pipeline, we clone the other repositories (using an SSH key for authentication), build Docker images, push them to a Docker registry, and then trigger the deployment on the server.
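For illustration, here is a rough sketch of what the custom pipeline's script runs; the repository names, registry, and deploy command are placeholders, not our actual setup:
# Inside the custom pipeline step (sketch; all names are hypothetical)
# Clone the sibling repositories using the pipeline's SSH key
git clone git@bitbucket.org:ourteam/backend.git
git clone git@bitbucket.org:ourteam/frontend.git
# Build and push the Docker images
docker build -t registry.example.com/backend:latest backend
docker build -t registry.example.com/frontend:latest frontend
docker push registry.example.com/backend:latest
docker push registry.example.com/frontend:latest
# Kick off the deployment on the server (placeholder command)
ssh deploy@server.example.com './deploy.sh latest'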
This is all working well, except for how it's tracked in Bitbucket and Jira. In Bitbucket, in the pipelines overview, it shows the latest commit that was deployed by a pipeline run. However, since the pipeline is in the config repository, this will only show commits of the config repository. Since the config rarely changes, most of our commits are in the backend and frontend repositories, so this "latest commit" rarely represents the latest change that was deployed.
Similarly, and more annoyingly, when connecting Jira with Bitbucket, Jira only associates commits in the config repository with a deployment. All the interesting work done in the backend and frontend repositories isn't seen.
Is there a way to tell Bitbucket that multiple repositories are involved in a pipeline deploy? I believe this is currently not possible, so this would have to be a feature request for Atlassian.
Does anybody know of a workaround? I was thinking, maybe having the backend and frontend repos as git submodules of the config repo might work? Git submodules scare me, so I don't want to try only to find out that Bitbucket/Jira would not see the commits/issues in the submodules anyway.
Another workaround could be to push a dummy commit with a commit message that summarizes all commits done in all repos. That commit would have to be already pushed to the config repo when the pipeline is started, so that would maybe have to be done in a separate pipeline: the first pipeline pushes the summary commit and then triggers the second pipeline for the actual deployment.
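To sketch that second workaround concretely (the last-deploy tag, the sibling checkout paths, and the branch are hypothetical):
# Sketch: build a summary message from the commits deployed since the last deploy
backend_log=$(git -C ../backend log --oneline last-deploy..HEAD)
frontend_log=$(git -C ../frontend log --oneline last-deploy..HEAD)
# Push an empty marker commit carrying the summary, then start the deploy pipeline
git commit --allow-empty -m "Deploy summary" -m "backend: $backend_log" -m "frontend: $frontend_log"
git push origin master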

Put everything (all software components, plus configuration and infrastructure) together in a single monorepo.
To merge such historically independent repositories into one without losing each repo's history, it is worth using the --allow-unrelated-histories option of git merge.
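For example, importing an existing backend repo into the monorepo with its full history could look roughly like this (remote names and URLs are placeholders):
# Inside the new monorepo: import the backend repo, history included
git remote add backend git@bitbucket.org:ourteam/backend.git
git fetch backend
git merge --allow-unrelated-histories backend/master
# Repeat for the other repos, then move each import into its own subdirectory with git mv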
Otherwise, yes, use git submodules in a parent repo and track submodule ref updates as meaningful commits. If that scares you, you should really not be splitting your code across multiple repos.
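A minimal sketch of that submodule setup (URLs are placeholders); each bump of a pinned ref is itself a commit the parent repo can track:
# In the config (parent) repo: pin the components as submodules
git submodule add git@bitbucket.org:ourteam/backend.git backend
git submodule add git@bitbucket.org:ourteam/frontend.git frontend
git commit -m "Add backend and frontend as submodules"
# Later: bump the pinned refs to the latest remote commits and record it
git submodule update --remote backend frontend
git add backend frontend
git commit -m "Bump backend and frontend"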

Related

CI for multi-repository project

My current project consists of three repositories. There is a Java (Spring Boot) application and two Angular web clients.
At the moment I am running a deploy.sh script which clones each repository and then deploys the whole thing.
# Clone all projects
git clone ..
git clone ..
git clone ..
# Build (there is a pom.xml which depends on the cloned projects)
mvn clean package
# Deploy
heroku deploy:jar server/target/server-*.jar --app $HEROKU_APP -v
Not very nice, I know.
So, I'd like to switch to a CI pipeline, and I think Travis CI or GitLab CI might be good choices.
My problem is: at this point I don't know how (or if) I can build the whole thing when there is an update on any of the master branches.
Maybe it is possible to configure the pipeline in such a way that it simply tracks each repository or maybe it's possible to accomplish this using git submodules.
How can I approach this?
If you need all of the projects to be built and deployed together, you have a big old monolith. In this case, I advise you to use a single repository for all projects and have a single pipeline. This way you wouldn't need to clone anything.
However, if the java app and the angular clients are microservices that can be built and deployed independently, place them in separate repositories and create a pipeline for each one of them. Try not to couple the release process (pipelines) of the different services because you will regret it later.
Each service should be built, tested and deployed separately.
If you decide to have a multi-repo monolith (please don't), you can look into GitLab CI multi-project pipelines; a trigger sketch follows the example workflow below.
Example workflow:
Repo 1 (Java), Repo 2 (Angular 1), Repo 3 (Angular 2)
Repo 1:
On push to master, clones Repo 2 and Repo 3, builds, tests, deploys.
Repo 2:
On push to master, triggers the Repo 1 pipeline.
Repo 3:
On push to master, triggers the Repo 1 pipeline.
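In its simplest form, Repo 2 and Repo 3 can kick off Repo 1's pipeline through GitLab's pipeline trigger API; a sketch, assuming a trigger token created in Repo 1's CI/CD settings (host, token variable, and project ID are placeholders):
# Last job in Repo 2's / Repo 3's pipeline: trigger Repo 1's pipeline
curl --request POST \
  --form "token=$REPO1_TRIGGER_TOKEN" \
  --form "ref=master" \
  "https://gitlab.example.com/api/v4/projects/<repo-1-project-id>/trigger/pipeline"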

Usage of SVN private repository with Jenkins instead of GIT

I keep all my code in an SVN repository on my on-premises server, and I am trying to implement a CI/CD pipeline for deploying my application, using Kubernetes and Jenkins. When I explore implementation examples of CI/CD pipelines with Jenkins and Kubernetes, I only see examples with Git repositories, managing code commits via webhooks.
My confusion is this: I am using an SVN code repository, so how can I use it with a Jenkins pipeline job? Do I need to install an additional plugin for SVN? My requirement is that when I commit to my SVN repository, Jenkins pulls the code, builds the project, and deploys it to the test environment.
Hooks to trigger Jenkins from SVN are also possible. Or you can poll the repository for changes - the Jenkins SVN plugin supports both methods (https://wiki.jenkins.io/display/JENKINS/Subversion+Plugin). The examples you are looking at will have a step that does a build from the source code of a particular repo. You should be fine to swap git for SVN and still follow the examples as where and how the source is hosted is not normally related to how to use Jenkins to build and deploy it.
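As a sketch, a post-commit hook on the SVN server that pings the Subversion plugin's notifyCommit endpoint might look like this (the Jenkins host is a placeholder):
#!/bin/sh
# SVN post-commit hook: tell Jenkins which repository and revision changed
REPOS="$1"
REV="$2"
UUID=$(svnlook uuid "$REPOS")
curl --header "Content-Type: text/plain;charset=UTF-8" \
  --data-binary "$(svnlook changed --revision "$REV" "$REPOS")" \
  "http://jenkins.example.com/subversion/${UUID}/notifyCommit?rev=$REV"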

How to trigger Jenkins builds for PR for a specific fork in Bitbucket

We have a "blessed" repository, and every team forks this repository with auto-sync enabled (blessed -> fork). In their own forked repo, the team creates their feature branches. Then they make a pull request towards the main branch on the blessed repository.
Because of the auto-sync, they cannot change the pipeline configuration in their own repo.
We would like Jenkins to limit the pull request discovery to pull requests originating from only one fork of one team.
How can we do this?
Forks are effectively separate repositories - they have different paths, different ownership, and different permissions, even though they have some shared code history.
If you're having Bitbucket send webhooks to Jenkins to trigger builds, then remove that configuration for any fork that should not have it. If you're having Jenkins poll for updates, then update Jenkins so that it only polls on the fork(s) that should build through Jenkins. And if you're moving to Pipelines, then the Pipelines YAML is distinct for each fork.

Run Jenkins build for whichever branch was checked into on Gitlab

I recently made the transition from Subversion to Git for all my repos at work. However, with svn we had commit hooks in place so that our Jenkins job would run for whichever branch was checked into. Now I'm trying to set this up using GitLab, and there appears to be only one place to add a webhook. It looks like any time something is checked into ANY branch, the webhook will run. Meaning if I have a branch_A associated with jenkins_job_A, something could be checked into branch_B and the hook for jenkins_job_A will still run. Is there a branch-by-branch way to configure these webhooks? Or is there some kind of script I can check into each branch that will act as a commit hook? Or (my fear) is this feature not supported in GitLab?
I guess you set up GitLab to do a post-commit request to http://yourserver/jenkins/git/notifyCommit?url=<URL of the Git repository>? In theory this should trigger polling on all jobs that are configured with that URL, and in the polling step each job should decide whether it should build or not. In practice this will unfortunately cause all jobs to fire.
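For reference, that notifyCommit request is a plain HTTP call that any hook can send, for example (the repository URL is a placeholder):
# Ask Jenkins' Git plugin to poll every job configured with this repository URL
curl "http://yourserver/jenkins/git/notifyCommit?url=git@gitlab.example.com:group/project.git"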
We worked around this issue by moving the Job configuration into a Jenkinsfile and then use a Multibranch Pipeline.
As an alternative you could also install the GitLab plugin for Jenkins and use the Jenkins integration in GitLab. This will allow you to trigger the correct jobs when commits are pushed on a branch. The downside is that it requires a per-job configuration.

How to build the new branch pushed to github using Jenkins CI?

I've set up Jenkins for the Rails 3 app to build the specs.
One can find many posts via google on how to setup the build trigger on the github push.
But what I want is to build the new remote branch pushed to GitHub.
e.g.
I have a repo with origin/master. I cloned the repo, created a new branch, did some commits, and pushed that branch to origin: git push -u origin new_branch
Now I want the Jenkins to build this newly pushed branch on the origin.
If the build is successful, then Jenkins should merge it into origin/master automatically.
Jenkins has the GitHub and Git plugins, but they require specifying the branch name. Instead I want to build the new_branch dynamically.
How can I setup such process?
If I remember correctly, the branch name is not a required entry. You need to test it, but I think if you do not fill it in, Jenkins builds all new commits in the repo regardless of which branch is affected.
But I recommend you do not merge automatically. You do not want that, trust me. :-)
It seems this cannot be done with only the GitHub and GitHub Parameter plugins. If you specify a wildcard like branch_regex*** in "Branches to build", Jenkins always builds the latest commit among all the branches it sees; you must specify a single branch for Jenkins to build the latest commit on that branch. I have also seen answers suggesting a Multibranch Pipeline, but I'm not sure how to deploy that way; there are no specific instructions at all.
