CI for multi-repository project - travis-ci

My current project consists of three repositories. There is a Java (Spring Boot) application and two Angular web clients.
At the moment I am running a deploy.sh script which clones each repository and then deploys the whole thing:
# Clone all projects
git clone ..
git clone ..
git clone ..
# Build (there is a pom.xml which depends on the cloned projects)
mvn clean package
# Deploy
heroku deploy:jar server/target/server-*.jar --app $HEROKU_APP -v
Not very nice, I know.
So, I'd like to switch to a CI pipeline, and I think Travis CI or GitLab CI might be good choices.
My problem is: at this point I don't know how (or if) I can build the whole thing when there is an update on any of the master branches.
Maybe it is possible to configure the pipeline so that it simply tracks each repository, or maybe this can be accomplished using git submodules.
How can I approach this?

If you need all of the projects to be built and deployed together, you have a big old monolith. In this case, I advise you to use a single repository for all projects and have a single pipeline. This way you wouldn't need to clone anything.
However, if the java app and the angular clients are microservices that can be built and deployed independently, place them in separate repositories and create a pipeline for each one of them. Try not to couple the release process (pipelines) of the different services because you will regret it later.
Each service should be built, tested and deployed separately.
If you decide to have a multi-repo monolith (please don't), you can look into GitLab CI multi-project pipelines (a config sketch follows the example workflow below).
Example workflow:
Repo 1 (Java), Repo 2 (Angular 1), Repo 3 (Angular 2)
Repo 1:
On push to master, clones Repo 2 and Repo 3, builds, tests, deploys.
Repo 2:
On push to master, triggers the Repo 1 pipeline.
Repo 3:
On push to master, triggers the Repo 1 pipeline.
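In GitLab CI, the upstream triggers can be expressed with the trigger keyword for multi-project pipelines. A minimal sketch of Repo 2's .gitlab-ci.yml (the project path and test script are made-up placeholders):

# .gitlab-ci.yml in Repo 2 (Repo 3 looks the same)
stages:
  - test
  - downstream

test:
  stage: test
  script:
    - npm ci && npm test

trigger_server_pipeline:
  stage: downstream
  only:
    - master
  trigger:
    project: mygroup/repo1-server   # hypothetical project path
    branch: master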

Related

How to track deployment and commits of multiple repositories in a single Bitbucket pipeline?

We host a project's source code on Bitbucket, in multiple repositories: one for the backend, one for the frontend, and one for server configuration and deployment.
The deployment is done with a Bitbucket custom pipeline hosted in the latter repository (where "custom" means triggered manually or by a scheduler, not by pushing to a branch). In the pipeline, we clone the other repositories (using an SSH key for authentication), build Docker images, push them to a Docker repository, and then trigger the deployment on the server.
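Roughly, the pipeline configuration has this shape (a simplified sketch; the repository names, image names, and trigger script are made up):

# bitbucket-pipelines.yml in the config repo
pipelines:
  custom:
    deploy:                # started manually or by a scheduler
      - step:
          name: Build and deploy all components
          services:
            - docker
          script:
            - git clone git@bitbucket.org:acme/backend.git
            - git clone git@bitbucket.org:acme/frontend.git
            - docker build -t registry.example.com/acme/backend backend
            - docker push registry.example.com/acme/backend
            - docker build -t registry.example.com/acme/frontend frontend
            - docker push registry.example.com/acme/frontend
            - ./trigger-deployment.sh   # hypothetical deploy trigger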
This is all working well, except for how it's tracked in Bitbucket and Jira. In Bitbucket, in the pipelines overview, it shows the latest commit that was deployed by a pipeline run. However, since the pipeline is in the config repository, this will only show commits of the config repository. Since the config rarely changes, most of our commits are in the backend and frontend repositories, so this "latest commit" rarely represents the latest change that was deployed.
Similarly, and more annoyingly, when connecting Jira with Bitbucket, Jira only associates commits in the config repository with a deployment. All the interesting work done in the backend and frontend repositories isn't seen.
Is there a way to tell Bitbucket that multiple repositories are involved in a pipeline deploy? I believe this is currently not possible, so this would have to be a feature request for Atlassian.
Does anybody know of a workaround? I was thinking, maybe having the backend and frontend repos as git submodules of the config repo might work? Git submodules scare me, so I don't want to try only to find out that Bitbucket/Jira would not see the commits/issues in the submodules anyway.
Another workaround could be to push a dummy commit with a commit message that summarizes all commits done in all repos. That commit would have to be already pushed to the config repo when the pipeline is started, so that would maybe have to be done in a separate pipeline: the first pipeline pushes the summary commit and then triggers the second pipeline for the actual deployment.
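That first pipeline could be as simple as the following sketch (the repo URLs are hypothetical):

# Record the other repos' current master heads in an empty summary commit
backend_head=$(git ls-remote git@bitbucket.org:acme/backend.git refs/heads/master | cut -f1)
frontend_head=$(git ls-remote git@bitbucket.org:acme/frontend.git refs/heads/master | cut -f1)
git commit --allow-empty -m "Deploy: backend @ ${backend_head}, frontend @ ${frontend_head}"
git push origin master
# ...then trigger the second pipeline for the actual deployment.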
Put everything, all software components plus configuration and infrastructure, together in a monorepo.
To merge such historically independent repositories, it is worth using the --allow-unrelated-histories option of git merge so as not to lose each repository's git history.
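For example, importing one of the old repositories into the monorepo while keeping its history might look like this (the remote name and URL are placeholders):

# In the new monorepo: import the backend repo with its full history
git remote add backend git@bitbucket.org:acme/backend.git
git fetch backend
git merge --allow-unrelated-histories backend/master
# Repeat for the other repos; afterwards, move each repo's
# files into its own subdirectory with git mv.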
Otherwise, yes, use git submodules in a parent repo and track submodule ref updates as meaningful commits. If that scares you, you should really not be splitting your code across multiple repos.
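The submodule variant, for completeness (URLs again hypothetical):

# In the parent (config) repo
git submodule add git@bitbucket.org:acme/backend.git backend
git submodule add git@bitbucket.org:acme/frontend.git frontend
git commit -m "Track backend and frontend as submodules"
# Later, record new upstream commits as a meaningful commit:
git submodule update --remote
git add backend frontend
git commit -m "Bump backend and frontend to latest master"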

Build multiple projects

I'm exploring the Jenkins world to see if it can fit my needs for this case.
I need to build two git repositories (backend and frontend). For the backend, I would need to:
Choose the branch to build from a list
Check out the branch and build a Docker image using the Dockerfile
Push the image to ECR
Release it to a specific Kubernetes deployment
After the backend build, we have to build the frontend by doing:
Choose the branch to build from a list
Check out the branch and run the npm build script
Deploy to an S3 folder
The build should be triggered only manually, by the project owner (who is not a developer).
Is Jenkins the right way to go? And if yes, could you point me to how you would do it?
Thanks
Yes, you can definitely implement what you need with Jenkins. There are different ways to implement each step, but here are some things you can consider using.
For branch listing, you can consider a plugin like the List Git Branches Plugin.
For Docker image building and pushing, you can use Jenkins Docker steps.
For the Kubernetes steps, you can probably use a shell script, or something like the Kubernetes CLI plugin.
For the S3 upload, you can use the S3 Publisher Plugin.
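Putting those pieces together, a declarative Jenkinsfile for this flow might look roughly like the sketch below. The registry, cluster, and bucket names are invented, and plain shell commands stand in for whichever plugin steps you end up choosing; the project owner would start it via "Build with Parameters":

pipeline {
    agent any
    parameters {
        // The branch-listing plugin can populate these dynamically;
        // plain string parameters keep the sketch simple.
        string(name: 'BACKEND_BRANCH',  defaultValue: 'master', description: 'Backend branch to build')
        string(name: 'FRONTEND_BRANCH', defaultValue: 'master', description: 'Frontend branch to build')
    }
    stages {
        stage('Backend') {
            steps {
                dir('backend') {
                    git branch: params.BACKEND_BRANCH, url: 'git@example.com:acme/backend.git'
                    sh 'docker build -t 123456789.dkr.ecr.eu-west-1.amazonaws.com/backend:$BUILD_NUMBER .'
                    sh 'docker push 123456789.dkr.ecr.eu-west-1.amazonaws.com/backend:$BUILD_NUMBER'
                    sh 'kubectl set image deployment/backend backend=123456789.dkr.ecr.eu-west-1.amazonaws.com/backend:$BUILD_NUMBER'
                }
            }
        }
        stage('Frontend') {
            steps {
                dir('frontend') {
                    git branch: params.FRONTEND_BRANCH, url: 'git@example.com:acme/frontend.git'
                    sh 'npm ci && npm run build'
                    sh 'aws s3 sync dist/ s3://acme-frontend-bucket/app/'
                }
            }
        }
    }
}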

Jenkins how to deploy same code to different servers (that I can specify)?

At the software company where I work, we use Jenkins to deploy to different servers. Every branch in the git repository deploys to a specific server, based on the branch name and the specifications in the Jenkinsfile. But we are in the process of unifying these branches into just one, master. How can we configure Jenkins to take the same code and deploy it to the servers we are interested in, without changing the code? I think we should separate the code from the deployment, but the pipeline still has to exist in some way.
Two solutions come to mind:
I believe you may be using SCM polling to get the builds started. With git diff you can check what was changed and, based on that, start the specific deployment.
If you are running the builds manually, you can parameterize the build and specify which server you want to deploy to.
From experience, you might want to set up the pipeline so that a commit to the repository only triggers testing and building (with a deploy at most to a test environment), while the proper production deploy is done only manually (and can be parameterized).
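A sketch of that parameterized setup (the server names and the deploy script are placeholders):

pipeline {
    agent any
    parameters {
        choice(name: 'TARGET', choices: ['none', 'staging', 'prod-eu', 'prod-us'],
               description: 'Server to deploy to (none = build and test only)')
    }
    stages {
        stage('Build and test') {
            steps {
                sh 'mvn clean verify'
            }
        }
        stage('Deploy') {
            when { expression { params.TARGET != 'none' } }
            steps {
                sh "./deploy.sh ${params.TARGET}"
            }
        }
    }
}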

Usage of SVN private repository with Jenkins instead of GIT

I keep all my code in an SVN repository on an on-premises server, and I am trying to implement a CI/CD pipeline for deploying my application, using Kubernetes and Jenkins. All the implementation examples of CI/CD pipelines with Jenkins and Kubernetes that I find use a Git repository, with code commits managed via webhooks.
My confusion is that I am using an SVN code repository. How can I use it with a Jenkins pipeline job? Do I need to install an additional plugin for SVN? My requirement is that when I commit to the SVN repository, Jenkins pulls the code, builds the project, and deploys it to a test environment.
Hooks to trigger Jenkins from SVN are also possible, or you can poll the repository for changes; the Jenkins Subversion plugin supports both methods (https://wiki.jenkins.io/display/JENKINS/Subversion+Plugin). The examples you are looking at will have a step that does a build from the source code of a particular repo. You should be fine to swap Git for SVN and still follow the examples, as where and how the source is hosted is not normally related to how you use Jenkins to build and deploy it.
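A minimal pipeline sketch using the Subversion plugin's checkout step together with SCM polling (the URL, credentials ID, and deploy commands are assumptions):

pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // check SVN for changes roughly every five minutes
    }
    stages {
        stage('Checkout') {
            steps {
                checkout([$class: 'SubversionSCM',
                          locations: [[remote: 'https://svn.example.com/myapp/trunk',
                                       local: '.',
                                       credentialsId: 'svn-credentials']]])
            }
        }
        stage('Build and deploy to test') {
            steps {
                sh 'mvn clean package'
                sh 'kubectl apply -f k8s/test/'   // hypothetical manifests
            }
        }
    }
}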

Poll SCM multiple repositories on Jenkins

I have around 10 repositories that I would like to poll. If a folder is added to the root folder of any repo, I'd like to trigger a certain build (the same one in every case).
I thought of using the Poll SCM plugin, but it requires one job per repo, which doesn't scale.
Is there any clean way to do this and any plugin that would help?
EDIT: I have a job generating Debian packages from folders that are in my 10 repositories (each folder corresponds to a separate package). When a new folder is added, it means a new package has been introduced.
I would then like to trigger a packaging build so developers can fetch the package from our apt repository without waiting for the nightly build.
You can use this plugin:
https://wiki.jenkins.io/display/JENKINS/Pipeline+Multibranch+Plugin
As per the manual:
Enhances Pipeline plugin to handle branches better by automatically grouping builds from different branches. Automatically creates a new Jenkins job whenever a new branch is pushed to a source code repository. Other plugins can define various branch types, e.g. a Git branch, a Subversion branch, a GitHub Pull Request etc.
See this blog post for more info: https://jenkins.io/blog/2015/12/03/pipeline-as-code-with-multibranch-workflows-in-jenkins/
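Another angle worth sketching: a single pipeline job can watch several repositories at once, because the pollSCM trigger polls every SCM that was checked out during the previous run. The URLs and build script below are hypothetical:

pipeline {
    agent any
    triggers {
        pollSCM('H/10 * * * *')   // polls every repo checked out below
    }
    stages {
        stage('Checkout all repos') {
            steps {
                dir('repo1') { git url: 'git@example.com:acme/repo1.git' }
                dir('repo2') { git url: 'git@example.com:acme/repo2.git' }
                // ...repeat for the remaining repositories...
            }
        }
        stage('Package') {
            steps {
                sh './build-debian-packages.sh'   // hypothetical packaging script
            }
        }
    }
}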
