CDK pipelines with several application repositories - aws-cdk

I have successfully set up a pipeline for my application with the CDK Pipelines construct that was introduced this summer. The application is a microservice (UserService) that is compiled with CodeBuild and deployed as an ECS Fargate service. The source of the application is in a GitHub repository.
The project stored in the GitHub repository looks like this:
.
+-- cdk
+-- Dockerfile_OrderService
+-- Dockerfile_ProductService
+-- Dockerfile_UserService
+-- OrderService
+-- ProductService
+-- UserService
OrderService, ProductService and UserService are folders containing the source code of the microservices that need to be compiled with CodeBuild. I have only implemented UserService so far, and that works fine. When I push a change from the UserService folder, the pipeline is triggered, the source code is built with CodeBuild, and the service is deployed to ECS Fargate.
When I set up pipelines for the other two services, a push from any of the service folders triggers the CodePipeline for all three services. I don't want that: I want only the pipeline for the specific service to be triggered, not the other two, but I am not sure how to do that.
I was thinking about giving each service its own repository, but I also need the infrastructure code under cdk to be present.
Does anyone have an example of how to do this?

My team uses a monorepo with different cdk-pipelines for each product/service that we offer. It does trigger each pipeline when we push to the develop branch, but if there are no changes in 'ProductService' or 'OrderService' then there's technically no harm in letting it update all of them.
But if you do want separate triggers, you would have to use a separate branch to trigger each microservice. You can specify a branch via 'GitHubSourceActionProps.Branch'. For example, the pipeline construct for 'UserService' could look like this (C#):
var pipeline = new Amazon.CDK.Pipelines.CdkPipeline(this, "My-Cdk-Pipeline", new CdkPipelineProps
{
    SourceAction = new GitHubSourceAction(new GitHubSourceActionProps
    {
        ActionName = "GitHubSourceAction",
        Branch = "user-service",   // the branch dedicated to this microservice
        Owner = "my-github-user",  // placeholder repository owner
        Repo = "my-cool-repo",
        OauthToken = SecretValue.SecretsManager("github-token"), // GitHub token stored in Secrets Manager
        Output = sourceArtifact,   // an Artifact declared elsewhere in the stack
        Trigger = GitHubTrigger.WEBHOOK,
    }),
    // CdkPipelineProps also requires CloudAssemblyArtifact and SynthAction (omitted here)
});
See the AWS CDK API reference for GitHubSourceActionProps.

Related

Use utility scripts in Jenkins Templating Engine

I'm a newbie to the Jenkins Templating Engine and am trying to implement a CI process using JTE. Usually, when using Jenkins, I have a repo named DevOps in which I keep all the utility scripts that I use in the CI process, instead of placing them in the developers' source code repositories. So in each Jenkinsfile I check out the source code as well as the DevOps repo into the workspace and use it.
I was wondering if this is considered a best practice in general. If not, what is? And if it is, what is the best way to imitate that kind of pattern in JTE, given that my fresh new Jenkinsfile looks like:
checkout scm
build()
deploy()
The checkout scm step elegantly checks out the source code repository defined in the SCM plugin on the configuration page. Where can I embed another checkout step that will clone the DevOps repo and still keep the JTE infrastructure generic?
Coming in JTE 2.0, you'll be able to store and access library resources.
Merged PR for the feature: https://github.com/jenkinsci/templating-engine-plugin/pull/102
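Once that feature lands, a library step could carry its own utility scripts instead of a separate DevOps repo. A rough sketch, assuming the resource() accessor added in the linked PR and a hypothetical scripts/build.sh file bundled under the library's resources directory:

```groovy
// steps/build.groovy inside a JTE library (sketch, not verified against JTE 2.0)
void call() {
    // resource() is expected to return the contents of a file under the
    // library's resources/ directory as a String
    String script = resource("scripts/build.sh") // hypothetical resource path
    writeFile file: "build.sh", text: script
    sh "bash ./build.sh"
}
```

That would keep the Jenkinsfile itself generic (checkout scm; build(); deploy()) while the scripts travel with the library.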

Jenkins Pipeline building micro-services with multiple repos

I'm trying to put together a Jenkins pipeline that builds a docker application composed of multiple containers. Each service is in its own git repository.
i.e.
Service1 github.com/testproject/service1
Service2 github.com/testproject/service2
Service3 github.com/testproject/service3
I can create a Jenkinsfile that builds the individual services, but I'd like a way to build and test the application end-to-end if any single service changes (avoiding rebuilding the unchanged services).
I could maintain 3 separate Jenkinsfiles and 3 separate pipelines to achieve this, but it seems like a lot of duplication. Is there a way to have a single pipeline that will let me achieve this?
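One common way to let a single pipeline skip unchanged services is to derive the affected services from the changed file paths (e.g. the output of git diff --name-only). A minimal sketch, assuming the top-level folder of each path names the owning service (the service names here are hypothetical):

```shell
#!/bin/sh
# Map changed file paths (one per line on stdin) to the top-level
# service folder that owns them, de-duplicated.
changed_services() {
  cut -d/ -f1 | sort -u
}

# Example: changes in service1 and service2, none in service3,
# so only those two need rebuilding before the end-to-end test.
printf 'service1/src/App.java\nservice1/pom.xml\nservice2/Dockerfile\n' \
  | changed_services
```

In a Jenkinsfile, the resulting list could gate per-service build stages, so an unchanged service3 is pulled as its last-published image rather than rebuilt.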

CI for multi-repository project

My current project consists of three repositories. There is a Java (Spring Boot) application and two Angular web clients.
At the moment I am running a deploy.sh script which clones each repository and then deploys the whole thing.
# Clone all projects
git clone ..
git clone ..
git clone ..
# Build (there is a pom.xml which depends on the cloned projects)
mvn clean package
# Deploy
heroku deploy:jar server/target/server-*.jar --app $HEROKU_APP -v
Not very nice, I know.
So, I'd like to switch to a CI-pipeline and I think travis-ci or gitlab-ci might be some good choices.
My problem is: at this point I don't know how (or if) I can build the whole thing if there is an update on any of the master branches.
Maybe it is possible to configure the pipeline in such a way that it simply tracks each repository or maybe it's possible to accomplish this using git submodules.
How can I approach this?
If you need all of the projects to be built and deployed together, you have a big old monolith. In this case, I advise you to use a single repository for all projects and have a single pipeline. This way you wouldn't need to clone anything.
However, if the java app and the angular clients are microservices that can be built and deployed independently, place them in separate repositories and create a pipeline for each one of them. Try not to couple the release process (pipelines) of the different services because you will regret it later.
Each service should be built, tested and deployed separately.
If you decide to have a multi-repo monolith (please don't), you can look into
Gitlab CI Multi-project Pipelines
Example workflow:
Repo 1 (Java), Repo 2 (Angular 1), Repo 3 (Angular 2)
Repo 1:
On push to master, clones Repo 2 and Repo 3, builds, tests, deploys.
Repo 2:
On push to master, triggers the Repo 1 pipeline.
Repo 3:
On push to master, triggers the Repo 1 pipeline.
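Sketched as GitLab CI config, the downstream repos would each carry a trigger job pointing at the deploying project (the project path and branch below are placeholders):

```yaml
# .gitlab-ci.yml in Repo 2 (and similarly in Repo 3)
stages:
  - trigger

trigger_repo1:
  stage: trigger
  only:
    - master
  trigger:
    project: my-group/repo1   # placeholder path of the deploying project
    branch: master
```

Repo 1's own pipeline would then clone Repo 2 and Repo 3, build with mvn clean package, and deploy, as in the workflow above.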

Jenkins CI/CD setup for a microservices system

I have a system with dozens of microservices, all built and released the same way - each is in a docker container, and deployed in a Kubernetes cluster.
There are multiple clusters (Dev1, dev2, QA ... Prod)
We are using Jenkins to deploy each microservice. Each microservice has its own pipeline, and this pipeline is duplicated for each environment, like so:
DEV1 (view)
dev1_microserviceA (job / pipeline)
dev1_microserviceB
...
dev1_microserviceX
DEV2
dev2_microserviceA
dev2_microserviceB
...
dev2_microserviceX
...
PROD
prod_microserviceA
prod_microserviceB
...
prod_microserviceX
Each of those pipelines is almost identical; the differences are really just parameters like the environment, the name of the microservice, and the name of the git repo.
Some common code is in libraries that each pipeline uses. Is this the proper / typical setup, and the most refactored one? I'd like to avoid having to create a pipeline for each microservice and each environment, but I'm not sure what my further refactoring options are. I am new to Jenkins and devops.
I've looked into parameterized pipelines, but I do not want to have to enter a parameter each time I need to build; I also need to be able to chain builds and see the results of all builds at a glance in each environment.
I would use Declarative Pipelines where you can define your logic in a local Jenkinsfile in your repositories.
Using Jenkins, you can have a "master" Jenkinsfile and/or project that you can inherit by invoking the upstream project. This will allow you to effectively share your instructions and reduce duplication.
What is typically never covered when it comes to CI/CD is the "management" of deployments. Since most CI/CD services are stateless, they have no notion of deployed applications.
GitLab has come a long way with this but Jenkins is far behind.
At the end of the day you will have to either create a separate project for each repository/purpose, due to how Jenkins works, OR (recommended) have a "master" project that lets you pass in things like the project name, git repo URL, and application-specific variables and values.
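A hedged sketch of such a parameterized "master" pipeline (the parameter names and the deploy script are assumptions, not a known-good setup):

```groovy
// Shared, parameterized Jenkinsfile; each microservice/environment job
// invokes it with its own values instead of duplicating the pipeline.
pipeline {
    agent any
    parameters {
        string(name: 'SERVICE_NAME', defaultValue: 'microserviceA')
        string(name: 'GIT_REPO_URL', defaultValue: 'https://example.com/microserviceA.git')
        string(name: 'ENVIRONMENT',  defaultValue: 'dev1')
    }
    stages {
        stage('Checkout') {
            steps { git url: params.GIT_REPO_URL }
        }
        stage('Deploy') {
            steps {
                sh "./deploy.sh ${params.SERVICE_NAME} ${params.ENVIRONMENT}" // hypothetical script
            }
        }
    }
}
```

Upstream jobs can pass the parameters automatically (e.g. build job: '...', parameters: [...]), so nothing has to be typed per build, and chained builds still show up per environment.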

Jenkins Docker image building for Different Tenant from same code repository

I am trying to implement a CI/CD pipeline for my Spring Boot microservice deployment. I plan to use Jenkins and Kubernetes to build the CI/CD pipeline, and I have one SVN code repository for version control.
Nature Of Application
The nature of my application is that one microservice needs to be deployed for multiple tenants. The code is the same, but the database configuration is different for each tenant, and I am managing the configuration using Spring Cloud Config Server.
My Requirement
My requirement is that when I commit code to my SVN repository, Jenkins needs to pull the code, build the project (Maven), create Docker images for multiple tenants, and deploy them.
So a commit to one code repository needs to build multiple Docker images from the same code repo: one repo, multiple Docker image builds. The Dockerfile contains different configuration for each tenant's image, so I need Jenkins to build multiple Docker images with different Dockerfile configurations from the one code repo.
My Analysis
I am currently planning to do this by adding multiple Jenkins pipeline jobs connected to the same code repo. Within each pipeline job I can add the tenant-specific configuration, because the image name for each tenant needs to be kept different, and the images need to be pushed to Docker Hub.
My Confusion
Here my confusion is:
Can I add multiple pipeline jobs from the same code repository using Jenkins?
If I can, how can I deploy the image for every tenant to Kubernetes? Do I need to add jobs for deployment, or is one single job enough?
You seem to be going about it a bit wrong.
Since your code is the same for all tenants and only the config differs, you should instead create a single Docker image and deploy it along with tenant-specific configuration when deploying to Kubernetes.
So a change in your repository triggers one Jenkins build and produces one Docker image. Then you can have either multiple Jenkins jobs or multiple steps in the pipeline that deploy the Docker image with tenant-specific config to Kubernetes.
If you don't want to heed the above, here are the answers to your questions:
You can create multiple pipelines from the same repository in Jenkins (select New Item > Pipeline multiple times).
You can keep a list of tenants and loop through it, or run all deployments in parallel, in a single pipeline stage.
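That per-tenant loop could look roughly like this in a scripted pipeline stage (the tenant names, image name, and manifest layout are all assumptions for illustration):

```groovy
// One image, many tenant-specific deployments, run in parallel.
def tenants = ['tenantA', 'tenantB', 'tenantC']   // hypothetical tenant list
stage('Deploy per tenant') {
    def branches = [:]
    for (t in tenants) {
        def tenant = t  // capture the loop variable for the closure
        branches[tenant] = {
            // Same image for every tenant; only the config differs.
            sh "kubectl apply -f k8s/${tenant}/config.yaml"  // tenant-specific config
            sh "kubectl set image deployment/${tenant}-app app=myrepo/app:${env.BUILD_NUMBER}"
        }
    }
    parallel branches
}
```

This keeps the build (one image) decoupled from the releases (one per tenant), which matches the recommended approach above.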