Build multiple projects - Jenkins

I'm exploring the Jenkins world to see if it can fit my needs for this case.
I need to build two git repositories (backend and frontend). For the backend, I would need to:

- Choose the branch to build from a list
- Check out the branch and build a Docker image using the Dockerfile
- Push the image to ECR
- Release it to a specific Kubernetes deployment

After the backend build, we have to build the frontend by doing:

- Choose the branch to build from a list
- Check out the branch and run an npm script to build
- Deploy the build output to an S3 folder

Builds should be triggered only manually, by the project owner (who is not a developer).
Is Jenkins the right way to go? And if yes, could you point me to how you would do it?
Thanks

Yes, you can definitely implement what you need with Jenkins. There are different ways to implement each step, but here are some things you can consider using:
For branch listing, you can consider a plugin like the List Git Branches Parameter plugin.
For Docker image building and pushing, you can use the Jenkins Docker Pipeline steps.
For the Kubernetes part, you can use a shell script with kubectl, or a plugin such as Kubernetes CLI.
For the S3 part, you can use the S3 Publisher plugin.
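Sketching those suggestions as a declarative Jenkinsfile for the backend (a rough sketch, not a drop-in solution: the repository URL, credential IDs, ECR registry, and deployment name are placeholders, and the `listGitBranches` parameter assumes the List Git Branches Parameter plugin is installed):

```groovy
pipeline {
    agent any
    parameters {
        // List Git Branches Parameter plugin: shows a branch dropdown
        // on the "Build with Parameters" page (names are placeholders)
        listGitBranches(
            name: 'BRANCH',
            remoteURL: 'git@github.com:example/backend.git',
            credentialsId: 'git-creds',
            type: 'PT_BRANCH'
        )
    }
    environment {
        // Placeholder ECR repository URL
        ECR_REPO = '123456789012.dkr.ecr.eu-west-1.amazonaws.com/backend'
    }
    stages {
        stage('Checkout') {
            steps {
                git branch: "${params.BRANCH}",
                    url: 'git@github.com:example/backend.git',
                    credentialsId: 'git-creds'
            }
        }
        stage('Build & push image') {
            steps {
                script {
                    // Docker Pipeline plugin steps; 'ecr:…' assumes the
                    // Amazon ECR plugin's credentials provider
                    def image = docker.build("${ECR_REPO}:${env.BUILD_NUMBER}")
                    docker.withRegistry("https://${ECR_REPO}", 'ecr:eu-west-1:aws-creds') {
                        image.push()
                    }
                }
            }
        }
        stage('Deploy to Kubernetes') {
            steps {
                // Roll the deployment to the freshly pushed image
                sh "kubectl set image deployment/backend backend=${ECR_REPO}:${env.BUILD_NUMBER}"
            }
        }
    }
}
```

Because the pipeline has no `triggers` block, it only runs when started manually, which matches the "project owner clicks build" requirement. The frontend job would follow the same shape, with the Docker/ECR/kubectl stages replaced by `sh 'npm run build'` and an S3 upload step.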

Related

Simple CICD workflow for small-scale deployments?

I work for a small startup. We have 3 environments (Production, Development, and Staging), and GitHub is used as the VCS.
All environments run on EC2 with Docker.
Can someone suggest a simple CI/CD solution that can trigger builds automatically after certain branches are merged, plus a manual trigger option?
For example, if anything is merged into dev-merge, build and deploy to Development, and the same for Staging, pushing the image to ECR and rolling out a Docker update.
We tried Jenkins but felt it was over-complicated for our small-scale infra.
We also evaluated GitHub Actions (self-hosted runners), but it needs the YAML files to live in the repos.
We are looking for something that gives us the option to modify the pipeline or overall flow without code-hosted CI/CD config (like the way Jenkins lets you either use a Jenkinsfile or configure the job manually via the GUI).
Any opinions about Team City?

CI for multi-repository project

My current project consists of three repositories. There is a Java (Spring Boot) application and two Angular web clients.
At the moment I am running a deploy.sh script which clones each repository and then deploys the whole thing.
# Clone all projects
git clone ..
git clone ..
git clone ..
# Build (there is a pom.xml which depends on the cloned projects)
mvn clean package
# Deploy
heroku deploy:jar server/target/server-*.jar --app $HEROKU_APP -v
Not very nice, I know.
So I'd like to switch to a CI pipeline, and I think Travis CI or GitLab CI might be good choices.
My problem is: at this point I don't know how (or if) I can build the whole thing when there is an update on any of the master branches.
Maybe it is possible to configure the pipeline so that it simply tracks each repository, or maybe it's possible to accomplish this using git submodules.
How can I approach this?
If you need all of the projects to be built and deployed together, you have a big old monolith. In this case, I advise you to use a single repository for all projects and have a single pipeline. This way you wouldn't need to clone anything.
However, if the java app and the angular clients are microservices that can be built and deployed independently, place them in separate repositories and create a pipeline for each one of them. Try not to couple the release process (pipelines) of the different services because you will regret it later.
Each service should be built, tested and deployed separately.
If you decide to have a multi-repo monolith (please don't), you can look into GitLab CI multi-project pipelines.
Example workflow:
Repo 1 (Java), Repo 2 (Angular 1), Repo 3 (Angular 2)
Repo 1:
On push to master, clones Repo 2 and Repo 3, builds, tests, deploys.
Repo 2:
On push to master, triggers the Repo 1 pipeline.
Repo 3:
On push to master, triggers the Repo 1 pipeline.
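In GitLab CI, the "trigger the Repo 1 pipeline" part of that workflow can be expressed as a trigger job in the `.gitlab-ci.yml` of Repo 2 and Repo 3 (a sketch; the project path and branch name are placeholders):

```yaml
# .gitlab-ci.yml in Repo 2 (and Repo 3)
stages:
  - trigger

trigger_main_build:
  stage: trigger
  # Start the downstream pipeline in Repo 1, which clones and builds everything
  trigger:
    project: mygroup/repo1
    branch: master
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'
```

Repo 1's own pipeline then does the cloning, building, testing, and deploying, so the cross-repo coupling lives in one place.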

Cloud build CI/CD & k8s files

I am using Cloud Build with a GKE Kubernetes cluster, and I have set up CI/CD from GitHub to Cloud Build.
I want to know: is it good to add the CI build file and Dockerfile to the repository, or to manage the config files separately in another repository?
Is it good to keep the CI & Kubernetes config files in the same repository as the business logic?
What is the best way to implement Cloud Build CI/CD to GKE while managing the CI/Kubernetes YAML files?
Yes, you can add deployment directives, typically in a dedicated folder of your project, which can in turn reference a dedicated CI/CD repository.
See "kelseyhightower/pipeline-application" as an example, where:
Changes pushed to any branch except master should trigger the following actions:
build a container image tagged with the build ID suitable for deploying to a staging cluster
clone the pipeline-infrastructure-staging repo
patch the pipeline deployment configuration file with the staging container image and commit the changes to the pipeline-infrastructure-staging repo
The pipeline-infrastructure-staging repo will deploy any updates committed to the master branch.
Please keep in mind that:
The best place to store the Dockerfile and build config file is the remote repository itself. For Dockerfiles this is supported as "native Docker support" in gcloud.
You can use different host repositories like:
Cloud Source Repository
Bitbucket
GitHub
As an example of the build config YAML file structure, the Cloud Build documentation provides information about Cloud Build concepts and tutorials.
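As an illustration of keeping the build config next to the code, a minimal `cloudbuild.yaml` for a build-push-deploy flow might look like this (a sketch: the image name, deployment name, zone, and cluster are placeholders):

```yaml
steps:
  # Build the container image from the Dockerfile in the repo root
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/myapp:$SHORT_SHA', '.']
  # Push it to the registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/myapp:$SHORT_SHA']
  # Roll out the new image to the GKE deployment
  - name: 'gcr.io/cloud-builders/kubectl'
    args: ['set', 'image', 'deployment/myapp', 'myapp=gcr.io/$PROJECT_ID/myapp:$SHORT_SHA']
    env:
      - 'CLOUDSDK_COMPUTE_ZONE=europe-west1-b'
      - 'CLOUDSDK_CONTAINER_CLUSTER=my-cluster'
images: ['gcr.io/$PROJECT_ID/myapp:$SHORT_SHA']
```

Keeping this file in the application repository means each commit carries the instructions for building and deploying itself, while cluster-level manifests can still live in a separate infrastructure repository as in the pipeline-application example.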

Triggering CI builds using Jenkins on remote build machine

I am trying to implement CI/CD with Jenkins. I have my code in a git repo. The moment I make a change to the git repo files, I wish to trigger a build that runs on a remote machine.
This means if I change a file in the Git repo 10 times, I should have 10 builds, each build corresponding to one change.
Can anyone tell me how this can be done?
I tried to make use of a post-commit hook, but it's not working.
What flavor of Git do you use? If you share your webhook and Jenkins config details, additional info can be provided. In my experience it is a two-step process:
Enable the webhook in the Git host
Create a job with the appropriate configuration to map to the repository and get triggered on commit
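If the repository is hosted on GitHub, for example, the second step can be declared in the Jenkinsfile itself (a sketch assuming the GitHub plugin; the repo URL and build command are placeholders):

```groovy
pipeline {
    agent any
    triggers {
        // GitHub plugin: fires when GitHub delivers a push webhook
        // to https://<jenkins-url>/github-webhook/
        githubPush()
    }
    stages {
        stage('Build') {
            steps {
                git url: 'https://github.com/example/repo.git', branch: 'master'
                sh './build.sh'   // placeholder build step
            }
        }
    }
}
```

To run on a remote machine, configure that machine as a Jenkins agent and point `agent` at its label instead of `any`.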

Using Jenkins for Continuous Deployment of WebApp - Publish Artifacts to Server

We are searching for a CI and CD solution for our web app based on NodeJS/Meteor.
Our Process should be:
On each push to master / pull request / merge to master, do the following:
Checkout
Run Code Style Checks (coffeelint, scsslinter, etc.)
Build Code
Run Tests
Generate Tarball-Archive
Deploy the archive to the Development (quality management) server, extract and run
The next step would be manual testing of the app on our dev server.
When we think it is deployable, I want to have a button in Jenkins like "Deploy these artifacts NOW to the live instance". How can I achieve this? Also nice would be something like "deploy these artifacts at 2 am to the live instances".
Everything from checkout to deploying to the dev server is already implemented in Jenkins. What we need now is a button "deploy this artifact to live".
You need another job to get this working. The other job takes the artifact from the build job and deploys it wherever you want.
There is no possibility to include such behavior in the same job.
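As a sketch of that two-job setup (assuming the Copy Artifact plugin; the CI job name, artifact path, and deploy commands are placeholders), the separate manually triggered deploy job could look like:

```groovy
// Jenkinsfile of a separate, manually triggered "deploy-live" job
pipeline {
    agent any
    stages {
        stage('Fetch artifact') {
            steps {
                // Copy Artifact plugin: pull the tarball archived by the CI job
                copyArtifacts projectName: 'webapp-ci',
                              selector: lastSuccessful(),
                              filter: 'dist/*.tar.gz'
            }
        }
        stage('Deploy to live') {
            steps {
                // Placeholder deploy commands: ship the archive and restart
                sh 'scp dist/*.tar.gz deploy@live-server:/opt/app/'
                sh 'ssh deploy@live-server /opt/app/unpack-and-restart.sh'
            }
        }
    }
}
```

This job's "Build Now" button is your "deploy this artifact to live" button. For the 2 am case, the same job could additionally be given a cron trigger ("Build periodically"), though that turns the manual promotion into a scheduled one.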
