Why does Jenkins start all pipelines even when there are no changes in those particular folders? - jenkins

I have a microservice-based application in Bitbucket. Each service is deployed through its own Jenkins pipeline, and every Jenkinsfile is in the root directory. Whenever a commit is pushed, all of the pipelines start, even if there are no changes related to that particular build.
How can I configure Jenkins so that it only triggers the pipeline of the particular service whose files were changed and pushed?
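A minimal sketch of one common approach, assuming each service's Jenkinsfile is a declarative pipeline: gate the build stage with a changeset condition so it only runs when the push touched that service's folder. The folder name service-a/ and the build script are hypothetical placeholders.

```groovy
pipeline {
    agent any
    stages {
        stage('Build service A') {
            when {
                // Run this stage only when the pushed commits touched service-a/
                changeset "service-a/**"
            }
            steps {
                sh './build.sh'   // hypothetical build script for this service
            }
        }
    }
}
```

Note that the pipeline itself still starts in order to evaluate the condition; the stage is simply skipped when nothing under service-a/ changed.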

Related

How to use Jenkins pipeline to update a website on another server?

I've set up and connected a Jenkins (2.249) server to my GitHub account, so it has access to my repos, and I've set up the GitHub webhook.
But I am having trouble understanding how to create a multibranch pipeline job that detects when a push to my master branch happens; I then want to run SSH commands on another host to update a web server with the new code changes.
With Jenkins pipelines, I can't see how to detect when a push to master happens and then trigger the build. Is this possible with Jenkins? I have Blue Ocean installed as well.
Multibranch Pipeline jobs periodically check the server for updates. Each build runs with an environment variable BRANCH_NAME set to the current branch.
If you only care about the master branch, you should use a regular Pipeline job that only watches master.
See the docs
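As a sketch of what that per-branch gating can look like in a multibranch Jenkinsfile; the host, credentials ID, and deploy command are hypothetical placeholders:

```groovy
pipeline {
    agent any
    stages {
        stage('Deploy to web server') {
            // Only runs in the job the multibranch parent created for master
            when { branch 'master' }
            steps {
                // sshagent requires the SSH Agent plugin; 'web-server-key'
                // is a hypothetical Jenkins credentials ID.
                sshagent(credentials: ['web-server-key']) {
                    sh 'ssh deploy@web.example.com "cd /var/www/app && git pull"'
                }
            }
        }
    }
}
```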

GitHub webhook to trigger jobs on multiple Jenkins servers

My scenario:
A single GitHub repo
4 files in the repo
4 Jenkins servers
Desired outcome:
Pushing changes to file A should trigger a job on Jenkins server A, pushing changes to file B should trigger a job on Jenkins server B, and so on.
Is there any solution for this?
Depending on where your repository is hosted (GitHub, GitLab, Bitbucket), you might be able to use a pipeline within the repository that looks for the changed files and triggers a webhook on the correct server accordingly.
Another approach could be to always trigger a special job on server A, which then looks for changed files and triggers the correct job on the right server.
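A rough sketch of that second approach, written as a scripted pipeline for the dispatcher job on server A; the file names, server URLs, and trigger tokens are hypothetical placeholders:

```groovy
node {
    checkout scm
    // Files touched by the most recent commit (fails on the very first
    // commit of a repo, since HEAD~1 does not exist there).
    def changed = sh(
        script: 'git diff --name-only HEAD~1 HEAD',
        returnStdout: true
    ).trim().split('\n') as List

    // Pair each watched file with the remote job's build-trigger URL.
    def targets = [
        ['fileA', 'https://jenkins-a.example.com/job/build-a/build?token=TOKEN_A'],
        ['fileB', 'https://jenkins-b.example.com/job/build-b/build?token=TOKEN_B'],
    ]
    for (t in targets) {
        def file = t[0]
        def url = t[1]
        if (changed.contains(file)) {
            // Fire the remote Jenkins job via its trigger-token URL.
            sh "curl -fsS -X POST '${url}'"
        }
    }
}
```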

Ephemeral Jenkins Pipeline Jobs from Github and Jenkinsfile

I have automated Jenkins master and slave deployment and redeployment successfully.
I know how to manually create pipeline jobs and add GitHub repos so their Jenkinsfiles are used for the steps.
My issue is how to automate re-adding the pipeline jobs to Jenkins after it has been destroyed and redeployed, without having to manually create the pipeline jobs and point them to a Jenkinsfile each time.
I have seen this done before in a container environment with Chef and Docker: when redeployed or updated, it re-adds all the pipelines automatically.
I don't want to use the UI at all, except to confirm job status and progress and to verify settings.
I would recommend looking at the Job DSL plugin to create jobs, using a seed job to create them on initial Jenkins startup. The Jenkins Configuration as Code plugin can be used to set up any other configuration outside the jobs.
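A minimal Job DSL sketch of what such a seed job could run to recreate a multibranch pipeline on a freshly deployed Jenkins; the job name and repository URL are hypothetical placeholders:

```groovy
multibranchPipelineJob('my-service') {
    branchSources {
        git {
            id('my-service-repo')   // stable ID for this branch source
            remote('https://github.com/example/my-service.git')
        }
    }
    factory {
        workflowBranchProjectFactory {
            // Path to the Jenkinsfile inside each branch of the repo.
            scriptPath('Jenkinsfile')
        }
    }
}
```

The seed job itself runs scripts like this through the Job DSL plugin's "Process Job DSLs" build step, so rerunning it after a redeploy restores every job without touching the UI.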

Run Jenkins build for whichever branch was checked into on Gitlab

I recently made the transition from Subversion to Git for all my repos at work. With SVN we had commit hooks in place so that our Jenkins job would run for whichever branch was checked into. Now I'm trying to set this up using GitLab, and there appears to be only one place to add a webhook. It looks like any time something is checked into ANY branch, the webhook will run. Meaning, if I have branch_A associated with jenkins_job_A, something could be checked into branch_B and the commit hook for jenkins_job_A would still run. Is there a branch-by-branch way to configure these webhooks? Or is there some kind of script I can check into each branch that will act as a commit hook? Or (my fear) is this feature not supported in GitLab?
I guess you set up GitLab to send a post-commit request to http://yourserver/jenkins/git/notifyCommit?url=<URL of the Git repository>? In theory this should trigger polling on all jobs that are configured with that URL, and in the polling step each job should decide whether it needs to build. In practice this will unfortunately cause all jobs to fire.
We worked around this issue by moving the job configuration into a Jenkinsfile and then using a Multibranch Pipeline.
As an alternative, you could also install the GitLab plugin for Jenkins and use the Jenkins integration in GitLab. This allows you to trigger the correct jobs when commits are pushed to a branch. The downside is that it requires per-job configuration.
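A minimal sketch of the per-branch Jenkinsfile that the Multibranch Pipeline workaround relies on. Under a multibranch job, Jenkins creates one child job per branch and builds only the branch that actually received the push; the build script is a hypothetical placeholder:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // BRANCH_NAME is set by the multibranch parent job.
                echo "Building branch ${env.BRANCH_NAME}"
                sh './build.sh'
            }
        }
    }
}
```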

Run script before removing job in Jenkins Pipelines

I'm setting up a development environment where Jenkins is the CI server (using pipelines), and the last build step in the Jenkinsfile is a deployment to staging. The idea is to have a staging environment for each branch that is pushed.
Whenever someone deletes a branch (sometimes after merging), Jenkins automatically removes its respective job.
I wonder if there is a way to run a custom script before the automatic job removal; then I would be able to connect to the staging server and stop or remove all services running for the job that is about to be deleted.
The Multibranch Action Triggers plugin (multibranch-action-triggers-plugin) might be worth a look:
This plugin enables building/triggering other jobs when a Pipeline job is created or deleted, or when a Run (also known as a Build) is deleted by a Multibranch Pipeline job.
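As a rough sketch, the triggered cleanup job could itself be a pipeline like the one below. The PIPELINE_NAME environment variable name follows the plugin's documentation for what it passes to triggered jobs, but treat that as an assumption and verify it against the plugin docs; the SSH target, credentials ID, and teardown command are hypothetical placeholders.

```groovy
pipeline {
    agent any
    stages {
        stage('Tear down staging') {
            steps {
                // sshagent requires the SSH Agent plugin; 'staging-key' is a
                // hypothetical credentials ID.
                sshagent(credentials: ['staging-key']) {
                    // PIPELINE_NAME: assumed to hold the deleted branch job's
                    // name, as passed by the Multibranch Action Triggers plugin.
                    sh "ssh deploy@staging.example.com 'docker compose -p ${env.PIPELINE_NAME} down'"
                }
            }
        }
    }
}
```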
