Continuous Delivery with VSTS and Jenkins

I'm trying to get continuous delivery going with Jenkins (building, deploying) and VSTS (source control). This is the desired workflow:
a developer branches off master, makes changes, creates a pull request
another developer reviews the PR and eventually merges it into master
some system (Jenkins or VSTS) detects that a PR was merged into master and...
increments a version number stored in a file within the repo
commits the version change back to master
builds
deploys
I was using Service Hooks within VSTS to detect the merge to master and execute a Jenkins task. VSTS has 3 hooks I can use:
Build completed
Code pushed
Pull request merge commit created
I was under the impression that the third option would only fire when a PR was merged, but that's not the case. Any additional commit to the branch, while it's associated with the PR, triggers the hook. This causes a bunch of unnecessary deployments.
I figured I could make Jenkins detect changes within VSTS. There's a "Poll SCM" option, which takes a cron-like schedule. The confusing thing is that it doesn't appear I can configure what exactly will be polled every X minutes (which repo, which branch).
What are my options here to trigger Jenkins tasks only when a PR is merged to master? I would use the VSTS "Code pushed" Service Hook, but it goes into an infinite loop because Jenkins pushes to master when it increments the version.

Refer to these steps below:
Create a new build definition for master
Select Triggers and enable Continuous Integration
Set the Branch filters and Path filters (exclude the version-number file that the build will modify)
Add task to modify version number file (e.g. PowerShell)
Add Command Line task (Tool: git; Arguments: config --global user.email "test@example.com"; Working folder: $(build.sourcesdirectory))
Add Command Line task (Tool: git; Arguments: config --global user.name "Your Name"; Working folder: $(build.sourcesdirectory))
Add Command Line task (Tool: git; Arguments: add [the modified file]; Working folder: $(build.sourcesdirectory))
Add Command Line task (Tool: git; Arguments: commit -m "update version"; Working folder: $(build.sourcesdirectory))
Add Command Line task (Tool: git; Arguments: push origin HEAD:$(Build.SourceBranchName); Working folder: $(build.sourcesdirectory))
Add Jenkins Queue Job task to trigger Jenkins job
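The command-line tasks above amount to the following git sequence, shown here as a shell sketch. To make it runnable anywhere, the script creates a throwaway repo and a local bare "origin" in a temp directory; on a real build agent the repo already exists, BUILD_SOURCEBRANCHNAME comes from the build system's variables, and the version file name is a placeholder.

```shell
#!/bin/sh
set -e
# Self-contained sketch: a throwaway repo and a local bare "origin" are
# created under a temp dir so the script runs anywhere. On a real agent
# the checkout already exists and BUILD_SOURCEBRANCHNAME is provided by
# the build system. VERSION_FILE is a placeholder file name.
work=$(mktemp -d)
git init -q --bare "$work/origin.git"
git clone -q "$work/origin.git" "$work/repo"
cd "$work/repo"
git config user.email "test@example.com"
git config user.name "Your Name"

VERSION_FILE="version.txt"
echo 1 > "$VERSION_FILE"
git add "$VERSION_FILE"
git commit -qm "initial"
git push -q origin HEAD:master
BUILD_SOURCEBRANCHNAME=master

# --- the steps from the answer above ---
ver=$(cat "$VERSION_FILE")
echo $((ver + 1)) > "$VERSION_FILE"   # bump the version (the PowerShell task)
git add "$VERSION_FILE"
git commit -qm "update version"
git push -q origin "HEAD:$BUILD_SOURCEBRANCHNAME"

echo "pushed version $(cat "$VERSION_FILE")"
```

Because the CI trigger's Path filters exclude the version file, this push back to master does not re-trigger the build, which breaks the infinite loop described in the question.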

Related

Jenkins with Gerrit - Triggering a CI job on a multi-repo project, configure trigger for multiple commits that have the same subject

I am working on a project that is configured within multiple git repositories and managed by a manifest.xml with the repo tool.
On a daily basis, changes for each git repo are submitted on Gerrit, and I am currently attempting to implement the Gerrit Trigger with Jenkins to execute a job. That part is working.
The issue comes when some changes relate to two or more different git repos (having the same change ID/subject in the commit message), since currently every single git repo is monitored and a job is triggered upon each commit.
How can I prevent the individual triggers when two or more commits need to be built together, and trigger the build only once all the relevant commits are available on Gerrit?
I think you should use the "Add TOPIC" field in the "Dynamic Trigger Configuration", which can be set across repositories.

Jenkins build from workspace if Git is unavailable

We're running a number of periodic jobs on a schedule and if Git is not available due to maintenance or an outage (clones from an on-network Github Enterprise instance), the jobs fail. Is there any way to configure jobs so that they can build from the existing workspace if Git is down or inaccessible? Thanks!
The first step is to make sure your periodic job does not cleanup the workspace after its build.
Second, split your pipeline into two stages:
one for the git ls-remote, followed, if it succeeds, by a workspace cleanup and a clone; wrap the ls-remote in a try/catch so that, if it fails (meaning the remote is not available), you log a warning and move on to the second stage
one for the job itself
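The first stage described above can be sketched in plain shell. REPO_URL is a placeholder; by default it points at a path that does not exist, so the script takes the fallback branch. In a real pipeline the commented-out cleanup-and-clone would run in the reachable case.

```shell
#!/bin/sh
# Stage 1: probe the remote before wiping the workspace. REPO_URL is a
# placeholder for your GitHub Enterprise clone URL; the default below is
# deliberately nonexistent so the fallback path is exercised.
REPO_URL=${REPO_URL:-"$PWD/no-such-repo.git"}

if git ls-remote --exit-code "$REPO_URL" HEAD >/dev/null 2>&1; then
    mode=clone
    echo "remote reachable: clean the workspace and clone fresh"
    # rm -rf workspace && git clone "$REPO_URL" workspace
else
    mode=workspace
    echo "WARNING: remote unavailable, building from the existing workspace"
fi

# Stage 2: run the build itself from the workspace, whichever path was taken.
echo "building in mode: $mode"
```

`git ls-remote` is a cheap network-only check, so it can fail fast without disturbing the checked-out files.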

How to run Jenkins pipeline automatically when "git push" happens for specific folder in bitbucket

I have started using Jenkins recently and there is only one scenario where I am stuck. I need to run a Jenkins pipeline automatically when a git push happens for a specific folder in the master branch. The pipeline should run only if something is added to that specific folder.
I have already tried SCM with Sparse checkout path, and mentioned my folder, but that's not working.
I am using a GUI Freestyle project; I don't know Groovy.
I had the same issue and I resolved it by configuring the Git poll.
I used poll SCM to trigger a build and I used the additional behavior of Jenkins Git plugin named "Polling ignores commits in certain paths" > "Included Regions" : my_specific_folder/.*
By the way, using a Sparse checkout path makes Jenkins check out only the folder you mentioned.

post-commit hook : run the jenkins job based on project changed in git repository

I have configured individual Jenkins jobs for each project. In my case, whenever there is a commit, all the jobs get triggered. How do I make sure that only the job pertinent to the changed project runs and creates the deployable artifacts?
What URL have you configured in the Git repo? Does it contain the repo name? Does each project have its own repository?
When you call curl http://yourserver/jenkins/git/notifyCommit?url=<URL of the Git repository>, Jenkins will scan all the jobs that are configured to check out the specified URL. Do you have multiple jobs using the same repository?
You can try to use build triggers - you will be able to invoke a job by its name.
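The notifyCommit call mentioned above is typically made from a git post-receive hook, so only jobs bound to the pushed repository's URL are polled. A minimal sketch, assuming placeholder Jenkins and repository URLs; since a hook should never fail the push, errors are only reported, not fatal:

```shell
#!/bin/sh
# Post-receive hook sketch: notify Jenkins that this repository changed,
# so it polls only the jobs configured with this URL. Both URLs below
# are placeholders for your setup.
JENKINS_URL="http://yourserver/jenkins"
REPO_URL="ssh://git.example.com/project-a.git"

if curl -fs --max-time 5 "$JENKINS_URL/git/notifyCommit?url=$REPO_URL" >/dev/null; then
    status=notified
else
    status=unreachable   # report, but do not fail the push
fi
echo "jenkins notification: $status"
```

With one repository per project, this makes each push trigger polling only for the jobs that check out that repository.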

How to ensure same git checkout for build and deploy jobs in Jenkins?

In Jenkins, I have a "Build" job setup to poll my git repo and automatically build on change. Then, I have separate "Deploy to DEV", "Deploy to QA", etc. jobs that will call an Ant build that deploys appropriately. Currently, this configuration works great.
However, this process favors deploying the latest build on the latest development branch. I use the Copy Artifact plugin to allow the user to choose which build to deploy. Also, the Ant scripts for build/deploy are part of the repo and are subject to change. This means it's possible the artifact could be incompatible between versions. So, it's ideal that I ensure that the build and deploy jobs are run using the same git checkout.
Is there an easier way? It ought to be possible for the Deploy job to obtain the git commit hash used by the selected build and check out that same revision. However, I don't see any options or plugins that do this.
Any ideas on how to simplify this configuration?
You can use the Parameterized Trigger Plugin to do this for you. The straightforward way is to prepare a file with parameters as a build step and pass these parameters to the downstream job using the plugin. You can pass the git revision as a parameter, for example, or other settings.
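As a sketch of that approach: the upstream Build job records the checked-out revision in a properties file, the Parameterized Trigger Plugin's "Parameters from properties file" option passes it downstream, and the Deploy job checks out exactly that commit. The file name build.properties and the parameter name GIT_REVISION are illustrative; a throwaway repo stands in for the real checkout so the sketch is self-contained.

```shell
#!/bin/sh
set -e
# Throwaway repo standing in for the real checkout.
work=$(mktemp -d)
cd "$work"
git init -q .
git config user.email "build@example.com"
git config user.name "Build Job"
echo hello > file.txt
git add file.txt
git commit -qm "initial"

# Upstream "Build" job: record the exact commit that was built. The
# Parameterized Trigger Plugin reads this file and passes GIT_REVISION
# to the downstream job as a build parameter.
echo "GIT_REVISION=$(git rev-parse HEAD)" > build.properties

# Downstream "Deploy" job: check out exactly that commit, not the
# branch tip, so newer commits cannot sneak into the deployment.
. ./build.properties
git checkout -q "$GIT_REVISION"
echo "deploying revision $GIT_REVISION"
```

This pins build and deploy to the same checkout even if master moves on between the two jobs.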
The details would vary for a Git repo (see https://stackoverflow.com/a/13117975/466874), but for our SVN-based jobs, what we do is have the build job (re)create an SVN tag (with a static name like "LatestSuccessfulBuild") at successful completion, and then we configure the deployment jobs to use that tag as their repo URL rather than the trunk location. This ensures that deployments are always of whatever revision was successfully built by the build job (meaning all unit tests passed, etc.) rather than allowing newer trunk commits to sneak into the deployment build.
