Jenkins build from workspace if Git is unavailable

We're running a number of periodic jobs on a schedule, and if Git is not available due to maintenance or an outage (we clone from an on-network GitHub Enterprise instance), the jobs fail. Is there any way to configure the jobs so that they can build from the existing workspace if Git is down or inaccessible? Thanks!

The first step is to make sure your periodic job does not clean up the workspace after its build.
Second, split your pipeline into two stages:
one for a git ls-remote check, followed, if it succeeds, by a workspace cleanup and a fresh clone; wrap the ls-remote in a try/catch so that if it fails (meaning the remote is not available) you just log a warning and move on to the second stage with the existing workspace (see the sketch after this list)
one for the job itself
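
A minimal scripted-pipeline sketch of that idea might look like the following; the repository URL, stage names, and build command are placeholders for whatever your job actually does:

node {
    def remoteAvailable = true
    stage('Refresh sources') {
        try {
            // Fails if the Git server cannot be reached
            sh 'git ls-remote https://github.example.com/org/repo.git HEAD'
        } catch (err) {
            remoteAvailable = false
            echo "WARNING: Git remote unreachable, reusing existing workspace (${err})"
        }
        if (remoteAvailable) {
            // Only wipe and re-clone when fresh sources are actually reachable
            deleteDir()
            git url: 'https://github.example.com/org/repo.git', branch: 'master'
        }
    }
    stage('Build') {
        sh './build.sh'   // placeholder for the actual periodic build steps
    }
}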

Related

Trigger Jenkins downstream job when upstream push to Git Master only

As the testing team, we want to run our tests only when developers have merged their code to master. However, in Jenkins, my job is triggered every time developers push their code, whether to a branch or to master.
I am using "Build after other projects are built" as the build trigger, and I can't see any option to filter by branch the way I can in TeamCity.
Can anyone help?

Continuous Delivery with VSTS and Jenkins

I'm trying to get continuous delivery going with Jenkins (building, deploying) and VSTS (source control). This is the desired workflow:
a developer branches off master, makes changes, creates a pull request
another developer reviews the PR and eventually merges it into master
some system (Jenkins or VSTS) detects that a PR was merged into master and...
increments a version number stored in a file within the repo
commits the version change back to master
builds
deploys
I was using Service Hooks within VSTS to detect the merge to master and execute a Jenkins task. VSTS has 3 hooks I can use:
Build completed
Code pushed
Pull request merge commit created
I was under the impression that the third option would only fire when a PR was merged, but that's not the case: any additional commit pushed to the branch while it's associated with the PR also triggers the hook. This causes a bunch of unnecessary deployments.
I figured I could make Jenkins detect changes within VSTS. There's a "Poll SCM" option, which takes a cron-like schedule. The utterly confusing thing is, it doesn't appear that I can configure what exactly will be polled every X minutes (which repo, which branch).
What are my options here to trigger Jenkins tasks only when a PR is merged to master? I would use the VSTS "Code pushed" Service Hook, but it goes into an infinite loop because Jenkins pushes to master when it increments the version.
Refer to these steps below:
Create a new build definition for master
Select Triggers and enable Continuous Integration
Set the Branch filters and Path filters (exclude the version number file that will be changed, so the version-bump commit does not retrigger the build)
Add task to modify version number file (e.g. PowerShell)
Add Command Line task (Tool: git; Arguments: config --global user.email "test@example.com"; Working folder: $(build.sourcesdirectory))
Add Command Line task (Tool: git; Arguments: config --global user.name "Your Name"; Working folder: $(build.sourcesdirectory))
Add Command Line task (Tool: git; Arguments: add [the modified file]; Working folder: $(build.sourcesdirectory))
Add Command Line task (Tool: git; Arguments: commit -m "update version"; Working folder: $(build.sourcesdirectory))
Add Command Line task (Tool: git; Arguments: push origin HEAD:$(Build.SourceBranchName); Working folder: $(build.sourcesdirectory)) (see the consolidated sketch after these steps)
Add Jenkins Queue Job task to trigger Jenkins job
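
For reference, the five git tasks above amount to the following command sequence; here it is wrapped in a Jenkins scripted-pipeline stage in case you ever run the version bump from the Jenkins side instead. The file name, email, and target branch are placeholders taken from the steps above:

node {
    stage('Commit version bump') {
        // Same git sequence as the Command Line tasks above
        sh '''
            git config --global user.email "test@example.com"
            git config --global user.name "Your Name"
            git add VERSION.txt
            git commit -m "update version"
            git push origin HEAD:master
        '''
    }
}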

Get Git commit from "upstream" build in manually triggered Jenkins job

I have a Build job in Jenkins that checks out a specific Git commit and packages it for deployment as artifacts.
There is a later Deployment job that takes the built artifacts and actually deploys the code. It also does a sparse Git checkout of a specific directory containing deployment scripts. After successful completion, we write a Git tag.
The problem is that the tag is being written to the HEAD of master, not to the hash of the commit used for the original upstream build. (master is the branch defined in the job configuration.)
Is there a way to get the upstream SCM information if it's not passed directly through a parameterized trigger? I can see commits listed in the build.xml file that Jenkins generates in the build directory; is there a way to read this information from the downstream job?
I realize that it's not really "downstream", since it's manually triggered. We do have a selector that defines UPSTREAM_BUILD and UPSTREAM_PROJECT, though.
If you are using the Copy Artifact plugin, you could write a file with the commit hash during the Build job and read it back in during the Deployment job:
# Build
echo ${GIT_COMMIT} > COMMIT_HASH
# Deployment, after copying COMMIT_HASH into the workspace
git checkout $(cat COMMIT_HASH)
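
If both jobs happen to be pipelines, the same hand-off can be sketched with the Copy Artifact plugin's pipeline step; the project name, UPSTREAM_BUILD parameter, repository URL, and file name below are illustrative only and assume the plugin is installed:

// Build pipeline: record the exact commit that was checked out
node {
    git url: 'https://github.example.com/org/repo.git', branch: 'master'
    sh 'git rev-parse HEAD > COMMIT_HASH'
    archiveArtifacts artifacts: 'COMMIT_HASH'
}

// Deployment pipeline: copy the file back and pin work to that commit
node {
    copyArtifacts projectName: 'Build',
                  selector: specific("${params.UPSTREAM_BUILD}"),
                  filter: 'COMMIT_HASH'
    // check out (or tag) exactly the commit the upstream Build job used
    sh 'git checkout "$(cat COMMIT_HASH)"'
}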

git clone only on new change and archive scm

I am using Jenkins to pull code from Git every 10 minutes and then compile and archive it for other jobs to clone this workspace. Currently it pulls code from Git every time and then archives every time.
I want to clone the code from Git only if there is a new change; otherwise it should skip the build and not archive the workspace. Which plugin should I use, and how should I configure it?
So it sounds like you have a couple of things going on here. Here are some possible suggestions that I use to meet similar needs:
1.) If you only want your job to build when there is a change in your source control (in this case Git), you can use the "Poll SCM" trigger and set a cron expression there to poll every 10 minutes (see the sketch after this list).
"Poll SCM" will check source control for changes and build the job only when it finds them. If this works properly, your job will not build when there is nothing new, and thus will not archive anything unnecessarily.
2.) For archiving, I would make sure to use the "Discard Old Builds" setting and its "Advanced" section to keep a rotation and retention policy for your job's artifacts.
3.) You state "for other jobs to clone this workspace". Are other jobs actually pulling in this job's workspace, or did you mean copying its artifacts? I ask because the workspace is, in a sense, temporary, and you should pull the artifacts instead. There is a plugin for that as well, the "Copy Artifact Plugin", which allows for various options.
4.) As an alternative to "Poll SCM", if it doesn't work or you don't prefer it, you could also (depending on your Git setup) set up a hook that notifies Jenkins of changes. The available hooks vary with the Git hosting implementation.
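
For point 1, the polling trigger is just a cron-style schedule. A declarative-pipeline sketch looks roughly like the following (the repository URL, build command, and archive pattern are placeholders); a freestyle job is the same idea via the "Poll SCM" checkbox:

pipeline {
    agent any
    triggers {
        // Poll roughly every 10 minutes; a build runs only when the poll finds a change
        pollSCM('H/10 * * * *')
    }
    stages {
        stage('Build and archive') {
            steps {
                git url: 'https://github.example.com/org/repo.git', branch: 'master'
                sh 'make'                                  // placeholder compile step
                archiveArtifacts artifacts: 'build/**'     // placeholder archive pattern
            }
        }
    }
}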
Hope this helps!

How to ensure same git checkout for build and deploy jobs in Jenkins?

In Jenkins, I have a "Build" job setup to poll my git repo and automatically build on change. Then, I have separate "Deploy to DEV", "Deploy to QA", etc. jobs that will call an Ant build that deploys appropriately. Currently, this configuration works great.
However, this process favors deploying the latest build on the latest development branch. I use the Copy Artifact plugin to allow the user to choose which build to deploy. Also, the Ant scripts for build/deploy are part of the repo and are subject to change. This means it's possible the artifact could be incompatible between versions. So, it's ideal that I ensure that the build and deploy jobs are run using the same git checkout.
Is there an easier way? It ought to be possible for the Deploy job to obtain the Git commit hash used by the selected build and check that out. However, I don't see any options or plugins that do this.
Any ideas on how to simplify this configuration?
You can use the Parameterized Trigger Plugin to do this for you. The straightforward way is to prepare a file with parameters as a build step and pass those parameters to the downstream job using the plugin. You can pass the Git revision as a parameter, for example, along with any other settings (see the sketch below).
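
In pipeline terms, the hand-off that the Parameterized Trigger Plugin performs can be sketched like this; the job name, repository URL, and parameter name are placeholders:

// Build job: pass the exact revision that was built to the deploy job
node {
    git url: 'https://github.example.com/org/repo.git', branch: 'develop'
    def builtCommit = sh(script: 'git rev-parse HEAD', returnStdout: true).trim()
    // ... build and archive steps here ...
    build job: 'Deploy-to-DEV',
          parameters: [string(name: 'DEPLOY_COMMIT', value: builtCommit)]
}

// Deploy job (parameterized with DEPLOY_COMMIT): pin the checkout to that revision
node {
    git url: 'https://github.example.com/org/repo.git', branch: 'develop'
    sh "git checkout ${params.DEPLOY_COMMIT}"
    // ... run the Ant deploy against exactly the revision that was built ...
}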
The details would vary for a Git repo (see https://stackoverflow.com/a/13117975/466874), but for our SVN-based jobs, what we do is have the build job (re)create an SVN tag (with a static name like "LatestSuccessfulBuild") at successful completion, and then we configure the deployment jobs to use that tag as their repo URL rather than the trunk location. This ensures that deployments are always of whatever revision was successfully built by the build job (meaning all unit tests passed, etc.) rather than allowing newer trunk commits to sneak into the deployment build.
