Jenkins is checking out the entire SVN repo twice

I have a Jenkins Pipeline set up, with a Jenkinsfile containing the following:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Hey'
            }
        }
    }
}
A post-commit hook triggers the Jenkins build successfully, and I can see it starting from the Jenkins UI. It states that it is checking out the repo to read the Jenkinsfile, and it stores the checkout in the workspace#script folder on the server.
Checking out svn https://<svn_server>/svn/Test/Core into C:\Program Files (x86)\Jenkins\jobs\CI_Build\workspace#script to read JenkinsPipeline/Jenkinsfile
Checking out a fresh workspace because C:\Program Files (x86)\Jenkins\jobs\CI_Build\workspace#script doesn't exist
Cleaning local Directory .
After this is complete, I make a change to a file in the repo and the post-commit hook triggers the build again happily, but then it checks out the entire code base a second time, into a folder called workspace. I would have expected the checkout to happen once, after which the "use SVN update as much as possible" option would kick in and only update the changed files. Or is my logic wrong?
SVN version - 1.9.7
Jenkins version - 2.84

Jenkins has to know what is in your pipeline script before it knows whether it should check out your code. It is possible that your pipeline says not to check out the code, and you cd into a subdirectory and fire off the checkout yourself. Or maybe you check out multiple repos into different places. Until Jenkins sees your Jenkinsfile, it can't know what you want. So it has to check out the repo once to read your pipeline, then again to do the work.
With Git (and maybe some versions of other SCM plugins), lightweight or sparse checkouts are supported, so Jenkins only grabs the Jenkinsfile instead of the entire repo. I don't think this is a supported option for SVN yet.

Lightweight checkouts are now supported by the Subversion plugin; the feature was added in version 2.12.0. See https://wiki.jenkins.io/display/JENKINS/Subversion+Plugin.

Related

How to run Jenkins pipeline automatically when "git push" happens for specific folder in bitbucket

I have started using Jenkins recently and there is one scenario where I am stuck. I need to run a Jenkins pipeline automatically when a git push happens for a specific folder in the master branch. Only if something is added to that specific folder should the pipeline run.
I have already tried SCM with a sparse checkout path and mentioned my folder, but that's not working.
I am using a GUI freestyle project; I don't know Groovy.
I had the same issue and I resolved it by configuring Git polling.
I used "Poll SCM" to trigger the build, together with the additional behaviour of the Jenkins Git plugin named "Polling ignores commits in certain paths" > "Included Regions": my_specific_folder/.*
By the way, using a sparse checkout path allows Jenkins to check out only the folder you mentioned.

Jenkins Scripted Pipeline to trigger from changes in git directory

I need code for a Jenkins scripted pipeline such that the pipeline is only triggered by changes in a specific Git directory, and not by changes anywhere else in the repository, to avoid useless triggers.
Thanks!
I understand that you want your pipeline to run only when changes are detected in a specific subdirectory of your Git repository, but not for changes that only affect files outside that directory.
Theoretically, the PathRestriction class used with the checkout step would offer this through its includedRegions and excludedRegions properties.
Unfortunately, this does not work for Git in a Jenkins Pipeline; see JENKINS-36195.
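For reference, this is roughly what configuring that extension in a scripted checkout step would look like (a sketch only; the URL and region pattern are placeholders, and per JENKINS-36195 the regions are ignored in Pipeline builds):

```groovy
checkout([$class: 'GitSCM',
          branches: [[name: '*/master']],
          userRemoteConfigs: [[url: 'https://example.com/my-repo.git']],
          // Theoretically restricts polling to this subdirectory,
          // but has no effect in Pipeline jobs (JENKINS-36195).
          extensions: [[$class: 'PathRestriction',
                        includedRegions: 'my_specific_folder/.*',
                        excludedRegions: '']]])
```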

Get Git commit from "upstream" build in manually triggered Jenkins job

I have a Build job in Jenkins that checks out a specific Git commit and packages it for deployment as artifacts.
There is a later Deployment job that takes the built artifacts and actually deploys the code. It also does a sparse Git checkout of a specific directory containing deployment scripts. After successful completion, we write a Git tag.
The problem is that the tag is being written to the HEAD of master, not to the hash of the commit used for the original upstream build. (master is the branch defined in the job configuration.)
Is there a way to get the upstream SCM information if it's not passed directly through a parameterized trigger? I can see commits listed in the build.xml file that Jenkins generates in the build directory; is there a way to read this information from the downstream job?
I realize that it's not really "downstream", since it's manually triggered. We do have a selector that defines UPSTREAM_BUILD and UPSTREAM_PROJECT, though.
If you are using the Copy Artifact plugin, you could write a file with the commit hash during the Build job and read it back in during the Deployment job:
# Build
echo "${GIT_COMMIT}" > COMMIT_HASH
# Deployment, after copying COMMIT_HASH into the workspace
git checkout "$(cat COMMIT_HASH)"

How can I programmatically obtain the hash of latest built Git revision of a Jenkins project?

I am accessing the project from a Jenkins plugin, so I have access to an instance of hudson.model.Project. I know that Git is the used SCM. Is there a nice (non-hacky) way to access the last built revision?
Some details:
I am not interested in success or failure of the build, it's enough that the build was started.
"Revision": I know the Git URL and branch already, the hash of the revision that was or will be checked out for the build is needed.
I know that the Git plugin sets the environment variable "GIT_COMMIT". I consider this one of the hacky options.
If you are cloning/accessing the repository in some build step, you can run
COMMIT=$(git rev-parse HEAD)
during the build and, for example, add this variable to the build description, or echo it into a file and archive it as a build artifact.
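A minimal sketch of the file-based variant (the file name last_built_revision.txt is just an example): prefer the GIT_COMMIT variable the Git plugin sets, fall back to git rev-parse when run outside Jenkins, and write the result to a file that can then be archived as a build artifact.

```shell
# Capture the revision being built. GIT_COMMIT is set by the Jenkins Git
# plugin; the git fallback covers running this outside Jenkins, and the
# final "unknown" covers running it outside any repository at all.
COMMIT="${GIT_COMMIT:-$(git rev-parse HEAD 2>/dev/null || echo unknown)}"
echo "${COMMIT}" > last_built_revision.txt
```

Archiving last_built_revision.txt then lets any later job read back exactly which revision this build used.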

How to ensure same git checkout for build and deploy jobs in Jenkins?

In Jenkins, I have a "Build" job setup to poll my git repo and automatically build on change. Then, I have separate "Deploy to DEV", "Deploy to QA", etc. jobs that will call an Ant build that deploys appropriately. Currently, this configuration works great.
However, this process favors deploying the latest build on the latest development branch. I use the Copy Artifact plugin to allow the user to choose which build to deploy. Also, the Ant scripts for build/deploy are part of the repo and are subject to change. This means it's possible the artifact could be incompatible between versions. So, it's ideal that I ensure that the build and deploy jobs are run using the same git checkout.
Is there an easier way? It ought to be possible for the Deploy job to obtain the git checkout hash used from the selected build and checkout. However, I don't see any options or plugins that do this.
Any ideas on how to simplify this configuration?
You can use the Parameterized Trigger Plugin to do this for you. The straightforward way is to prepare a file with parameters as a build step and pass those parameters to the downstream job using the plugin. You can pass the Git revision as a parameter, for example, or other settings.
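The "prepare a file with parameters" step can be as simple as this (the file name trigger.properties and the parameter name GIT_REVISION are just examples; point the plugin's "Parameters from properties file" option at whatever name you choose):

```shell
# Build step: write a key=value properties file that the Parameterized
# Trigger plugin can read and pass to the downstream job. GIT_COMMIT is
# set by the Jenkins Git plugin; "unknown" is a fallback for running
# this outside Jenkins.
echo "GIT_REVISION=${GIT_COMMIT:-unknown}" > trigger.properties
```

The downstream job then receives GIT_REVISION as a build parameter and can check out exactly that revision.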
The details would vary for a Git repo (see https://stackoverflow.com/a/13117975/466874), but for our SVN-based jobs, what we do is have the build job (re)create an SVN tag (with a static name like "LatestSuccessfulBuild") at successful completion, and then we configure the deployment jobs to use that tag as their repo URL rather than the trunk location. This ensures that deployments are always of whatever revision was successfully built by the build job (meaning all unit tests passed, etc.) rather than allowing newer trunk commits to sneak into the deployment build.
