Jenkins Exclude Regions with Git needs workspace (triggers a build unnecessarily)

We have a git repo that has the following directory structure:
src/
test/
terraform/
Whenever someone does a commit to the repo, git sends a message to Jenkins to trigger a build. Perfect!
Except we do not want to trigger a build if someone does a commit to the terraform directory, so we've added it as an "Excluded Region" in the job's Git configuration.
Everything works great, except we have dynamic executors, and when there are no executors, a build gets triggered even if someone does a commit to the terraform directory.
Basically, the Git plugin needs to trigger a build just to get a workspace to poll against. Kinda defeats the whole purpose:
Scheduling a new build to get a workspace. (nonexisting_workspace)
This bug is reported as "functions as designed":
https://issues.jenkins-ci.org/browse/JENKINS-18079
Is there any way to use "Excluded Region" in Jenkins if we are using dynamic executors?
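For reference, the same Git plugin setting is exposed to Pipeline jobs through the PathRestriction checkout extension. A minimal sketch of what the equivalent checkout step might look like, with a made-up repository URL, credentials ID, and branch; note that this only influences the polling decision, so it is subject to the same workspace requirement described above:

    // Pipeline checkout with the Git plugin's "Excluded Regions" behaviour.
    // Commits that only touch terraform/ should not cause polling to schedule a build,
    // but polling still needs a workspace to compare against.
    checkout([$class: 'GitSCM',
        branches: [[name: '*/master']],
        userRemoteConfigs: [[url: 'git@example.com:org/repo.git', credentialsId: 'repo-creds']],
        extensions: [[$class: 'PathRestriction',
                      includedRegions: '',
                      excludedRegions: 'terraform/.*']]
    ])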

Related

Trigger a Jenkins build based on an update to a specific folder inside a git repository branch

I am trying to find a solution for triggering a Jenkins build when a specific folder inside a branch of my git repository is updated. My repository's root directory has a folder called "changers"; if I change anything inside "changers", I need the build to trigger.
Note that I am using a multibranch pipeline.
I have also tried the "changeset" option in the pipeline's Groovy code, but it didn't work for me.
Thanks
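For context, the "changeset" option mentioned above is normally written as a when condition in a declarative Jenkinsfile. A minimal sketch, with the stage name made up and the folder name taken from the question; note that changeset is evaluated against the build's changelog, so the very first build of a branch (which has no changelog) will not match:

    pipeline {
        agent any
        stages {
            stage('Build') {
                when {
                    // Run this stage only when the commits being built touch the changers/ folder
                    changeset 'changers/**'
                }
                steps {
                    echo 'changers/ was modified - running the build steps'
                }
            }
        }
    }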

I have a GitLab repository with many projects under one repository. Now I want to integrate GitLab with Jenkins

I have a GitLab repository, and under that one repository I have many projects. Now I want to integrate GitLab with Jenkins, so that normally, whenever a commit happens in the git repo, the solution file in the repository gets built.
But in my case there are many solution files in the same repository, so I want to build only the solution file that is related to the committed file.
Is there any way, using a webhook (or any other way), to know which folder of the repository was modified, so that I can run the related solution file based on the path?
Folder structure:
Project1/project1.sln and dependent files
Project2/project2.sln and dependent files
Thanks in advance, and please also guide me on the GitLab webhook configuration to integrate with Jenkins.
GitLab webhooks can be created in the GitLab web interface. The modified files are listed in the body of the push message.
Solution with the Git Plugin
Create a Freestyle Job for one solution file e.g. Project1
Go to Build Triggers
Select Poll SCM.
Leave Schedule empty
Go to Source Code Management
Select Git
Configure Repository URL and Credentials
Enter Branch e.g. refs/heads/master
Select Repository Browser (AUTO)
Under Additional Behaviours, press the Add button at the bottom
Select Polling ignores commits in certain paths
Fill Included Regions with a pattern for the project folder, e.g. ^Project1/.*. That means it will only trigger the build if changes in that folder or its subfolders are detected.
Go to Post-Build Actions at the bottom of the job configuration and press the Add button
Select Clean Workspace after Build
Under Patterns for files to be deleted, press the Add button
Select Exclude
Enter **/.git/*
Select the checkbox Apply pattern also on directories
When GitLab sends a push message, Jenkins will trigger SCM polling of this job. Polling will check whether there are changes in the related folder, and the build will start if that is the case.
You can check the git polling log at /job/<job name>/scmPollLog/
Solution with the GitLab Plugin
The Jenkins GitLab Plugin would also be a solution.

How to run a Jenkins pipeline automatically when "git push" happens for a specific folder in Bitbucket

I have started using Jenkins recently, and there is only one scenario where I am stuck. I need to run a Jenkins pipeline automatically when a git push happens for a specific folder in the master branch. The pipeline should run only if something is added to that specific folder.
I have already tried SCM with a sparse checkout path and mentioned my folder, but that's not working.
I am using a GUI freestyle project; I don't know Groovy.
I had the same issue, and I resolved it by configuring Git polling.
I used Poll SCM to trigger the build, together with the Jenkins Git plugin's additional behaviour "Polling ignores commits in certain paths" > "Included Regions": my_specific_folder/.*
By the way, using a sparse checkout path only makes Jenkins check out the folder you mentioned; it does not by itself control what triggers the build.
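For completeness, in a Pipeline job the sparse checkout mentioned above is expressed as a checkout extension. A minimal sketch with a placeholder repository URL and branch; as noted, this only limits what lands in the workspace and does not by itself decide which commits trigger the job:

    // Sparse checkout of only my_specific_folder/ - the triggering decision is still
    // controlled by polling and the Included Regions setting, not by this extension.
    checkout([$class: 'GitSCM',
        branches: [[name: '*/master']],
        userRemoteConfigs: [[url: 'git@example.com:org/repo.git']],
        extensions: [[$class: 'SparseCheckoutPaths',
                      sparseCheckoutPaths: [[$class: 'SparseCheckoutPath', path: 'my_specific_folder/']]]]
    ])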

Get Git commit from "upstream" build in manually triggered Jenkins job

I have a Build job in Jenkins that checks out a specific Git commit and packages it for deployment as artifacts.
There is a later Deployment job that takes the built artifacts and actually deploys the code. It also does a sparse Git checkout of a specific directory containing deployment scripts. After successful completion, we write a Git tag.
The problem is that the tag is being written to the HEAD of master, not to the hash of the commit used for the original upstream build. (master is the branch defined in the job configuration.)
Is there a way to get the upstream SCM information if it's not passed directly through a parameterized trigger? I can see commits listed in the build.xml file that Jenkins generates in the build directory; is there a way to read this information from the downstream job?
I realize that it's not really "downstream", since it's manually triggered. We do have a selector that defines UPSTREAM_BUILD and UPSTREAM_PROJECT, though.
If you are using the Copy Artifact plugin, you could write a file with the commit hash during the Build job and read it back in during the Deployment job:
# Build
echo ${GIT_COMMIT} > COMMIT_HASH
# Deployment, after copying COMMIT_HASH into the workspace
git checkout $(cat COMMIT_HASH)
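If the two jobs were written as Pipelines instead, the same hand-off might look roughly like the sketch below. It assumes UPSTREAM_PROJECT and UPSTREAM_BUILD arrive as build parameters from the build selector mentioned in the question; the copyArtifacts step comes with the Copy Artifact plugin:

    // Build job: record and archive the exact commit that was built
    sh 'echo "${GIT_COMMIT}" > COMMIT_HASH'
    archiveArtifacts artifacts: 'COMMIT_HASH'

    // Deployment job: copy the file back from the selected upstream build, then check out that commit
    copyArtifacts projectName: params.UPSTREAM_PROJECT, selector: specific(params.UPSTREAM_BUILD)
    sh 'git checkout "$(cat COMMIT_HASH)"'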

How to ensure same git checkout for build and deploy jobs in Jenkins?

In Jenkins, I have a "Build" job setup to poll my git repo and automatically build on change. Then, I have separate "Deploy to DEV", "Deploy to QA", etc. jobs that will call an Ant build that deploys appropriately. Currently, this configuration works great.
However, this process favors deploying the latest build on the latest development branch. I use the Copy Artifact plugin to allow the user to choose which build to deploy. Also, the Ant scripts for build/deploy are part of the repo and are subject to change. This means it's possible the artifact could be incompatible between versions. So, it's ideal that I ensure that the build and deploy jobs are run using the same git checkout.
Is there an easier way? It ought to be possible for the Deploy job to obtain the git hash used by the selected build and check out that same revision. However, I don't see any options or plugins that do this.
Any ideas on how to simplify this configuration?
You can use the Parameterized Trigger Plugin to do this for you. The straightforward way is to prepare a file with parameters as a build step and pass these parameters to the downstream job using the plugin. For example, you can pass the git revision as a parameter, among other settings.
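For Pipeline jobs, the same hand-off is usually done with the build step and an explicit parameter rather than a properties file; a rough sketch under made-up job, parameter, and repository names:

    // Upstream pipeline: pass the exact revision that was just built to the deploy job
    build job: 'deploy-to-dev', parameters: [
        string(name: 'GIT_REVISION', value: env.GIT_COMMIT)
    ]

    // Downstream pipeline: check out exactly that revision instead of the branch head
    checkout([$class: 'GitSCM',
        branches: [[name: params.GIT_REVISION]],
        userRemoteConfigs: [[url: 'git@example.com:org/repo.git']]
    ])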
The details would vary for a Git repo (see https://stackoverflow.com/a/13117975/466874), but for our SVN-based jobs, what we do is have the build job (re)create an SVN tag (with a static name like "LatestSuccessfulBuild") at successful completion, and then we configure the deployment jobs to use that tag as their repo URL rather than the trunk location. This ensures that deployments are always of whatever revision was successfully built by the build job (meaning all unit tests passed, etc.) rather than allowing newer trunk commits to sneak into the deployment build.
