Jenkins plugin for triggering a build whenever any file changes in a given directory

I am looking for functionality where we have a directory with some files in it.
Whenever anyone makes a change to any of the files in the directory, Jenkins should trigger a build.
Is there any plugin or method for this functionality? Please advise.
Thanks in advance.

I have not tried it myself, but the FSTrigger plugin seems to do what you want:
FSTrigger provides polling mechanisms to monitor a file system and
trigger a build if a file or a set of files have changed.
If you can monitor the directory with a script, you can trigger the build with an HTTP GET, for example with wget or curl:
wget -O- $JENKINS_URL/job/JOBNAME/build
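If you go the script route, a minimal watcher sketch using inotifywait from inotify-tools might look like this (the directory, JOBNAME, and TOKEN are placeholders; depending on your Jenkins security settings you may need to authenticate the request or use POST with a build token, as shown here):
#!/bin/sh
WATCH_DIR="/path/to/watched/dir"
# Block until something under the directory is written, created, or deleted,
# then fire the Jenkins job and go back to waiting.
while inotifywait -r -e close_write,create,delete "$WATCH_DIR"; do
    curl -X POST "$JENKINS_URL/job/JOBNAME/build?token=TOKEN"
done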

Although only slightly related: this question was about monitoring static files on the system, but there are version control systems designed for just this purpose.
I answered this in another post; the following works if you're using git to track changes to the files themselves:
#!/bin/bash
set -e
job_name="whatever"
JOB_URL="http://myserver:8080/job/${job_name}/"
FILTER_PATH="path/to/folder/to/monitor"
# Print every path touched by the changesets of the given build
python_func="import json, sys
obj = json.loads(sys.stdin.read())
for item in obj['changeSet']['items']:
    for path in item['affectedPaths']:
        print(path)
"
# BUILD_NUMBER is provided by Jenkins in the job's environment
_affected_files=$(curl --silent "${JOB_URL}${BUILD_NUMBER}/api/json" | python -c "$python_func")
if ! echo "$_affected_files" | grep -q "${FILTER_PATH}"; then
    echo "[INFO] no changes detected in ${FILTER_PATH}"
    exit 0
else
    echo "[INFO] changed files detected: "
    echo "$_affected_files" | grep "${FILTER_PATH}" | while read -r a_file; do
        echo "    $a_file"
    done
fi
You can add the check directly to the top of the job's Execute Shell step, and it will exit 0 if no changes are detected. Hence, you can always poll the top level of the repo for check-ins to trigger a build, and only complete a build if the files in question changed.

Related

How can I create a file via an Xcode run script and add it to my project?

I want to display my git version in my app. From some research, it looks like I can accomplish this via a run script.
The code below will create a file at Resources/GitVersion.swift:
version=$(git rev-parse --verify HEAD | cut -c 1-10)
commitDate=$(git log -n 1 HEAD --pretty=format:"%h - %cd" | cut -c 12-)
filesource="//\n// GitVersion.swift\n//\n// Commit Date:$commitDate\n//\n\nlet gitVersion = \"$version\"\n"
cd ${SOURCE_ROOT}/${PROJECT_NAME}
echo "$filesource" > Resources/GitVersion.swift
touch Resources/GitVersion.swift
The file will contain something like let gitVersion = "XXX", and will be updated each time the build runs.
Great, except GitVersion.swift isn't in my project, so I can't reference gitVersion anywhere to access the git hash.
How can I add GitVersion.swift to the project via a run script, such that every time I run my build, it creates the file and dynamically adds the dependency to the project?
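(As an aside: whether echo expands those \n escapes depends on which shell Xcode runs the phase with; a printf-based variant is more portable. A sketch of the same step, with the path taken from the question and the redundant touch dropped:)
version=$(git rev-parse --verify HEAD | cut -c 1-10)
commitDate=$(git log -n 1 HEAD --pretty=format:"%h - %cd" | cut -c 12-)
cd "${SOURCE_ROOT}/${PROJECT_NAME}"
# printf expands \n escapes in its format string in any POSIX shell
printf '//\n// GitVersion.swift\n//\n// Commit Date: %s\n//\n\nlet gitVersion = "%s"\n' "$commitDate" "$version" > Resources/GitVersion.swift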

Trigger specific job on push to specific directory

We have 12 different projects inside the same repository and have a different job to run for each of these.
I want to know how I can trigger a job only when a change has happened in a specific folder, since running all 12 on every push takes too long to finish.
Well, I have hacked together a solution that works for us.
First, add an Execute Shell build step:
#!/bin/bash
export DIRS="api objects"
DIFF=$(git diff --name-only develop)
echo "export RUN_TEST=0" > "$WORKSPACE/RUN_TEST"
for DIR in $DIRS; do
    for LINE in $DIFF; do
        # Is this file inside an interesting directory?
        if echo "$LINE" | grep -q -e "^$DIR/"; then
            echo "export RUN_TEST=1" > "$WORKSPACE/RUN_TEST"
        fi
    done
done
Here:
api and objects are the two directories I want this job to be triggered for
develop is the main branch we use, so I want to know how my directories compare to that branch in particular
The file $WORKSPACE/RUN_TEST records in a variable whether or not the tests should run
Then, at the top of each time-consuming build step, add:
#!/bin/sh
. "$WORKSPACE/RUN_TEST"
if [ "$RUN_TEST" -eq 1 ]; then
    : # Time-consuming code here
fi
That way the job is still triggered on every push, but finishes almost immediately when nothing relevant has changed.
Now I modified it to:
#!/bin/bash
export DIRS="api objects"
DIFF=$(git diff --name-only origin/develop)
RUN_TEST=111
for DIR in $DIRS; do
    for LINE in $DIFF; do
        # Is this file inside an interesting directory?
        if echo "$LINE" | grep -q -e "^$DIR/"; then
            RUN_TEST=0
        fi
    done
done
echo "RUN_TEST=$RUN_TEST"
echo "return $RUN_TEST" > "$WORKSPACE/RUN_TEST"
exit $RUN_TEST
And I set the "Exit code to set build unstable" option to 111 on all build steps. Then, in all following build steps I did:
#!/bin/bash
# Exit on any error, undefined variable, or failure inside a pipeline
set -euo pipefail
# RUN_TEST contains "return 0" or "return 111"; with set -e, sourcing a
# non-zero return aborts this step with status 111 (marking the build unstable)
. "$WORKSPACE/RUN_TEST"
# Rest of build step
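As a side note, the detection step can be shortened by leaning on git's own exit status instead of looping over the diff. This is only a sketch assuming the same two directories and the 111 convention from above (git diff --quiet exits non-zero when there are differences under the given paths):
#!/bin/bash
# Exit 111 (skip) if nothing under api/ or objects/ differs from origin/develop
if git diff --quiet origin/develop -- api objects; then
    echo "return 111" > "$WORKSPACE/RUN_TEST"
    exit 111
else
    echo "return 0" > "$WORKSPACE/RUN_TEST"
fi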

Create symlinks instead of copies with maven-dependency-plugin : copy-dependencies

I work on a Maven project that needs to copy more than 10 GB of artifacts into a target directory from the local Maven repository (after downloading them).
In some cases (e.g. for tests), I'd like to replace this copy with symlink creation in order to save a few minutes.
My question is: is there a way to ask the maven-dependency-plugin goal copy-dependencies to create symlinks, or is there any other Maven plugin that can do it?
The copy-dependencies goal cannot, to my knowledge, do this out of the box. However, you can use a shell script:
#!/bin/sh
outputDir=target/dependency
mkdir -p "$outputDir"
# List resolved dependencies as G:A:P[:C]:V:S coordinates, compile/runtime scope only
mvn dependency:resolve |
grep ':\(compile\|runtime\)' | sed 's/\[INFO\] *//' |
while read gav
do
    case "$gav" in
    *:*:*:*:*:*) # G:A:P:C:V:S (with classifier)
        g="${gav%%:*}"; remain="${gav#*:}"
        a="${remain%%:*}"; remain="${remain#*:}"
        p="${remain%%:*}"; remain="${remain#*:}"
        c="${remain%%:*}"; remain="${remain#*:}"
        v="${remain%%:*}"
        s="${remain#*:}"
        ;;
    *:*:*:*:*) # G:A:P:V:S (no classifier)
        g="${gav%%:*}"; remain="${gav#*:}"
        a="${remain%%:*}"; remain="${remain#*:}"
        p="${remain%%:*}"; remain="${remain#*:}"
        c=""
        v="${remain%%:*}"
        s="${remain#*:}"
        ;;
    esac
    # groupId dots become directory separators in the local repository layout
    g=$(echo "$g" | sed 's/\./\//g')
    test -n "$c" && artName="$a-$v-$c" || artName="$a-$v"
    ln -s "$HOME/.m2/repository/$g/$a/$v/$artName.$p" "$outputDir"
done
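For reference, the parsing above expects dependency:resolve to print coordinate lines of roughly this shape (the artifact names and versions here are only illustrative):
[INFO]    org.apache.commons:commons-lang3:jar:3.12.0:compile
[INFO]    com.example:widgets:jar:linux-x86_64:1.0.0:runtime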

In Jenkins, how can I see the last time a build ran from its XML file?

I have more than 1000 jobs in Jenkins, and I would like to go through all of them in order to clean out unused jobs.
What is the recommended way to do so?
I guess every job's XML file holds an indication of when it last ran.
Can anyone point me to where this file is located?
I ended up filtering the jobs with the "View Job Filters" plugin.
You can use its "Filter by Build Trend" option as follows:
Create a view for "All jobs" -> go to edit view -> in "Add Job Filter" choose "Build Trend Filter" -> choose the filter you desire.
That is what I did.
I don't think you can do this in one step. But you can do this in 2 steps.
Find the URLs of all jobs with this:
https://jenkins-server/api/json?tree=jobs[url]
Get more info about each job by using the URLs returned from step 1:
url-from-step1/api/json
This will give you the health report, last failed/successful build, etc. If you need more info about those builds, you can make a new request with:
url-from-step1/last-build-number/api/json
I recommend using JSON, and jq (http://stedolan.github.io/jq/, https://jqplay.org/) to parse it.
Happy coding!
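If you prefer a single script, here is a minimal sketch of the two steps combined with curl and jq (JENKINS_URL is a placeholder; add authentication options to curl if your instance requires them):
#!/bin/bash
JENKINS_URL="https://jenkins-server"
# Step 1: list the URL of every job
curl --silent "$JENKINS_URL/api/json?tree=jobs[url]" | jq -r '.jobs[].url' |
while read -r job_url; do
    # Step 2: fetch the last build's timestamp (milliseconds since the epoch)
    ts=$(curl --silent "${job_url}lastBuild/api/json?tree=timestamp" | jq -r '.timestamp // empty')
    echo "${job_url} ${ts:-no builds}"
done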
You can leverage the REST API. The following urls might be relevant for you:
https://ci.jenkins-ci.org/api/xml?tree=jobs[name] -- to get a list of jobs
https://ci.jenkins-ci.org/job/{jobName}/lastBuild/buildTimestamp?format=yyyy-MM-dd-HH-mm-ss -- to get the time of last build of job {jobName}
Feel free to change xml to json/python...
I can provide the following shell script as a rough example:
#!/bin/bash
jenkinsUrlBase='https://ci.jenkins-ci.org'
callJenkins() {
    curl --silent --show-error -g "$jenkinsUrlBase${1}"
}
callJenkins '/api/xml?tree=jobs[name]' | xmlstarlet sel -t -v '//hudson/job/name' | while read projectName ; do
    timestamp=$(callJenkins "/job/${projectName}/lastBuild/buildTimestamp?format=yyyy-MM-dd-HH-mm-ss")
    echo "Last build of ${projectName}: ${timestamp}"
done
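Example output (job names and times here are illustrative):
Last build of plugin-compat-tester: 2015-08-13-11-48-25
Last build of jenkins_main_trunk: 2015-08-14-09-02-11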
You can exploit the directory and file structure in ${JENKINS_HOME}:
cd "${JENKINS_HOME}/jobs/${JOB_NAME}/builds"
# newest entries first; skip ls's "total" line and print the name column
ls -lt | head -2 | tail -1 | awk '{print $9}'
Example output:
2015-08-13_11-48-25
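To sweep all jobs at once, the same trick can be looped. A rough sketch (it assumes the timestamp-named build directories shown above, which newer Jenkins versions replaced with plain build numbers):
#!/bin/bash
cd "${JENKINS_HOME}/jobs"
for job in */; do
    # Newest timestamp-named build directory first; empty if the job never ran
    last=$(ls -t "${job}builds" 2>/dev/null | grep -E '^[0-9]{4}-' | head -1)
    echo "${job%/}: ${last:-never built}"
done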

Watch a web page for changes

I googled and couldn't find any code that would compare a web page to a previous version.
In this case the page I'm trying to watch is link text. There are services that can watch a page, but I'd like to set this up on my own server.
I've set this up as a wiki so anyone can add to the code. Here's my idea:
Check if a previous version of the file exists. If not, download the page.
If the page exists, diff it against the previous version to find differences, and email the new content along with the dates of the new and old versions.
This script would be called nightly via cron or on-demand via the browser (the latter is not a priority).
Sounds simple, maybe I'm just not looking in the right place.
Perhaps a simple sh-script like this, featuring wget, diff & test?
#!/bin/sh
WWWURI="http://foo.bar/testfile.html"
LOCALCOPY="testfile.html"
TMPFILE="tmpfile"
WEBFILE="changed.html"
MAILADDRESS="$(whoami)"
SUBJECT_NEWFILE="$LOCALCOPY is new"
BODY_NEWFILE="first version of $LOCALCOPY loaded"
SUBJECT_CHANGEDFILE="$LOCALCOPY updated"
SUBJECT_NOTCHANGED="$LOCALCOPY not updated"
BODY_CHANGEDFILE="new version of $LOCALCOPY"
# test for old file
if [ -e "$LOCALCOPY" ]
then
    mv "$LOCALCOPY" "$LOCALCOPY.bak"
    wget "$WWWURI" -O "$LOCALCOPY" -o /dev/null
    diff "$LOCALCOPY" "$LOCALCOPY.bak" > "$TMPFILE"
    # test for update
    if [ -s "$TMPFILE" ]
    then
        echo "$SUBJECT_CHANGEDFILE"
        ( echo "$BODY_CHANGEDFILE" ; cat "$TMPFILE" ) | tee "$WEBFILE" | mail -s "$SUBJECT_CHANGEDFILE" "$MAILADDRESS"
    else
        echo "$SUBJECT_NOTCHANGED"
    fi
else
    wget "$WWWURI" -O "$LOCALCOPY" -o /dev/null
    echo "$BODY_NEWFILE"
    echo "$BODY_NEWFILE" | tee "$WEBFILE" | mail -s "$SUBJECT_NEWFILE" "$MAILADDRESS"
fi
[ -e "$TMPFILE" ] && rm "$TMPFILE"
Update: piped through tee, fixed a little spelling, and added removal of $TMPFILE.
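If you only need to know that the page changed, not what changed, a checksum comparison is even simpler. A sketch along the same lines (the URL and state file name are placeholders):
#!/bin/sh
URL="http://foo.bar/testfile.html"
STATEFILE="testfile.sha"
NEW=$(wget -q -O- "$URL" | sha1sum | cut -d' ' -f1)
OLD=$(cat "$STATEFILE" 2>/dev/null)
if [ "$NEW" != "$OLD" ]; then
    echo "$NEW" > "$STATEFILE"
    echo "$URL changed" | mail -s "$URL changed" "$(whoami)"
fi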
You can check this SO posting to get a few ideas, and also information about the challenge of detecting "true" changes to a web page (with fluctuating advertisement blocks and other "noise").
