Use Jenkins to compare files in two nodes

I wonder whether Jenkins has a feature to capture results/data on a node and persist them on the master.
My scenario: I need to check some folders on two machines to see whether they contain the same number of files and the same sizes.
If Jenkins can save a result such as the output of "ls -ltR" on the master, then I can gather the listings from both nodes in two jobs and compare them.
Is there an elegant solution to this simple problem?
Currently I can connect the two machines to each other via SSH and solve it that way, but that connection is not always available.
(With SSH I believe the best way is to use rsync -an /path/to/ hostB:/path/to/)

Simple problem, only a slightly elegant solution:
Write a simple job listdir whose build step runs DIR > C:\logs\list1.txt
Go to Post-build Actions
Add "Archive the artifacts", for the example above: C:\logs\*.*
Now run a build and go to http://jenkinsservername:8080/job/listdir/
You'll see list1.txt, which you can click on to view its contents.
I have given a Windows example; you can of course replace DIR with ls -ltR.
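The same idea in Pipeline form, as a minimal sketch (the node label, source path and file name are placeholders, not from the original answer):
node('windows-node') {
    // write the directory listing into the workspace so it can be archived
    bat 'dir /S C:\\data > list1.txt'       // on Linux: sh 'ls -ltR /data > list1.txt'
    // archiveArtifacts stores the listing on the master alongside the build
    archiveArtifacts artifacts: 'list1.txt'
}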

Or use archived artifacts in combination with the Copy Artifacts Plugin to pull the result of the other job into the job where the comparison is done.
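For example, the comparison job could look roughly like this (a hedged sketch: copyArtifacts is the pipeline step of the Copy Artifact Plugin; the job names and file name are placeholders):
node {
    // pull the archived listings from the two listing jobs into separate sub-directories
    copyArtifacts projectName: 'listdir-nodeA', filter: 'list1.txt', target: 'nodeA'
    copyArtifacts projectName: 'listdir-nodeB', filter: 'list1.txt', target: 'nodeB'
    // diff exits non-zero when the listings differ, which fails the build
    sh 'diff nodeA/list1.txt nodeB/list1.txt'
}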

Related

Jenkins pipeline - how to read a file from outside the workspace

I have a script that should run on both Linux and Windows agents.
This script reads a config file sitting on a network drive.
It gets worse - we have two different Jenkins masters: one in Docker on Ubuntu, and one on Windows. They run different jobs, but with the same script.
So now:
Using script.readFile is out of the question because the file is outside the workspace.
Using Groovy's new File(path).text is also problematic because the path (the mount) is different on Windows and Linux (the Jenkins masters).
There is a shared environment variable across all machines that points to the right mount, but with Groovy's File, "${SOME_ENV_VAR}/file" doesn't work - the env var isn't expanded.
Is there a way to use a Jenkins pipeline to read a file outside the workspace? That would be the best solution.
Or some other solution you can think of?
Thanks
Using script.readFile is out of the question because the file is outside the workspace.
Not really. Assuming you are referring to the Jenkins step readFile, you can still use it. It just takes a whole lot of dots:
def config = readFile "../../../../mnt/config/my_config.txt"
You'd have to figure out the exact number of dots yourself.
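If the shared environment variable is the key to the right mount, a hedged alternative (a sketch, not tested against your setup; SOME_ENV_VAR and the file name are taken from the question) is to expand the variable explicitly instead of relying on "${SOME_ENV_VAR}" inside a Groovy string:
// env.SOME_ENV_VAR works inside a pipeline script; System.getenv() works in plain Groovy
def mount = env.SOME_ENV_VAR ?: System.getenv('SOME_ENV_VAR')
// note: new File(...) is evaluated on the master, not on the agent running the stage
def config = new File("${mount}/my_config.txt").text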

How to delete a build from Jenkins job workspace

I wonder if it is possible to remove only one build (including artifacts) from a job's workspace.
I tried "Delete Build" in Build History, but all it does is remove the build reference from the Build History table. I know I can SSH to the server and delete the files from the command line, but I am looking for a way to do it from the Jenkins web interface.
After installing the Workspace Cleanup Plugin I am able to wipe out the current workspace, but I want to keep the other builds in the workspace.
To have a folder per build in your Jenkins instance, set the "Use custom workspace" flag in your job's settings. Here is a brief help text from the setting's description:
For each job on Jenkins, Jenkins allocates a unique "workspace directory."
This is the directory where the code is checked out and builds happen.
Normally you should let Jenkins allocate and clean up workspace directories,
but in several situations this is problematic, and in such case, this option
lets you specify the workspace location manually.
One such situation is where paths are hard-coded and the code needs to be
built on a specific location. While there's no doubt that such a build is
not ideal, this option allows you to get going in such a situation.
...
And your custom directory path would look like this:
workspace\$JOB_NAME\$BUILD_NUMBER ~> workspace\my-job-name\123
where $JOB_NAME will be "my-job-name" and $BUILD_NUMBER is the build number, e.g. "123".
There is one nasty problem with this approach, and it is why I wouldn't recommend using it: Jenkins will not be able to reclaim disk space for outdated builds. You would have to handle cleanup of outdated builds manually, and that is a lot of hassle.
An alternative approach, which gives you more control and tooling and keeps disk space usage in check without your supervision, is to use the default workspace settings and archive your build output (files, original source code, libraries, etc.) as a post-build action. Very handy, and it gives you access to a whole bunch of great tools in other jobs, such as the Copy Artifact Plugin or the ArtifactDeployer Plugin.
Hope that info helps you make a decision that fits your needs best.
I also use "General/Advanced/Use custom workspace" (as in #pabloduo's answer) on a Windows machine with something like:
C:\${JOB_NAME}\${BUILD_NUMBER}
Just wanted to add a solution for getting rid of the build jobs' workspaces.
I use the Groovy Events Listener Plugin for this.
Using the plug-in's standard configuration, I just use the following Groovy script:
// the listener fires for every Jenkins event; only act when a job is deleted
if (event == Event.JOB_DELETED) {
    // remove the deleted job's workspace directory from disk
    new File(env.WORKSPACE).deleteDir()
}
And now the custom workspace is deleted when the build job is deleted.
Just be aware that this would also delete non-custom workspaces (because the event is triggered for all jobs on your Jenkins server).

Run single PS script from Release definition without pulling down entire project

In my release definition, I want to run a single PS script which lives in source control (TFVC in my case). I don't see a way to do this without TFS pulling down the entire source tree containing that one script onto the agent machine. I currently have an unversioned copy of the script on the agent machine and reference its absolute path from the release definition. This works, but I'm not guaranteed that the latest version of the script runs at release time.
You have at least two ways to do it:
define a mapping that picks only what you need - you can define a mapping down to a single file, e.g. cloak $/ and map $/path_to_my_file
use a dummy build that collects the files you need and saves them as artifacts; I explained this technique in http://blog.casavian.eu/2017/03/04/mixing-tfvc-and-git/

How can I use Jenkins to detect the presence of a file on an SFTP server?

I want to use Jenkins to monitor an SFTP site, and fire off a build when a given file type appears on that site.
How can I do this?
OK, in that case you should have two jobs.
First job - running every N minutes, with a bash script as a build step:
wget ftp://login:password@ftp.example.org/file.txt
Then use the https://wiki.jenkins-ci.org/display/JENKINS/Run+Condition+Plugin , which runs a step on the condition that the file "file.txt" (downloaded, or not, in the previous step) exists.
After that you can trigger your next job if the file exists (or do anything else).
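A minimal sketch of that polling build step (the URL and credentials are the placeholders from above; the exit status is what the Run Condition Plugin or a downstream trigger can key off):
#!/bin/bash
# try to fetch the file; wget returns non-zero if it is not there
if wget -q "ftp://login:password@ftp.example.org/file.txt" -O file.txt; then
    echo "file.txt is present - the next job can be triggered"
else
    echo "file.txt not found yet"
    exit 1
fi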
Like in the previous answer, I would use two jobs, but instead of bash scripts I would use Python and SFTP, since it makes dealing with SSH a bit easier.
https://pypi.python.org/pypi/pysftp
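A minimal pysftp sketch of the check (the host, credentials and remote path are placeholders, not from the answer):
# poll the SFTP site and exit non-zero when the file is missing
import pysftp

with pysftp.Connection('sftp.example.org', username='login', password='password') as sftp:
    if sftp.exists('/upload/file.txt'):
        print('file found - let the downstream job run')
    else:
        raise SystemExit(1)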

rsync alternative to Jenkins Copy Artifacts plugin?

I'm working on a set of builds related to our online images (such as wordpress content). Overall it's a large workflow, so it's broken into several jobs.
A couple of jobs need to copy a large number of artifacts from other jobs; I've been using the Copy Artifacts plugin, but it's too slow for my case, and rsync would be much better suited.
Is it possible to effectively get the source artifact directory for the upstream build so that I can pass it to rsync in place of using the Copy Artifacts plugin? I'd like to have a simple script like:
rsync -a --delete $UPSTREAM_ARTIFACT_DIR $WORKSPACE
The upstream artifacts are accessible via what appear to be well-defined URLs. For example,
the following URL lets you access the last successful build's artifacts:
http://jenkins/job/job_name/lastSuccessfulBuild/artifact/
You can even specify the axes on a multi-configuration project, if that's required:
http://jenkins/job/job_name/label=foo,arch=x86/lastSuccessfulBuild/artifact/
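Building on those URLs, a hedged sketch that avoids the Copy Artifacts plugin (assuming your Jenkins exposes the all-artifacts zip endpoint; the job name and server are the placeholders from above):
# download every archived artifact of the last successful build as one zip, then unpack into the workspace
wget -q "http://jenkins/job/job_name/lastSuccessfulBuild/artifact/*zip*/archive.zip"
unzip -o archive.zip -d "$WORKSPACE"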
