How can I use Jenkins to detect the presence of a file on an SFTP server? - jenkins

I want to use Jenkins to monitor an SFTP site, and fire off a build when a given file type appears on that site.
How can I do this?

OK, in that case you should have two jobs.
First job - running every N minutes with a shell script as a build step:
`wget ftp://login:password@ftp.example.org/file.txt`
Then you should use the https://wiki.jenkins-ci.org/display/JENKINS/Run+Condition+Plugin , which runs a build step conditionally, here depending on whether "file.txt" exists (i.e. was downloaded or not).
After that you can trigger your next job if the file exists (or do anything else).

Like the previous answer, I would use two jobs, but instead of bash scripts I would use Python and SFTP, since that makes dealing with SSH a bit easier.
https://pypi.python.org/pypi/pysftp
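A minimal sketch of the pysftp approach - the host, credentials, directory, and `*.csv` pattern are all placeholders, and the exit-code convention at the end is just one way to feed the result into a conditional build step:

```python
import fnmatch


def find_matches(names, pattern):
    """Return the entries in a directory listing that match a glob pattern."""
    return [n for n in names if fnmatch.fnmatch(n, pattern)]


def remote_file_present(host, username, password, directory, pattern):
    """Poll an SFTP directory and report whether any file matches the pattern."""
    import pysftp  # third-party: pip install pysftp

    with pysftp.Connection(host, username=username, password=password) as sftp:
        return bool(find_matches(sftp.listdir(directory), pattern))


# Example usage in the polling job (placeholder values):
#   import sys
#   found = remote_file_present("sftp.example.org", "login", "secret",
#                               "/incoming", "*.csv")
#   sys.exit(0 if found else 1)  # exit code drives the downstream trigger
```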

Related

How send folder as attachments in Jenkins Job Email Notification

I have a Jenkins job with workspace C:\hello_world\test_output*
The test output folder contains two things: one folder and one HTML file. I want to send the test output folder as a zip file attachment from the Jenkins job, but I haven't been able to do it. Please help.
Think of it as two steps: 1) zipping the files; and 2) sending the attachment.
I've done this by installing 7zip, then running the command:
"C:\Program Files\7-Zip\7z.exe" a -r C:\hello_world\test_output.zip C:\job\test_output\* -mem=AES256
With the https://plugins.jenkins.io/email-ext plugin installed, there's a lot of flexibility, including the ability to send attachments.
Bear in mind that some mail hosts, such as Gmail, have started blocking things like executables, even when they are found within zip files. If you've got users on such a host, you might run into trouble through no fault of your own.
Apart from that, depending on the OS Jenkins is running on, you could add a post-build "Execute shell" or "Execute Windows batch command" step that calls the zip tool of your choice, and then send an e-mail with attachments using the email-ext plugin, for example.

How to use a GitLab link for applying jenkins.yml file for the concept of Jenkins Configuration as Code

I have a local instance of Jenkins. I have previously tried storing the jenkins.yml in my system and giving its path on http://localhost:8080/configuration-as-code. This worked but I want to use a Gitlab repository to store the jenkins.yml file.
I have already tried giving the GitLab link of my jenkins.yml in the path/URL textbox. Some weird things happened, like:
1. Jenkins broke with a huge error console
2. It reapplied the previous configuration (from the system path)
jenkins:
  systemMessage: "Hello, world"
Your problem as described: you want the job configuration to be kept in Git and, when a build is triggered, the job should get the current state of its configuration from there and then run the build.
Maybe there is a kind of plugin that does this for you, but I am not aware of any. Is anyone?
My suggestion is to define a pipeline job and use a declarative pipeline. It is a file, normally named Jenkinsfile, that can be stored in Git. In the job, you define the Git address, and when you trigger a build, the file is fetched from Git and executed.
There are several flaws in this: the pipeline learning curve is not small, you are confronted with Groovy (not XML!), and your current XML file is barely useful.
Maybe someone will show up and tell us about a (new to me) plugin that solves your problem using the configuration XML file. On the other hand, pipelines are such a beautiful feature that I encourage you to give them a try.
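For illustration, a minimal declarative Jenkinsfile stored in the repository might look like this (the stage name and shell command are placeholders for your real build):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Replace with your real build command
                sh 'echo building'
            }
        }
    }
}
```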

Jenkins - kill process before delete workspace action starts

I have a Jenkins job that runs NUnit tests on a remote machine.
I am using Jenkins's Workspace Cleanup Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Workspace+Cleanup+Plugin) to clean my workspace.
The problem is that I want to taskkill some processes on my machine (because otherwise I could not delete the workspace: some files would be in use and therefore could not be deleted), and I want to do that before the delete action takes place (it is always the first action of the job).
I know there is an option in the plugin, "External Deletion Command", but this runs the command on all the files in the workspace, whereas I need it to run only once (not on the specific workspace files, i.e. only this command: "c:/workspace/taskkill nunit").
Is there a way to do so?
Thanks
If I can suggest a different approach: use an app called LockHunter, which has an API to unlock and delete your workspace. It's much more surgical than killing a random task and hoping it's the one you meant to.
You can trigger it from the command line using "run before SCM", and it'll handle the deletion and unblocking of your specific workspace.
You can also use:
cmd /c wmic /INTERACTIVE:OFF Path win32_process Where "CommandLine Like '%workspace%'" call terminate
Where %workspace% is your current workspace. This will go over all the currently running tasks and check the command-line path, then call terminate on anything it finds.
Good luck!
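The same idea - find the processes whose command line references the workspace and terminate them - can be sketched in Python. The wmic CSV parsing here is deliberately simple and the whole thing is Windows-only; treat it as an assumption-laden sketch, not a drop-in tool:

```python
import subprocess


def pids_to_kill(process_table, workspace):
    """From (pid, command_line) pairs, pick PIDs whose command line
    mentions the workspace path (case-insensitive)."""
    return [pid for pid, cmdline in process_table
            if workspace.lower() in cmdline.lower()]


def kill_workspace_processes(workspace):
    """List processes via wmic, then taskkill any that reference the workspace.
    wmic's /format:csv output is Node,CommandLine,ProcessId (alphabetical)."""
    out = subprocess.check_output(
        ["wmic", "process", "get", "ProcessId,CommandLine", "/format:csv"],
        text=True)
    table = []
    for line in out.splitlines():
        parts = line.split(",")
        if len(parts) >= 3 and parts[-1].strip().isdigit():
            # Re-join middle fields in case the command line contains commas.
            table.append((int(parts[-1]), ",".join(parts[1:-1])))
    for pid in pids_to_kill(table, workspace):
        subprocess.call(["taskkill", "/PID", str(pid), "/F"])
```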

in jenkins, how to copy artifacts from another server?

I have another project from which I need to copy artifacts.
However the problem I have is that it's from another server. Is there a way to do so with the copy artifact or I'll have to go through code?
You can accomplish this by publishing your artifact and using either file transfer or secure shell.
Here is some info to read up on:
Jenkins Secure Shell Plugin
Jenkins FTP Plugin
The only other possibility is to modify the Ant or Maven project config file.
Here is a further reference along the same lines.
I used wget to fetch the file in the end, with fixed paths.
This link can help someone not used to wget:
Using wget to recursively fetch a directory with arbitrary files in it
For a long time I have used this Python script to download artifacts from Jenkins. It takes advantage of the JSON API layer available to any Jenkins job. The format of that API call is:
http://_YOUR_BUILD_HOST_/job/_JOBNAME_/lastSuccessfulBuild/api/json
Beware: the script depends on PyCurl.
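That approach can be sketched with only the standard library (urllib instead of PyCurl); the host and job name are placeholders, and authentication is omitted:

```python
import json
from urllib.request import urlopen


def artifact_urls(build_url, api_json):
    """Build download URLs from a Jenkins build's JSON API payload.
    Jenkins lists artifacts under 'artifacts', each with a 'relativePath'."""
    return [build_url.rstrip("/") + "/artifact/" + a["relativePath"]
            for a in api_json.get("artifacts", [])]


def fetch_last_successful_artifact_urls(base, job):
    """Query lastSuccessfulBuild and return the artifact download URLs."""
    build_url = "%s/job/%s/lastSuccessfulBuild" % (base.rstrip("/"), job)
    # Add an Authorization header here for secured servers.
    with urlopen(build_url + "/api/json") as resp:
        data = json.load(resp)
    return artifact_urls(build_url, data)
```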
The Publish Over SSH plugin can also be used for copying files/artifacts from one server (local/Linux) to another. It has a retries option in case of network issues, and the number of retries and the timeout can be configured.

Use Jenkins to compare files in two nodes

I wonder whether Jenkins has features to capture the result/data on a node and persist it on the master.
I have a scenario where I need to check some folders on two machines to see whether they have the same number of files and the same sizes.
If Jenkins can save a result like that of "ls -ltR" on the master, then I can gather the results on both nodes in two jobs and compare them.
Is there any elegant solution to this simple problem?
Currently I can connect the two machines to each other via SSH and solve the problem, but this connection is not always available.
(With SSH, I believe the best way is to use rsync -an /path/to/ hostB:/path/to/)
Simple problem, only a slightly elegant solution:
Write a simple job listdir which does DIR > C:\logs\list1.txt
Go to Post-build Actions
Add "Archive the artifacts", for the example above: C:\logs\*.*
Now run a build and go to http://jenkinsservername:8080/job/listdir/
You'll see list1.txt, which you can click on to see the contents.
I have given a Windows example; you can of course replace DIR with ls -ltR
Or use archived artifacts in combination with the Copy Artifacts Plugin to pull the result of another job into the job where the comparison is to be done.
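Once both listings (or workspaces) are on the master, the comparison itself can be a small script. A sketch that snapshots a directory as {relative path: size} and diffs two snapshots - the snapshot format is my own choice, not something Jenkins provides:

```python
import os


def snapshot(root):
    """Map each file's path relative to root to its size in bytes."""
    result = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            result[os.path.relpath(full, root)] = os.path.getsize(full)
    return result


def diff_snapshots(a, b):
    """Return (paths only in a, paths only in b, paths with differing sizes)."""
    only_a = sorted(set(a) - set(b))
    only_b = sorted(set(b) - set(a))
    mismatch = sorted(p for p in set(a) & set(b) if a[p] != b[p])
    return only_a, only_b, mismatch


# Example usage: the two jobs archive their snapshots, and a third job runs
#   diff_snapshots(snapshot("nodeA_copy"), snapshot("nodeB_copy"))
# and fails the build if any of the three lists is non-empty.
```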
