Jenkins: 0 files published after build

I have a Jenkins server set up with two jobs:
the first job polls the develop branch and builds the project on the Jenkins server. I then have another job that polls the production branch and builds it on another Jenkins slave, which is the staging server. This job is configured so that, on a successful build, it publishes the artifacts over SSH to the production server.
All the SSH keys are set up and the staging server connects to the production server, but 0 files are transferred:
using GIT_SSH to set credentials Bitbucket Repo
using .gitcredentials to set credentials
Checking out Revision 89874cc01a9f669df69817b1049b1ab98ecb19d3 (origin/Production)
SSH: Connecting from host [nginx-php-fastcgi]
SSH: Connecting with configuration [AmazonAWS] ...
SSH: Disconnecting configuration [AmazonAWS] ...
SSH: Transferred 0 file(s)
Finished: SUCCESS
I checked the staging workspace and the files are being built there, just not sent to the production server. Any suggestions?
I have also tried a different Remove prefix, as suggested below and in Jenkins transferring 0 files using publish over SSH plugin.

You should remove /* from the Remove prefix line.
Edit:
Your Source files cannot be outside of the job's workspace. If your files are in the root of the workspace, just set Source files to * to transfer all workspace files, or **/* to also include subdirectories. Otherwise, specify a pattern relative to ${WORKSPACE}.
Even adding a leading / will not escape that; all it does is append the path to the workspace, so in your case it becomes ${WORKSPACE}/var/www/workspace/opms-staging-server. Using the parent directory ../ will not work either. This is for security reasons; otherwise a job configurer could transfer private files off the Jenkins server.
If you need to get files from another job, you need to use the Copy Artifacts build step. Tell me if that's your case, and I will explain further.
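For reference, a minimal working Transfer Set in the Publish Over SSH plugin could look like this, assuming the built files sit in the workspace root (the remote directory is a placeholder):

Source files: **/*
Remove prefix: (leave empty)
Remote directory: /var/www/opms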

Related

Copy files from Bitbucket via Jenkins to a production server

I have my code files in Bitbucket and have configured a Jenkins build job to run when there is a change in the Bitbucket repository. At the end of it, the files have to be copied from the repo to a directory on a production server, from which the application runs.
Is there a way to copy the files from the repo to a server using a script given to Jenkins?
I assume you have the files in the workspace of the job. How about copying the files via the command line? If you want to do so, insert a batch block for Windows nodes or a shell block for Linux nodes and use
cp original_file new_file
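For instance, in an Execute shell build step you could copy the checked-out files to a target directory on the same node (the destination path here is hypothetical):

# copy the entire workspace contents, including subdirectories, to the web root
cp -r "$WORKSPACE/." /var/www/myapp/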
You have two possibilities:
Run a slave on the production server
In this case you run a slave on the production server, which connects to your Jenkins master. The slave has to run under a user that is able to write to the directory where you want to copy the files.
Two variations of this possibility:
You can execute the clone (checkout) of the Bitbucket repository on the master and then use stash to make the files accessible on the slave running on the production server (https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#stash-stash-some-files-to-be-used-later-in-the-build).
You run the whole pipeline on the slave running on the production server, which means the production server needs access to Bitbucket.
There are several possibilities to connect a slave to a master: https://wiki.jenkins.io/display/JENKINS/Distributed+builds#Distributedbuilds-Differentwaysofstartingagents
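For illustration, an inbound (JNLP) slave on the production server is usually started with a command along these lines; the URL, node name, and secret here are placeholders, and the exact invocation is shown on the node's status page in Jenkins:

java -jar slave.jar -jnlpUrl https://your-jenkins/computer/production-node/slave-agent.jnlp -secret <node-secret>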
Use remote copy possibilites
You copy the files with eg. scp in linux.
This has some security implications:
You have to add the password of the production to the jenkins credential store and pass it to the copy command
if using keys (recommended). You have to add the private key to the jenkins credential store and pass it to the command.
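A minimal sketch of such a copy from a shell build step, assuming key-based authentication; the host, user, and paths are hypothetical, and $PROD_SSH_KEY stands for a key file provided from the Jenkins credential store (e.g. via the SSH Agent plugin):

# push the build output from the workspace to the production server
scp -i "$PROD_SSH_KEY" -r "$WORKSPACE/dist/." deploy@prod.example.com:/var/www/myapp/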

Jenkins Slave build locations

I have just added a slave to my Jenkins build, with the idea that I can now deploy artifacts to either my dev server or my test server.
However, I've now hit a problem.
When I deploy a job on the master, the job build directory is
$JENKINS_HOME/localmoduledirectory (as defined in the build job)
However, when I deploy my job via the slave, the build directory is different, which breaks my jobs. The build directory is
$JENKINS_HOME/workspace/build job title/localmoduledirectory
I know I can change the workspace root directory location for the master under Configure System / Advanced, so I can change it to $JENKINS_HOME/workspace, but I want to stop the slave from using the build job title in the path.
The end result I'm after is to have Jenkins building/deploying from the same location on both servers, i.e. /opt/jenkins/workspace/localmoduledirectory.
Any ideas ?
OK, after lots of head scratching...
I managed to discover that the Maven plugin has a custom workspace option hidden under Advanced, so I configured all jobs with a custom workspace of /opt/jenkins.

Promoting the affected file to remote server

I work on a web app.
The files in my development environment that contain changes are pushed to Perforce.
In order to deploy the development changes to the QA server, I used to manually copy the affected files (Perforce commits) from the development server to the QA server.
Now I am planning to use Jenkins. I am using a Jenkins plugin to hook up Perforce changes; whenever a build is triggered, all the Perforce changes are detected by the Jenkins job. As soon as the build runs, I want to run a shell script that copies the affected files from Perforce to the QA server.
I went through the docs, but there is no way to fetch the affected files of the build.
Can you recommend a way to copy the affected files of a build to the QA server?
Do you use Perforce replication at all? Perhaps a 'Build Farm' or 'On-Demand' type replica server would suit your needs and would thus copy the changes over for you.
Otherwise, on a Unix OS, a command like rsync can be used to copy files from one location to another.
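For example, once the Jenkins job has synced the workspace, a shell build step could mirror it to the QA server (host and paths are hypothetical; --delete removes files on the QA side that no longer exist in the workspace, so use it with care):

rsync -av --delete "$WORKSPACE/" qa-user@qa.example.com:/var/www/webapp/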
REFERENCES
Admin Guide - Perforce Replication
http://www.perforce.com/perforce/doc.current/manuals/p4dist/chapter.replication.html
http://answers.perforce.com/articles/KB_Article/On-Demand-Replication

How to copy file from remote host to jenkins server

I am using Jenkins for the build process. I am running some scripts on a remote server via the Jenkins server. That works fine, but the remote host generates an HTML file, and I want to copy that file back to the Jenkins server. Is it possible to do this from the Jenkins server?
If you want to archive it permanently, you can use the Archive Artifacts option in the Post-build step in Jenkins. In the case of builds that happen on slaves, Archive Artifacts copies the artifacts back to the Jenkins server for archiving and reuse.
If you want to then use this in a subsequent build, you can use the Copy Artifacts step to introduce an artifact from another build into your subsequent process on any Jenkins slave.
We use this to move production builds into our test environment after packaging and it works great.
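Note that Archive Artifacts picks up files from the job's workspace, so if the HTML file is generated on a remote host that is not a Jenkins node, one option is to pull it into the workspace first from a shell build step (host and paths are hypothetical):

# fetch the generated report into the workspace so it can be archived
scp remote-user@remote.example.com:/opt/app/report.html "$WORKSPACE/report.html"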
The Copy To Slave plugin also copies files back from slaves to the master. It might be worth a look.

How to move Jenkins from one PC to another

I am currently using Jenkins on my development PC. I installed it on my development PC because I had limited knowledge of this tool, so I tested it there. Now I feel comfortable with Jenkins as my long-term "partner" in the build process and would like to move this Jenkins to a dedicated server.
Before this I have done few builds and have the artifacts archived from each build. In particular, the build number is very important to me for version control.
How can I export all the Jenkins information from my current PC to my new server?
Following the Jenkins wiki, you'll have to:
Install a fresh Jenkins instance on the new server
Be sure the old and the new Jenkins instances are stopped
Archive all the content of the JENKINS_HOME of the old Jenkins instance
Extract the archive into the new JENKINS_HOME directory
Do not forget to change the owner of the new Jenkins files: chown -R jenkins:jenkins $JENKINS_HOME
Launch the new Jenkins instance
Do not forget to change documentation/links to your new instance of Jenkins :)
JENKINS_HOME is located in ~/.jenkins by default on a Linux installation. To find exactly where it is, go to the http://your_jenkins_url/configure page and check the value of the first parameter, Home directory; this is the JENKINS_HOME.
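Concretely, on Linux the archive/extract steps can look like this (a sketch assuming the default JENKINS_HOME of /var/lib/jenkins on both machines, with both Jenkins services stopped):

# on the old server: pack up the Jenkins home
tar -czf jenkins-home.tar.gz -C /var/lib/jenkins .

# after copying the archive to the new server: unpack and fix ownership
tar -xzf jenkins-home.tar.gz -C /var/lib/jenkins
chown -R jenkins:jenkins /var/lib/jenkins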
In case your JENKINS_HOME directory is too large to copy, and all you need is to set up the same jobs, plugins, and configuration (and you don't need the old job artifacts and reports), then you can use the ThinBackup plugin:
Install ThinBackup on both the source and the target Jenkins servers
Configure the backup directory on both (in Manage Jenkins → ThinBackup → Settings)
On the source Jenkins, go to ThinBackup → Backup Now
Copy from Jenkins source backup directory to the Jenkins target backup directory
On the target Jenkins, go to ThinBackup → Restore, and then restart the Jenkins service.
If some plugins or jobs are missing, copy the backup content directly to the target JENKINS_HOME.
If you had user authentication on the source Jenkins and are now locked out of the target Jenkins, edit the Jenkins config.xml, set <useSecurity> to false, and restart Jenkins.
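That edit can be scripted with sed (assuming the default config.xml location on Linux; remember to re-enable security once you have regained access):

# temporarily disable security on the target Jenkins
sed -i 's|<useSecurity>true</useSecurity>|<useSecurity>false</useSecurity>|' /var/lib/jenkins/config.xml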
This worked for me when moving from Ubuntu 12.04 (Jenkins 1.628) to Ubuntu 16.04 (Jenkins 1.651.2). I first installed Jenkins from the repositories.
Stop both Jenkins servers
Copy JENKINS_HOME (e.g. /var/lib/jenkins) from the old server to the new one. From a console on the new server:
rsync -av username@old-server-IP:/var/lib/jenkins/ /var/lib/jenkins/
Start your new Jenkins server
You might not need the following steps, but I had to:
Go to Manage Jenkins and Reload Configuration from Disk.
Disconnect and connect all the nodes again.
Check that in the Configure System > Jenkins Location, the Jenkins URL is correctly assigned to the new Jenkins server.
Jenkins Server Automation:
Step 1:
Set up a repository to store the Jenkins home (jobs, configuration, plugins, etc.) in a local GitLab or a private GitHub repository, and keep it updated regularly by pushing any new changes to Jenkins jobs, plugins, etc.
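A sketch of the initial push (the repository URL is a placeholder; workspace/ and builds/ are excluded here because they hold transient or large build data):

cd "$JENKINS_HOME"
git init
printf 'workspace/\nbuilds/\n' > .gitignore
git add .
git commit -m "Snapshot of Jenkins configuration"
git remote add origin git@gitlab.example.com:ops/jenkins-home.git
git push -u origin master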
Step 2:
Configure a Puppet host-group/role for Jenkins that can be used to spin up new Jenkins servers. Do all the basic configuration in a Puppet recipe and make sure it installs the latest version of Jenkins and sets up a separate directory/mount for JENKINS_HOME.
Step 3:
Spin up a new machine using the Jenkins-Puppet configuration above. When everything is installed, grab/clone the Jenkins configuration from the Git repository into the Jenkins home directory and restart Jenkins.
Step 4:
Go to the Jenkins URL, Manage Jenkins → Manage Plugins and update all the plugins that require an update.
Done
You can use Docker Swarm or Kubernetes to auto-scale the slave nodes.
Sometimes we may not have access to a Jenkins machine to copy a folder directly into another Jenkins instance, so I wrote a menu-driven utility that uses Jenkins REST API calls to install plugins and jobs from one Jenkins instance to another.
For plugin migration:
GET request: {SOURCE_JENKINS_SERVER}/pluginManager/api/json?depth=1 will get you the list of installed plugins with their versions.
You can send a POST request with the following parameters to install these plugins.
final_url=`{DESTINATION_JENKINS_SERVER}/pluginManager/installNecessaryPlugins`
data=`<jenkins><install plugin="{PLUGIN_NAME}@latest"/></jenkins>` (where latest fetches the latest version of the plugin)
auth=`(destination_jenkins_username, destination_jenkins_password)`
header=`{crumb_field:crumb_value,"Content-Type":"application/xml"}` (where crumb_field=Jenkins-Crumb; get the crumb value using the API call {DESTINATION_JENKINS_SERVER}/crumbIssuer/api/json)
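Put together with curl, the plugin migration looks roughly like this (server URLs and credentials are placeholders; jq is used to extract the crumb):

# list installed plugins on the source server
curl -u user:api_token "{SOURCE_JENKINS_SERVER}/pluginManager/api/json?depth=1"

# fetch a CSRF crumb from the destination server
CRUMB=$(curl -s -u user:api_token "{DESTINATION_JENKINS_SERVER}/crumbIssuer/api/json" | jq -r .crumb)

# install one plugin (git, as an example) on the destination server
curl -X POST -u user:api_token \
     -H "Jenkins-Crumb: $CRUMB" -H "Content-Type: application/xml" \
     -d '<jenkins><install plugin="git@latest"/></jenkins>' \
     "{DESTINATION_JENKINS_SERVER}/pluginManager/installNecessaryPlugins"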
For job migration:
You can get the list of jobs on {SOURCE_JENKINS_URL} using a REST call, {SOURCE_JENKINS_URL}/view/All/api/json.
Then you can get each job's config.xml from {SOURCE_JENKINS_URL} using the job URL {SOURCE_JENKINS_URL}/job/{JOB_NAME}.
POST the content of that config.xml file to {DESTINATION_JENKINS_URL}, and that will create the job on {DESTINATION_JENKINS_URL} (see the curl sketch below).
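In curl, this can look as follows; createItem is Jenkins' standard endpoint for creating a job from a config.xml (URLs, credentials, and the job name are placeholders):

# download the job definition from the source server
curl -u user:api_token "{SOURCE_JENKINS_URL}/job/{JOB_NAME}/config.xml" -o config.xml

# create the same job on the destination server
curl -X POST -u user:api_token -H "Content-Type: application/xml" \
     --data-binary @config.xml \
     "{DESTINATION_JENKINS_URL}/createItem?name={JOB_NAME}"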
I have created a menu-driven utility in Python that asks the user whether to start a plugin or a job migration and uses Jenkins REST API calls to do it.
You can refer to the JenkinsMigration.docx from this URL
Let us say we are migrating Jenkins LTS from PC1 to PC2 (irrespective of whether the LTS version is the same or upgraded).
It is easy to use the ThinBackup plugin for migrating or upgrading a Jenkins installation.
Step 1: Prepare PC1 for migration
Manage Jenkins -> ThinBackup -> Settings
Select the correct options and a directory for the backup.
If job history and artifacts need to be included, select the 'Backup build results' option as well.
Go back and click Backup Now.
Note: ThinBackup will also back up the plugins, which is optional.
Check that the ThinBackup folder contains a folder with the current date and timestamp (wait a couple of minutes; it might take some time).
Your backup is ready; zip it and copy it to a PARTICULAR directory (which will be the 'Backup directory') on PC2.
Unzip the ThinBackup zipped folder there.
Stop the Jenkins service on PC1.
Step 2: Install Jenkins (using the .war file, or paste an archived version) on PC2.
Create a Jenkins service using the command sc create <Jenkins_PC2Servicename> binPath="<Path_to_Jenkinsexe>/jenkins.exe"
Modify JENKINS_HOME/jenkins.xml if needed on PC2.
Run the Windows service <Jenkins_PC2Servicename> on PC2.
Manage Jenkins -> ThinBackup -> Settings
Make sure you set the PARTICULAR path from Step 1 as the Backup directory in the ThinBackup settings.
ThinBackup -> Restore will give you a dropdown list; choose the right backup (identify it by date and timestamp).
Wait a few minutes, and you will have the latest backed-up configuration, including job history and plugins, on PC2.
If additional changes are needed in JENKINS_HOME/jenkins.xml (settings carried over from PC1's ThinBackup that are not wanted), those modifications have to be made manually.
NOTE: If your Jenkins jobs use database settings of an SCM, you need to take extra care, as not all SCM plugins support carrying database settings over via the ThinBackup plugin.
E.g. if you are using the PTC Integrity SCM plugin and some Jenkins jobs use a DB via Integrity, it will create a directory JENKINS_HOME/IntegritySCM; ThinBackup will not include this DB when taking the backup.
Solution: directly copy the JENKINS_HOME/IntegritySCM folder from PC1 to PC2.
