Jenkins build logs in slave nodes

I am trying to push our Jenkins build logs to S3.
I used the Groovy plugin and the following script in the Build phase:
// This script should be run in a system groovy script build step.
// The FilePath class understands what node a path is on, not just the path.
import hudson.FilePath
// Get path to console log file on master.
logFile = build.getLogFile()
// Turn this into a FilePath object.
logFilePathOnMaster = new FilePath(logFile)
logFileName = build.envVars["JOB_BASE_NAME"] + build.envVars["RT_RELEASE_STAGING_VERSION"] + '.txt'
// Create remote file path obj to build agent.
remoteLogFile = new FilePath(build.workspace, logFileName)
// Copy contents of master's console log to file on build agent.
remoteLogFile.copyFrom(logFilePathOnMaster)
Then I am using the S3 plugin to push the .txt files to S3.
But this script fetches the build log file from the master node.
How are the build logs transferred from the slave to the master node?
Can I access the build log file on my slave node without the master's involvement at all?
The slave node must be preserving the build logs somewhere while building? I can't seem to find them.

I am not very familiar with Groovy, but here is the solution that worked for me using a shell script.
I am using the Jenkins 'Node and Label parameter' plugin to run our Java process on a slave node. The job is triggered using the 'Build >> Execute Shell' option. The log is collected into a file as below:
sudo java -jar xxx.jar 2>&1 | sudo tee -a ${JOB_NAME}/${BUILD_NUMBER}.log
This log file is then pushed to S3:
sudo aws --region ap-south-1 s3 cp ${JOB_NAME}/${BUILD_NUMBER}.log s3://bucket/JenkinsLogs/${JOB_NAME}/${BUILD_NUMBER}.log
It's working perfectly for us. Hope it helps you too.
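For reference, both steps can also be combined into a single 'Execute Shell' build step. A minimal sketch, with the jar name, region and bucket being placeholders as above:
#!/bin/bash
set -euo pipefail
LOG_DIR="${JOB_NAME}"
LOG_FILE="${LOG_DIR}/${BUILD_NUMBER}.log"
mkdir -p "${LOG_DIR}"
# Capture stdout and stderr of the process into a per-build log file.
sudo java -jar xxx.jar 2>&1 | sudo tee -a "${LOG_FILE}"
# Push the log to S3 (region and bucket are placeholders).
sudo aws --region ap-south-1 s3 cp "${LOG_FILE}" "s3://bucket/JenkinsLogs/${JOB_NAME}/${BUILD_NUMBER}.log"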

Related

Problems transferring build artifacts from Jenkins running in a docker container

I'm a little bit of a newb with this CI/CD container stuff, so please correct me anywhere I'm wrong.
I can't seem to find out how to send my npm build files created on my Jenkins instance (workspace) to a remote server. I have a pipeline that successfully pulls in my GitHub repo and does all my fun npm stuff (npm install, test, build). I see my build dir in my Jenkins instance's /workspace.
My environment is as follows. We have a server where Docker (with Portainer) is installed. Jenkins is running in a container with a volume mounted (my React build dir goes here). No issues with the pipeline or building etc. I just can't figure out how to push my artifacts from my Jenkins workspace directory to my 'remote' dev server.
I can successfully open a console in my Jenkins container (Portainer, as the jenkins user) and scp files from the workspace directory using my remote server creds (but a password is necessary).
I installed and used "Publish Over SSH" Jenkins plugin and get a successful "Test Configuration" from my setup.
I created my RSA keys on the REMOTE machine (that I'm trying to push my build files to).
I then pasted the private key (created without a password) into the plugin at the 'Use password authentication, or use a different key' section. Again, I get a successful test connection.
In my pipeline, the last step is deploying, and I use this command:
sh 'scp -r build myusername@xx.xx.xx.xx:/var/files/react-tester'
I get a 'Permission denied (publickey,password).' error. I have no password associated with the RSA key. I tried both ways: creating the RSA key on the remote machine as my remote user, and on the Jenkins machine as the jenkins user. I've read examples of people creating the keys both ways, but I'm not sure on which user/machine combo to create the keys and which section of the 'Publish Over SSH' plugin to paste them into.
I'm out of ideas.
First, go to "Manage Jenkins" > "Credentials" and add a new credential of type "SSH Username with private key", filling in the "Username" field and the private key (generate one if you haven't done so yet; you can also upload it). Don't forget that you have to copy the generated public key to the ${SSH_USERNAME} user's ~/.ssh/authorized_keys file on the remote server.
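If you have not generated a key pair yet, a rough sketch of the key setup (the key file name, remote user and host below are placeholders):
# Generate a key pair without a passphrase; the private key is what goes into the Jenkins credential.
ssh-keygen -t rsa -b 4096 -f ~/.ssh/jenkins_deploy -N ""
# Copy the public key to the remote server's authorized_keys (user and host are placeholders).
ssh-copy-id -i ~/.ssh/jenkins_deploy.pub deployuser@1.2.3.4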
I'm assuming you're using a scripted or DSL pipeline here. In your code, after you've built your application, you can push it to your server by adding this stage to your pipeline:
pipeline {
    stages {
        stage("Pushing changes to remote server") {
            steps {
                script {
                    def remote_server = "1.2.3.4"
                    withCredentials([sshUserPrivateKey(credentialsId: 'my-key', keyFileVariable: 'SSH_KEY', passphraseVariable: '', usernameVariable: 'SSH_USERNAME')]) {
                        sh "scp -r -i \${SSH_KEY} build/ ${SSH_USERNAME}@${remote_server}:/var/files/react-tester/"
                    }
                }
            }
        }
    }
}
Best regards.

How to save Jenkins configuration?

Is there any way to save a pipeline configuration or an item configuration in Git or anywhere else, so that if my Jenkins machine crashes, I can migrate the saved configuration to a new Jenkins instance?
I would (as a start) get yourself the JobConfigHistory Plugin - https://wiki.jenkins.io/display/JENKINS/JobConfigHistory+Plugin - which keeps a history of all changes made to jobs, system config etc. It has saved me multiple times.
Also, you could set up a cron job outside Jenkins to git push your job config.
I set it up to push the jobs folder content (including build history, but you could exclude that - see 'Correctly ignore all files recursively under a specific folder except for a specific file type' for a ref).
My script (I had the SSH setup done previously):
cd /this/that/other/jenkins_data/jobs/
NOW=$(date +"%m-%d-%Y-%r")
git add .
git commit -m "Jenkins Backup $NOW"
git push -u origin jenkins-backup
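The script is then driven by cron; an example entry (the script path and schedule are just illustrative) that runs it nightly:
# m h dom mon dow command
0 2 * * * /usr/local/bin/jenkins-config-backup.sh >> /var/log/jenkins-backup.log 2>&1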
This gives me peace of mind; I also RSYNC to another box and have a backup plugin running too... (I was stung once - not again!)
Hope this helps.
All your jobs are stored in config.xml files inside the $JENKINS_HOME/jobs/<path_to_your_job> folders. So you can just back up these config.xml files (or you can back up the whole Jenkins configuration by saving the full $JENKINS_HOME folder).
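For example, a rough sketch of collecting just the job config.xml files into a dated archive (the $JENKINS_HOME path is an assumption, adjust it to your installation):
#!/bin/bash
JENKINS_HOME=/var/lib/jenkins
BACKUP=/tmp/jenkins-job-configs-$(date +%F).tar.gz
cd "${JENKINS_HOME}"
# Archive only the config.xml files, skipping build records and workspaces.
find jobs -name config.xml -print0 | tar --null -czf "${BACKUP}" -T -
echo "Wrote ${BACKUP}"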
You can write the pipeline script in a file and publish that file to Git.
After that, just create the pipeline job in Jenkins and use the 'Pipeline script from SCM' option for the pipeline script.
The other option is to take a backup of the Jenkins home directory to an external hard disk (keep project workspaces outside the Jenkins home to reduce the backup size).
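A sketch of that kind of full-home backup, excluding workspaces to keep the size down (source and destination paths are placeholders):
# Mirror the Jenkins home to a backup location, skipping workspaces and archived artifacts.
rsync -a --delete \
  --exclude 'workspace/' \
  --exclude 'jobs/*/builds/*/archive/' \
  /var/lib/jenkins/ /mnt/backup/jenkins/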

Execution a deployment script on a remote ssh server through a Jenkins pipeline

I've got a Jenkins pipeline containing stages for source loading, building and deploying on a remote machine through SSH. The problem is with the last one. I saved a script following this template on the remote server:
#!/bin/bash
bash /<pathTo>/jboss-cli.sh --command="deploy /<anotherPath>/service.war --force"
It works fine if executed in a terminal connected to the remote server.
The best outcome I've received through Jenkins is
/<pathTo>/jboss-cli.sh: line 87: usr/bin/java/bin/java: No such file or directory
in the Jenkins console output.
I tried switching between bash and sh, exporting the path to java in the pipeline script, etc.
Any suggestions are appreciated.
Thanks!
P.S. The execution call from Jenkins looks like:
sh """
ssh -o StrictHostKeyChecking=no $connectionName 'bash /<pathToTheScript>/<scriptName>.sh'
"""
line 87: usr/bin/java/bin/java: No such file or directory
As per the error line, the path is being taken as usr rather than /usr (a relative path instead of an absolute one). Can you check if this is the problem?
Sorry, I know this should be in the comments section, but I don't have the right to add comments yet.
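If the relative usr/bin/java path is indeed the culprit, one possible fix is to export an absolute java location at the top of the remote script so that it also resolves in a non-interactive SSH session. A sketch, where the JDK path is an assumption to adjust for your server:
#!/bin/bash
# Point to the JDK explicitly; non-interactive SSH sessions often do not load the login profile.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk
export PATH="${JAVA_HOME}/bin:${PATH}"
bash /<pathTo>/jboss-cli.sh --command="deploy /<anotherPath>/service.war --force"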

Download multiple files from SFTP server as a Jenkins job

I am trying to create a Jenkins job which will pull multiple files from an SFTP server to a local machine.
By using the ssh2easy plugin in Jenkins I am able to connect to the SFTP server and pull a single file.
You can use the SSH2Easy plugin with the following configuration in the job:
remoteFile: /remoteFolderPath/*
localFolder: jobs/jobName/workspace/
fileName: temp/
localFolder is the path to the job workspace - this depends on the Jenkins configuration and where the workspace is located.
fileName must end with "/" and the folder must exist in the workspace.
You can use a shell build step with curl.
The curl command to use:
curl --insecure sftp://username:urlencodedPassword@somedomain.com
Be aware that the username/password have to be at the beginning of the domain name, separated by an @.
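To pull several files in one build step, you can loop over the remote file names with curl; the host, credentials, folder and file names below are all placeholders:
#!/bin/bash
# Download a known list of files from the SFTP server into the job workspace.
CREDS="username:urlencodedPassword"
HOST="somedomain.com"
for f in report1.csv report2.csv report3.csv; do
  curl --insecure -o "${f}" "sftp://${CREDS}@${HOST}/remoteFolderPath/${f}"
done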

Execute a script from jenkins pipeline

I have a Jenkins pipeline that builds a Java artifact,
copies it to a directory and then attempts to execute an external script.
I am using this syntax within the pipeline script to execute the external script:
dir('/opt/script-directory') {
    sh './run.sh'
}
The script is just a simple docker build script, but the build will fail
with this exception:
java.io.IOException: Failed to mkdirs: /opt/script-directory@tmp/durable-ae56483c
The error is confusing because the script does not create any directories. It is just building a Docker image and placing the freshly built Java artifact in that image.
If I create a different job in Jenkins that executes the external script as
its only build step and then call that job from my pipeline script using this syntax:
build 'docker test build'
everything works fine: the script executes within the other job and the pipeline
continues as expected.
Is this the only way to execute a script that is external to the workspace?
What am I doing wrong with my attempt at executing the script from within
the pipeline script?
The issue is that the jenkins user (or whatever user runs the Jenkins slave process) does not have write permission on /opt, and the sh step wants to create the script-directory@tmp/durable-ae56483c sub-directory there.
Either remove the dir block and use the absolute path to the script:
sh '/opt/script-directory/run.sh'
or give the jenkins user write permission on /opt (not preferred, for security reasons).
This looks like a bug in Jenkins; durable directories are meant to store recovery information, e.g. before executing an external script using sh.
For now, all you can do is make sure that /opt/script-directory has +r, +w and +x set for the jenkins user.
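If you go that route, a rough sketch, assuming the agent process runs as a user named jenkins:
# Let the jenkins user read and execute the script directory...
sudo chown -R jenkins:jenkins /opt/script-directory
sudo chmod -R u+rwX /opt/script-directory
# ...and pre-create the sibling script-directory@tmp folder the sh step wants to write to,
# so that /opt itself does not have to be made writable.
sudo install -d -o jenkins -g jenkins /opt/script-directory@tmp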
Another workaround would be not to change the current directory and just execute sh with the absolute path:
sh '/opt/script-directory/run.sh'
I had a similar concern when trying to execute a script in a Jenkins pipeline using a Jenkinsfile.
I was trying to run a script restart_rb.sh with sudo.
To run it, I specified the present working directory ($PWD):
sh 'sudo sh $PWD/restart_rb.sh'
