Download multiple files from SFTP server as a Jenkins job - jenkins

I am trying to create a Jenkins job which will pull multiple files from an SFTP server to the local machine.
By using the ssh2easy plugin in Jenkins I am able to connect to the SFTP server and pull a single file, as shown below.

You can use the SSH2Easy plugin with the following configuration in the job:
remoteFile: /remoteFolderPath/*
localFolder: jobs/jobName/workspace/
fileName: temp/
localFolder is the path to the job workspace - where this lives depends on the Jenkins configuration (i.e. where the workspace is located).
fileName must end with "/" and that folder must exist in the workspace.
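If the wildcard does not behave as expected, the same pull can also be scripted with plain OpenSSH sftp in a shell build step. A rough sketch, reusing the placeholder paths above and assuming key-based authentication is already set up for the Jenkins user (sftp batch mode cannot prompt for a password):
node {
    stage('Pull files from SFTP') {
        sh '''
            mkdir -p temp
            # batch mode (-b -) reads sftp commands from stdin; the wildcard in
            # the get command pulls every file from the remote folder into temp/
            echo 'get /remoteFolderPath/* temp/' | sftp -b - username@somedomain.com
        '''
    }
}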

You can use a shell build step with curl.
The curl command to use:
curl --insecure sftp://username:urlencodedPassword@somedomain.com
Be aware that the username and password have to come before the domain name, separated from it by an @.
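A sketch of how that could look as a pipeline step, with the credentials taken from a Jenkins username/password credential instead of being hard-coded in the URL (the credential id sftp-creds, the host and the file names are placeholders):
pipeline {
    agent any
    stages {
        stage('Pull files from SFTP') {
            steps {
                withCredentials([usernamePassword(credentialsId: 'sftp-creds',
                                                  usernameVariable: 'SFTP_USER',
                                                  passwordVariable: 'SFTP_PASS')]) {
                    // --remote-name-all keeps the original file names;
                    // curl's {a,b} globbing fetches several files in one call
                    sh 'curl --insecure --remote-name-all -u "$SFTP_USER:$SFTP_PASS" "sftp://somedomain.com/remoteFolderPath/{file1.csv,file2.csv}"'
                }
            }
        }
    }
}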

Related

Problems transferring build artifacts from Jenkins running in a docker container

I'm a little bit of a newb with this CI/CD container stuff, so please correct me anywhere I'm wrong.
I can't seem to find out how to send my npm build files created on my Jenkins instance (workspace) to a remote server. I have a pipeline that successfully pulls in my GitHub repo and does all my fun npm stuff (npm install, test, build). I see my build dir in my Jenkins instance /workspace.
My environment is as follows: we have a server where Docker (with Portainer) is installed. Jenkins is running in a container with a volume mounted (my React build dir goes here). No issues with the pipeline or building etc. I just can't figure out how to push my artifacts from my Jenkins workspace directory to my 'remote' dev server.
I can successfully open a console in my Jenkins container (Portainer, as the jenkins user) and scp files from the workspace directory using my remote server creds (but a password is necessary).
I installed and used "Publish Over SSH" Jenkins plugin and get a successful "Test Configuration" from my setup.
I created my RSA keys on the REMOTE machine (that I'm trying to push my build files to).
I then pasted the private key (created without a password) into the plugin at the 'Use password authentication, or use a different key' section. Again, I get a successful test connection.
In my pipeline the last step is deploying, and I use this command:
sh 'scp -r build myusername@xx.xx.xx.xx:/var/files/react-tester'
I get a 'Permission denied (publickey,password).' error. There is no password associated with the RSA key. I tried both ways: creating the RSA key on the remote machine as my remote user, and on the Jenkins machine as the jenkins user. I've read examples of people creating the keys both ways, but I'm not sure on which user/machine combo to create the keys, or which section of the 'Publish Over SSH' plugin to paste them into.
I'm out of ideas.
First, go to "Manage Jenkins" > "Credentials", add a new SSH credential of type "SSH Username with private key" and fill in the "Username" field and the private key (generate one if you haven't done so yet; you can also upload one). Don't forget that you have to copy the generated public key to ~/.ssh/authorized_keys for ${SSH_USERNAME} on the remote server.
I'm assuming you're using a scripted or DSL pipeline here. In your code, after you've built your application, you can push it to your server by adding this step to your pipeline:
pipeline {
    agent any
    stages {
        stage("Pushing changes to remote server") {
            steps {
                script {
                    def remote_server = "1.2.3.4"
                    withCredentials([sshUserPrivateKey(credentialsId: 'my-key', keyFileVariable: 'SSH_KEY', passphraseVariable: '', usernameVariable: 'SSH_USERNAME')]) {
                        sh "scp -i \${SSH_KEY} -r build/ ${SSH_USERNAME}@${remote_server}:/var/files/react-tester/"
                    }
                }
            }
        }
    }
}
Best regards.

Reading config file in DSL build on agent host

I am trying to configure a Jenkins seed job, where the whole business is in the provided DSL script. I want to separate that script from its configuration, which I want to put in an additional yml file. When I try to read that file:
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml
def workDir = SEED_JOB.getWorkspace()
def config = new Yaml().load(("${workDir}/config.yml" as File).text)
I receive the error:
java.io.FileNotFoundException: /var/lib/jenkins/workspace/test.dsl/config.yml (No such file or directory)
I suppose that Jenkins is looking for the file on the master host, not on the agent node where the workspace is located.
Is it possible to read the yml file in a DSL build step on the agent node? Or do I always have to execute the seed job on my master host?
This does not seem possible, as the jobDsl script is executed on the master. You can try to force the job to run on the master with the label master.
From the documentation in section Script location:
Job DSL scripts are executed on the Jenkins master node, but the seed job's workspace which contains the script files may reside on a build node. This means that direct access to the file specified by FILE may not be possible from a DSL script. See Distributed builds for details.
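If the only goal is to read the YAML that sits in the seed job's workspace, the Job DSL readFileFromWorkspace step sidesteps the master/agent path problem, because it streams the file content instead of resolving a path on the master. A minimal sketch, assuming the same config.yml lives at the root of the seed job's workspace:
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml

// readFileFromWorkspace returns the file's text from the seed job's workspace,
// even when that workspace is located on a build node
def config = new Yaml().load(readFileFromWorkspace('config.yml'))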

Jenkins build logs in slave nodes

I am trying to push our Jenkins build logs to S3.
I used the Groovy plugin and the following script in the Build phase:
// This script should be run in a system groovy script build step.
// The FilePath class understands what node a path is on, not just the path.
import hudson.FilePath
// Get path to console log file on master.
logFile = build.getLogFile()
// Turn this into a FilePath object.
logFilePathOnMaster = new FilePath(logFile)
logFileName = build.envVars["JOB_BASE_NAME"] + build.envVars["RT_RELEASE_STAGING_VERSION"] + '.txt'
// Create remote file path obj to build agent.
remoteLogFile = new FilePath(build.workspace, logFileName)
// Copy contents of master's console log to file on build agent.
remoteLogFile.copyFrom(logFilePathOnMaster)
And then I am using S3 plugin to push .txt files to S3.
But this script fetches the build log file from the master node.
How are the build logs transferred from the slave to the master node?
Can I access the build log file on my slave node without the master's involvement at all?
The slave node must be preserving the build logs somewhere while building? I can't seem to find them.
I am not very familiar with Groovy, but here is the solution which worked for me using a shell script.
I am using the Jenkins 'Node and Label parameter plugin' to run our Java process on a slave node. The job is triggered using the 'Build >> Execute Shell' option. The log is collected into a file as below:
sudo java -jar xxx.jar 2>&1 | sudo tee -a ${JOB_NAME}/${BUILD_NUMBER}.log
This log file is then pushed to S3 :
sudo aws --region ap-south-1 s3 cp ${JOB_NAME}/${BUILD_NUMBER}.log s3://bucket/JenkinsLogs/${JOB_NAME}/${BUILD_NUMBER}.log
It's working perfectly for us. Hope it helps you too.
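For a pipeline job, a rough sketch of the same idea (the agent label, jar name and bucket layout are placeholders) keeps the log on the agent and uploads it from there, so the master is never involved:
pipeline {
    agent { label 'my-slave' }   // placeholder label for the build agent
    stages {
        stage('Build and upload log') {
            steps {
                // tee writes the process output to a file on this agent,
                // so there is nothing to copy back from the master
                sh 'java -jar xxx.jar 2>&1 | tee "${BUILD_NUMBER}.log"'
                sh 'aws --region ap-south-1 s3 cp "${BUILD_NUMBER}.log" "s3://bucket/JenkinsLogs/${JOB_NAME}/${BUILD_NUMBER}.log"'
            }
        }
    }
}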

How to include CSV file in JMETER running in Jenkins

I am using Jenkins as my deployment pipeline. I have a JMeter project that I will be executing in a build step in Jenkins. That JMeter project has a dependency on a CSV file for parameters. How do I get that file included in the Jenkins pipeline, and how do I tell JMeter where to look for it in the CSV Data Set Config?
I could also do it via a Gradle command if that is an option.
Thanks.
The easiest option is running JMeter in command-line non-GUI mode; the relevant Jenkins Pipeline snippet would be:
node {
    stage 'Run JMeter Test'
    bat 'c:/jmeter/bin/jmeter.bat -n -t c:/jmeter/extras/Test.jmx -l test.jtl'
}
The above setup assumes a Windows operating system; if your Jenkins master or build agent is running Linux, Unix or MacOSX, just change bat to sh. You will also need to amend the JMeter installation path to reflect your environment. See Running a JMeter Test via Jenkins Pipeline - A Tutorial for an example configuration.
In case of command-line non-GUI execution you need to copy your CSV file(s) to the "bin" folder of your JMeter installation.
In case of the JMeter Ant Task - the same approach, drop your CSV file(s) into JMeter's "bin" folder.
In case of the JMeter Maven Plugin you will need to copy the CSV file(s) to the src/test/jmeter folder of your Maven project.
And finally, you can just use full (not relative) path(s) to the CSV file(s) in the CSV Data Set Config elements; a sketch of this option follows below.
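As an illustration of the last option, a scripted-pipeline sketch (the file names and the csvPath property are made up here) passes the CSV's absolute path to JMeter as a property; the CSV Data Set Config then references it as ${__P(csvPath)} in its Filename field:
node {
    stage('Run JMeter Test') {
        // the CSV is assumed to be checked in next to the test plan;
        // -J defines a JMeter property that the test plan can read
        sh "jmeter -n -t Test.jmx -JcsvPath=${env.WORKSPACE}/data/params.csv -l test.jtl"
    }
}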

Copy file from Jenkins master to slave in Pipeline

I have some Windows slaves on my Jenkins, so I need to copy files to them in a pipeline. I have heard about the Copy To Slave and Copy Artifact plugins, but they don't have a pipeline syntax manual, so I don't know how to use them in a pipeline.
A direct copy doesn't work:
def inputFile = input message: 'Upload file', parameters: [file(name: 'parameters.xml')]
new hudson.FilePath(new File("${ENV:WORKSPACE}\\parameters.xml")).copyFrom(inputFile)
This code returns an error:
Caused: java.io.IOException: Failed to copy /var/lib/jenkins/jobs/_dev/jobs/(TEST)job/builds/107/parameters.xml to d:\Jenkins\workspace\_dev\(TEST)job\parameters.xml
Is there any way to copy file from master to slave in Jenkins Pipeline?
As I understand it, copyFrom is executed on your Windows node, therefore the source path is not accessible.
I think you want to look into the stash/unstash steps (Jenkins Pipeline: Basic Steps), which work across different nodes. This example might also be helpful.
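A minimal scripted sketch of that approach (the node labels and file name are placeholders): stash the file on the node where it is reachable, then unstash it inside the Windows node's workspace.
node('master') {
    // pick the file up where it already exists and store it under a name
    stash name: 'params', includes: 'parameters.xml'
}
node('windows') {
    // unstash recreates parameters.xml inside this node's workspace
    unstash 'params'
    bat 'type parameters.xml'
}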
The Pipeline DSL context runs on the master node even if you write node('someAgentName') in your pipeline.
Try stash/unstash, but it is bad for large files.
Try the External Workspace Manager Plugin. It has pipeline steps and is good for large files.
Try using intermediate storage: archive() and sh("wget $url") will be helpful.
If the requirement is to copy an executable to the test slave and to publish the test results, this is easy to do without the Copy to Slave plugin.
A shared folder should be created on each test slave (normal Windows shared folder).
After build: Build script copies the executable to the shared directory on each slave. A simple batch script using copy command is sufficient for this.
stage ('Copy to slaves') {
    steps {
        bat 'call "copy-to-slave.bat"'
    }
}
During test: The test script copies the executable to another directory and runs it.
After test: Post-build action "Publish Robot Framework test results" can be used to report the test results. It is not necessary to copy the test result files back to the master first.
I recommend the Phoenix AutoTest plugin for Pipeline.
Jenkins plugin website:
https://plugins.jenkins.io/phoenix-autotest/#documentation
GitHub repository of plugin:
https://github.com/jenkinsci/phoenix-autotest-plugin
