Can I use multiple remotes in one Jenkins pipeline using the SSH Pipeline Steps plugin?
Now my pipeline looks like this:
def remote = [:]
remote.name = 'PRE-DEV'
remote.host = 'x.x.x.x'
remote.user = 'jenkins'
remote.identityFile = '/var/lib/jenkins/.ssh/id_rsa'
remote.allowAnyHosts = true
remote.agentForwarding = true
pipeline {
    agent any
    stages {
        stage('BUILD') {
            steps {
                sshCommand remote: remote, command: "build commands"
            }
        }
        stage('UNIT TESTS') {
            steps {
                sshCommand remote: remote, command: "tests commands"
            }
        }
        stage('DEPLOY TO DEV') {
            steps {
                sshCommand remote: remote, command: "scp artifacts push to other vm"
            }
        }
    }
}
Now I need an additional stage ('RUN ON DEV'), where I can run my artifacts on the other VM. How can I do that in the same pipeline?
Solution one:
You can just define another map, like below:
def secondRemote = [:]
secondRemote.name = 'PRE-DEV'
secondRemote.host = 'your new host'
secondRemote.user = 'jenkins'
secondRemote.identityFile = '/var/lib/jenkins/.ssh/id_rsa'
secondRemote.allowAnyHosts = true
secondRemote.agentForwarding = true
Then use it with:
sshCommand remote: secondRemote, command: "your new command"
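The additional stage from the question could then look like this (a sketch; the command is a placeholder for whatever starts your artifacts):

```groovy
stage('RUN ON DEV') {
    steps {
        // Executes on the second VM described by secondRemote
        sshCommand remote: secondRemote, command: "commands to run your artifacts"
    }
}
```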
Solution two:
Store your private key in Jenkins credentials, then use the ssh-agent plugin:
https://www.jenkins.io/doc/pipeline/steps/ssh-agent/
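For example, with the key stored under a credentials ID (here 'dev-ssh-key', a hypothetical name, as is the host), the sshagent step wraps a plain ssh call roughly like this:

```groovy
stage('RUN ON DEV') {
    steps {
        // 'dev-ssh-key' is a hypothetical SSH private-key credential ID
        sshagent(credentials: ['dev-ssh-key']) {
            sh 'ssh -o StrictHostKeyChecking=no jenkins@your-new-host "commands to run your artifacts"'
        }
    }
}
```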
Related
I'm building a war file using Maven and sending this build to another server via the Jenkins SSH Publisher plugin.
stage('Deploy develop build to Test Server') {
    when {
        branch 'develop'
    }
    steps {
        // sends war file to test_server
        dir("${WORKSPACE}/ag.mycompany.feature/target/") {
            sshPublisher(
                alwaysPublishFromMaster: true,
                continueOnError: false,
                failOnError: true,
                publishers: [
                    sshPublisherDesc(
                        configName: "test_server",
                        transfers: [sshTransfer(sourceFiles: "vc-admin_${VERSION}_${BUILD_NUMBER}_dev.war", remoteDirectory: '/wars/main_vc_admin/develop')],
                        verbose: true
                    )
                ]
            )
        }
    }
}
This part works very well; I can see my war file with the version and build number.
Then I'm trying to cp these files (on the remote server) to another folder and build a Docker image via docker-compose.
The connection to the remote server via withCredentials is OK, and I tested it with sudo commands like 'docker ps', which work well.
But I can't pass variables like VERSION and BUILD_NUMBER to sshCommand.
Is there any way to pass these variables?
Also, all the examples I found on the internet put withCredentials and sshPublisher inside a script { ... } block, which is why I used a script block in my declarative pipeline. I don't know how to create the remote variable using a declarative pipeline.
stage("build docker containers") {
    when {
        branch 'develop'
    }
    steps {
        script {
            withCredentials([sshUserPrivateKey(credentialsId: 'test_server',
                                               keyFileVariable: 'test_user',
                                               passphraseVariable: '',
                                               usernameVariable: 'test_user')]) {
                // code block
                def remote = [:]
                remote.name = "MY_TEST_SERVER"
                remote.host = "1.1.1.1"
                remote.user = "user"
                remote.password = "pass"
                remote.allowAnyHosts = true
                remote.identityFile = test_user
                remote.pty = true
                sshCommand remote: remote, command: "cp /home/user/wars/main_vc_admin/develop/vc-admin_${VERSION}_${BUILD_NUMBER}_dev.war /home/user/main-vc-deployment/backend/vc-admin.war"
                sshCommand remote: remote, sudo: true, command: "docker-compose --file /home/user/main-vc-deployment build && docker-compose --file /home/user/main-vc-deployment run -d"
            }
        }
    }
}
I'm new to Jenkins and I can't figure out how to transfer a file from one server to another via the SSH Steps plugin.
Can you show some pipeline example?
I would be very grateful for some help!
I have already tried:
def remote = [:]
remote.name = "Nginx"
remote.host = "1.2.3.4"
remote.allowAnyHosts = true
node {
    withCredentials([sshUserPrivateKey(credentialsId: 'Nginx_inst', keyFileVariable: 'identity', passphraseVariable: '', usernameVariable: 'myUser')]) {
        remote.user = myUser
        remote.identityFile = identity
        stage("SSH Transfer") {
            sshCommand remote: remote, command: "sudo cp /home/myUser/docs.zip /var/www/html"
            // sshPut remote: remote, from: 'docs.zip', into: '/var/www/html/'
            sshCommand remote: remote, command: "sudo unzip -tq /var/www/html/docs.zip"
        }
    }
}
### But I get an error:
Executing command on ****[1.2.3.4]: sudo cp /home/****/docs.zip /var/www/html sudo: false
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
com.jcraft.jsch.JSchException: USERAUTH fail
at com.jcraft.jsch.UserAuthPublicKey.start(UserAuthPublicKey.java:119)
at com.jcraft.jsch.Session.connect(Session.java:470)
..............
Finished: FAILURE
This is related to SSH keys. Your error occurs because of issues with the keys used when SSHing from Jenkins to the target server.
Have a look at "com.jcraft.jsch.JSchException: Auth fail" with working passwords
I have a Jenkinsfile which implements a pipeline, and in one of the stages, under a node, if I say unstash 'myfile', where will this 'myfile' be available on the node? My requirement is that I need to access this file and copy it to a known remote server (this remote server is not part of the Jenkins pool) as part of the Jenkinsfile script.
You can use SSH Pipeline Steps to copy a file to a remote server. Here is an example of how to send a file from the job workspace to a remote host:
remote = [:]
remote.name = "name"
remote.host = "remote_ip"
remote.allowAnyHosts = true
remote.failOnError = true
withCredentials([usernamePassword(credentialsId: 'credentials_name', passwordVariable: 'password', usernameVariable: 'username')]) {
remote.user = username
remote.password = password
}
sshPut remote: remote, from: 'myfile', into: 'folder_on_remote_host'
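Combined with the unstash from the question, a complete node block might look roughly like this (a sketch; the stash, credential, and path names are placeholders):

```groovy
node {
    // Restore the previously stashed content into this node's workspace
    unstash 'my_stash'

    def remote = [name: 'name', host: 'remote_ip', allowAnyHosts: true, failOnError: true]
    withCredentials([usernamePassword(credentialsId: 'credentials_name',
                                      passwordVariable: 'password',
                                      usernameVariable: 'username')]) {
        remote.user = username
        remote.password = password
        // Copy the unstashed file from the workspace to the remote host
        sshPut remote: remote, from: 'myfile', into: 'folder_on_remote_host'
    }
}
```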
I say unstash 'myfile', where this 'myfile' will be available on the node?
You don't do unstash 'myfile'; you do unstash 'my_stash', where my_stash is the name you used when you saved your stash previously. The stash can contain one file, or it can contain a whole directory tree. Its contents are defined at the moment you stash it (relative to ${WORKSPACE} on the node running stash), and it's unstashed in exactly the same way, relative to ${WORKSPACE} on the node running unstash.
The workspace is located according to your agent configuration (at my place, it's in /Users/jenkins/workspace/<a folder Jenkins creates>), but for all practical purposes -- as your steps on the node run in that folder too -- you can refer to it as ., e.g.
stage('stash') {
    agent { label 'one' }
    steps {
        script {
            sh "echo 1 > myfile.txt" // runs in $WORKSPACE, creates $WORKSPACE/myfile.txt
            stash name: "my_stash", includes: "myfile.txt" // relative to $WORKSPACE
        }
    }
}
stage('unstash') {
    agent { label 'two' }
    steps {
        script {
            unstash name: "my_stash" // runs in $WORKSPACE, creates $WORKSPACE/myfile.txt
            sh "cat ./myfile.txt"
        }
    }
}
I am working on a pipeline project with Jenkins and Git, and I was wondering if it is possible to establish an SSH connection credential with only a username and password, that is, without a private/public key pair.
I don't see anybody doing it, while it seems theoretically possible, so I'm asking the question.
Thanks in advance,
Taeith
You can, with SSH Pipeline Steps. Here's an example of how to use it with username/password credentials stored in Jenkins Credentials:
stage('SSH') {
    steps {
        script {
            remote = [:]
            remote.name = "name"
            remote.host = "host_name_or_ip"
            remote.allowAnyHosts = true
            remote.failOnError = true
            withCredentials([usernamePassword(credentialsId: 'my_credentials', passwordVariable: 'password', usernameVariable: 'username')]) {
                remote.user = username
                remote.password = password
                sshCommand remote: remote, command: "some_command"
            }
        }
    }
}
I am quite new to Jenkins, so maybe the question is obvious.
I have Jenkins on a Windows machine, and I need to run a command on a remote *nix machine, to which I have SSH access (by username/password). I have a pipeline, and using the SSH Steps plugin I can connect and execute the command, but I need to get the output of the command to go forward, and I couldn't find the correct way to do it.
def remote = [:]
remote.name = 'UAT'
remote.host = 'test.domain'
remote.user = 'username'
remote.password = 'pass'
remote.allowAnyHosts = true
stage('Remote SSH') {
    sshCommand remote: remote, command: "ls -ll"
}
Is it possible to do this with this plugin, or do I need to use another one? As I understand it, this plugin was created specifically for using SSH in pipeline scripts.
Try this:
def remote = [:]
remote.name = 'UAT'
remote.host = 'test.domain'
remote.user = 'username'
remote.password = 'pass'
remote.allowAnyHosts = true
stage('Remote SSH') {
    def commandResult = sshCommand remote: remote, command: "ls -ll"
    echo "Result: " + commandResult
}
It's not easy to figure out because it is not documented!
Keep in mind that if the sshCommand fails, it will not return anything.
You can work around this by running sshCommand with the failOnError: false flag and then processing the return value yourself.
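Putting that together, a sketch (the grep command and log path are placeholders):

```groovy
stage('Remote SSH') {
    // failOnError: false keeps the build going even if the remote command fails,
    // so the return value can be inspected here instead of aborting the stage
    def result = sshCommand remote: remote, failOnError: false,
            command: "grep -c ERROR /var/log/app.log"
    if (result) {
        echo "Output: ${result.trim()}"
    } else {
        echo "Command failed or produced no output"
    }
}
```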