I am trying to execute a curl command on a remote server from a Jenkins declarative pipeline, but the command runs on the Jenkins node instead of the remote server.
pipeline {
    agent {
        label '<node label>'
    }
    stages {
        stage('Test ssh') {
            steps {
                script {
                    sh '''
                        ssh -t user@test << ENDSSH
                        echo "ssh to server"
                        cd /opt/apps
                        url=$(curl -H 'X-JFrog-Art-Api: Artifactory_token' 'Artifactory_url' | jq -r '.uri')
                        echo $url
ENDSSH
                    '''
                }
            }
        }
    }
}
I am getting "curl command not found". Can anyone suggest a solution for this?
Just put the full path to curl in the command, e.g. /bin/curl. Or change the command to run set and check the PATH, then figure out why /bin is not in the PATH, or whether curl is even installed there.
Note: a remote ssh command does not go through the same login sequence as an interactive remote shell, so the PATH can differ.
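If that is the cause, here is a minimal diagnostic sketch reusing the heredoc from the question (the host and Artifactory details are the question's own placeholders; quoting ENDSSH keeps $PATH from being expanded on the Jenkins node):
sh '''
ssh -t user@test << 'ENDSSH'
# show what the non-interactive remote shell actually sees
echo "PATH on remote: $PATH"
command -v curl || echo "curl is not on this PATH"
# if curl is installed but not on the PATH, call it by absolute path
/bin/curl --version
ENDSSH
'''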
I am trying the following on Jenkins:
steps {
    script {
        sshagent (credentials: ['creds']) {
            sh '''
                ssh -o StrictHostKeyChecking=no -tt jenkins@${IP} "
                cd
                "
            '''
        }
    }
}
Obviously I am just trying to get the current directory. As output I am getting: [2JConnection to IP closed
Has anyone seen this before?
The issue was the -tt option; I dropped it and it worked just fine. Also, when you script Windows commands like this, they have to be written on one line for some reason.
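For reference, a minimal sketch of the working variant (same credentials ID and IP variable as above; the remote command is quoted and kept on one line):
sshagent (credentials: ['creds']) {
    sh '''
        ssh -o StrictHostKeyChecking=no jenkins@${IP} "pwd"
    '''
}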
I have two AWS Ubuntu instances: 1st-server and 2nd-server.
Below is my Jenkins pipeline script, which builds a Docker image, runs the container on 1st-server, and pushes the image to a Docker Hub repo. That part is working fine.
I want to pull the image and deploy it on 2nd-server.
When I ssh to the 2nd server through the pipeline script below, it logs in to 1st-server instead, even though the ssh credential ('my-ssh-key') belongs to 2nd-server. I'm confused how it is logging in to 1st-server; I checked with a touch command and the file gets created on 1st-server.
pipeline {
    environment {
        registry = "docker-user/docker-repo"
        registryCredential = 'docker-cred'
        dockerImage = ''
    }
    agent any
    stages {
        stage('Cloning Git') {
            steps {
                git url: 'https://github.com/git-user/jenkins-flask-tutorial.git/'
            }
        }
        stage('Building image') {
            steps {
                script {
                    sh "sudo docker build -t flask-app-one ."
                    sh "sudo docker run -p 5000:5000 --name flask-app-one -d flask-app-one"
                    sh "docker tag flask-app-one:latest docker-user/myrepo:flask-app-push-test"
                }
            }
        }
        stage('Push Image') {
            steps {
                script {
                    docker.withRegistry( '', registryCredential ) {
                        sh "docker push docker-user/docker-repo:flask-app-push-test"
                        sshagent(['my-ssh-key']) {
                            sh 'ssh -o StrictHostKeyChecking=no ubuntu@2ndserver && cd /home/ubuntu/ && sudo touch test-file && docker pull docker-user/docker-repo:flask-app-push-test'
                        }
                    }
                }
            }
        }
    }
}
My question is: how do I log in to the 2nd server and pull the Docker image there through the Jenkins pipeline script? Help me out with where I'm going wrong.
This is more of an alternative than a solution. You can pass the remote commands as part of the ssh invocation. This executes the command on the server and then disconnects.
ssh name@ip "ls -la /home/ubuntu/"
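Applied to the pipeline above, that means quoting the remote commands so they actually run on 2nd-server instead of being chained locally with &&. A sketch, assuming the same credential ID and image tag as in the question, and that docker on 2nd-server also needs sudo:
sshagent(['my-ssh-key']) {
    sh '''
        ssh -o StrictHostKeyChecking=no ubuntu@2ndserver "
            cd /home/ubuntu &&
            sudo docker pull docker-user/docker-repo:flask-app-push-test
        "
    '''
}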
Does anybody know why:
…
steps
{
    script
    {
        sshagent(credentials: ['jenk'])
        {
            sh "git remote show …"  // This does not work!
            bat "git remote show …" // This works??
        }
    }
}
...
The 'jenk' credentials are managed via Jenkins->credentials->System->global credentials
EDIT:
Sorry, I forgot the error message:
Host key verification failed
fatal: Could not read from remote repository
Jenkins was configured using CYGWIN_NT-6.3-WOW (i686 Cygwin) for the sh commands.
In the end, these commands cleared everything up:
if (isUnix())
{
    echo "Jenkins runs on Linux"
}
else
{
    echo "Jenkins runs on Windows"
}
echo "show shell kernel version (uname -a): "
def res = sh (script: "uname -a", returnStdout: true)
echo "${res}" // => CYGWIN_NT-6.3-WOW...
res2 = sh (script: "ls -al ~/.ssh", returnStdout: true)
echo "${res2}"
So the solution to the problem above is to add the ssh keys to Cygwin.
If you need to access the credentials themselves, you could do this:
https://codurance.com/2019/05/30/accessing-and-dumping-jenkins-credentials/
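As an alternative to copying keys into Cygwin's ~/.ssh, here is a sketch using the Credentials Binding plugin to hand the same 'jenk' key to the sh step explicitly (the remote name and the use of GIT_SSH_COMMAND are assumptions on my side; StrictHostKeyChecking=no also sidesteps the host key verification error):
withCredentials([sshUserPrivateKey(credentialsId: 'jenk', keyFileVariable: 'SSH_KEY')]) {
    // point git at the Jenkins-managed key instead of the key in the Cygwin home directory
    sh 'GIT_SSH_COMMAND="ssh -i $SSH_KEY -o StrictHostKeyChecking=no" git remote show origin'
}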
I am trying to ssh into a remote host and then execute certain commands in the remote host's shell. Following is my pipeline code.
pipeline {
    agent any
    environment {
        // comment added
        APPLICATION = 'app'
        ENVIRONMENT = 'dev'
        MAINTAINER_NAME = 'jenkins'
        MAINTAINER_EMAIL = 'jenkins@email.com'
    }
    stages {
        stage('clone repository') {
            steps {
                // cloning repo
                checkout scm
            }
        }
        stage('Build Image') {
            steps {
                script {
                    sshagent(credentials : ['jenkins-pem']) {
                        sh "echo pwd"
                        sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no'
                        sh "echo pwd"
                        sh 'sudo -i -u root'
                        sh 'cd /opt/docker/web'
                        sh 'echo pwd'
                    }
                }
            }
        }
    }
}
But upon running this job it executes sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no' successfully, but it stops there and does not execute any further commands. I want to execute the commands written after the ssh command inside the remote host's shell. Any help is appreciated.
I would try something like this:
sshagent(credentials : ['jenkins-pem']) {
    sh "echo pwd"
    sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no "echo pwd && sudo -i -u root && cd /opt/docker/web && echo pwd"'
}
I resolved this issue with:
script
{
    sh """ssh -tt login@host << EOF
your command
exit
EOF"""
}
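A sketch of how that heredoc approach maps onto the question above (the host and directory are taken from the question; quoting EOF prevents expansion on the Jenkins node, and the docker command inside is only an assumed example of "your command"):
sshagent(credentials: ['jenkins-pem']) {
    sh '''ssh -tt -o StrictHostKeyChecking=no ubuntu@xx.xxx.xx.xx << 'EOF'
cd /opt/docker/web
pwd
sudo -n docker ps
exit
EOF'''
}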
stage("DEPLOY CONTAINER"){
steps {
script {
sh """
#!/bin/bash
sudo ssh -i /path/path/keyname.pem username#serverip << EOF
sudo bash /opt/filename.sh
exit 0
<< EOF
"""
}
}
}
There is a better way to run commands on a remote host using SSH. I know this is a late answer, but I just explored this, so I would like to share it; it should help others resolve this problem easily.
I found a helpful write-up on how to run multiple commands on a remote host using SSH, including how to run them conditionally.
Going through it, I found the syntax:
ssh username@hostname "command1; command2; commandN"
Now, how do you run commands inside a remote host using SSH in a Jenkins pipeline?
Here is the solution:
pipeline {
    agent any
    environment {
        /*
        define your commands in a variable
        */
        remoteCommands =
            """java --version;
            java --version;
            java --version """
    }
    stages {
        stage('Login to remote host') {
            steps {
                sshagent(['ubnt-creds']) {
                    /*
                    Pass the variable as an argument to the ssh command
                    */
                    sh 'ssh -tt username@hostname $remoteCommands'
                }
            }
        }
    }
}
First, and optionally, you can define a variable that holds all commands separated by semicolons and then pass it as an argument to the ssh command.
Alternatively, you can pass your commands directly to the ssh command:
sh "ssh -tt username@hostname 'command1; command2; commandN'"
I have used it in my code and it's working great!
Happy Learning :)
I am using a Jenkins pipeline with a Groovy script to download a zip file on a Jenkins slave machine.
Following is my code:
pipeline {
    agent { label '<my slave label>' }
    stages {
        stage('Download') {
            steps {
                script {
                    def url = "<server url>"
                    def processDownload = ['bash', '-c', "curl -g -k --noproxy \"*\" -o <output-dir> \"${url}\""].execute()
                    processDownload.waitFor()
                    def processUnzip = ['bash', '-c', "7z e lwbs.zip"].execute()
                    processUnzip.waitFor()
                }
            }
        }
    }
}
I am getting the following error:
Warning: Failed to create the file
Warning: output-dir/newFile.zip: No such file or directory
I have checked the following:
- When I run the same curl command from a command prompt, it runs successfully.
- Proper user permissions are granted, so Jenkins is allowed to write to this directory.
- The directory exists and there are no spaces in the directory path.
- There is enough disk space available on the slave.
- The server URL and certificates are correct.
- There are many similar SO questions, but none of them mentions this issue on a Jenkins slave.
Is there anything I am missing?
Any help is appreciated. Thank you.
After long hours of research, I found out that a bash command triggered via the following Groovy call
def processDownload = ['bash', '-c', "curl -g -k --noproxy \"*\" -o <output-dir> \"${url}\""].execute()
will always be executed on the master by Jenkins.
When I changed it to a normal sh step, Jenkins correctly executed it on the slave machine. Moreover, to unzip I used the Pipeline Utility Steps plugin, which provides an unzip step that also runs on the slave. Following is my working code:
stage('Download') {
    steps {
        script {
            url = "<server url>"
            sh "curl -k --noproxy \"*\" -o \"<output-dir>\" \"${url}\""
            unzip(dir: '', glob: '', zipFile: 'fileName.zip')
        }
    }
}
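To illustrate the distinction this answer relies on, here is a small self-contained sketch (the label is a placeholder): the Groovy .execute() call runs inside the controller (master) JVM, while the sh step runs on the allocated agent, so the two hostnames will differ.
pipeline {
    agent { label '<my slave label>' }
    stages {
        stage('Where does it run?') {
            steps {
                script {
                    // Groovy process execution happens on the Jenkins master JVM
                    // (may require script approval when running in the Groovy sandbox)
                    def masterHost = ['hostname'].execute().text.trim()
                    echo "Groovy .execute() ran on: ${masterHost}"
                    // The sh step always runs on the agent assigned to this stage
                    sh 'echo "sh step ran on: $(hostname)"'
                }
            }
        }
    }
}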