Jenkins pipeline: kubectl command not found

I'm running Jenkins locally and this is the last stage of my Jenkinsfile (after following this tutorial):
stage('Deploy to K8s') {
    steps {
        sshagent(credentials: ['k8s_key']) {
            sh 'scp -r -o StrictHostKeyChecking=no localPath/deployment-remes-be.yaml <user>@<ip_address>:/opt/kubernetes-system/backend'
            script {
                try {
                    sh 'ssh <user>@<ip_address> kubectl apply -f /opt/kubernetes-system/backend/deployment-remes-be.yaml --kubeconfig=~/.kube/config'
                }
                catch(error) {
                }
            }
        }
    }
}
When I run the pipeline it completes without any blocking errors, but when I check the logs I can see this:
The copy before the apply command works. I have microk8s installed on the Debian server I'm deploying to, and if I run the apply command manually it works fine. I've created the .kube/config file as shown here, but passing the --kubeconfig flag doesn't make any difference. It also doesn't matter whether I use microk8s.kubectl; I always get the same message.
I have these plugins installed:
What can I do here to make the apply work from the pipeline?

In this situation, where the error is that the executable cannot be found in the PATH, trying the absolute path should be the first step. The shell step can be updated accordingly:
sh 'ssh <user>@<ip_address> /path/to/kubectl apply -f /opt/kubernetes-system/backend/deployment-remes-be.yaml --kubeconfig=~/.kube/config'
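For context, the usual reason kubectl is reported as not found here is that a non-interactive ssh session starts with a minimal PATH, so snap-installed binaries (microk8s typically lives under /snap/bin) are not on it. A quick sketch of how to check; the ssh target below reuses the question's placeholders:

```shell
# A non-interactive ssh session gets a minimal PATH, so binaries installed
# under e.g. /snap/bin (where microk8s usually places its tools) are often
# not found. `command -v` resolves a command name to its absolute path:
command -v ls
# Over ssh, the same check would reveal where (or whether) kubectl lives:
#   ssh <user>@<ip_address> 'command -v kubectl || echo not-in-PATH'
```

Whatever path that prints on the remote host is the one to hardcode in the sh step.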

Related

Jenkins job getting stuck on execution of docker image as the agent

I have installed Jenkins and Docker inside a VM. I am using a Jenkins pipeline project, and my declarative pipeline looks like this:
pipeline {
    agent {
        docker { image 'node:7-alpine' }
    }
    stages {
        stage('Test') {
            steps {
                echo 'Hello Nodejs'
                sh 'node --version'
            }
        }
    }
}
It is a very basic pipeline following this link: https://jenkins.io/doc/book/pipeline/docker/
When I try to build my Jenkins job, it prints Hello Nodejs but gets stuck at the next instruction, i.e. the execution of the shell command. After 5 minutes the job fails with this error:
process apparently never started in /var/lib/jenkins/workspace/MyProject#tmp/durable-c118923c
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
ERROR: script returned exit code -2
I don't understand why it is not executing the sh command.
If I change the agent to agent any, it executes the sh command.
I am not sure it will help, but I remember that the node image runs under the root account by default, while Jenkins uses its own UID when launching a container. So it's probably a permissions issue. Try adding the -u 0 argument:
agent {
    docker {
        image 'node:7-alpine'
        args '-u 0'
    }
}
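If running the container as root is undesirable (it can leave root-owned files in the workspace), a hedged alternative is to pass the host Jenkins user's numeric UID/GID instead. The 1000:1000 below is only an illustration; substitute the output of `id -u jenkins` and `id -g jenkins` on your host:

```groovy
// Hypothetical alternative to '-u 0': run the container as the same uid/gid
// as the Jenkins user on the host, so workspace files stay owned by jenkins.
agent {
    docker {
        image 'node:7-alpine'
        args '-u 1000:1000'   // replace with the real jenkins uid:gid
    }
}
```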

Jenkins pipeline: ssh to a server gets stuck on the job

I need to ssh to a server from a simple Jenkins pipeline and do a deploy, which is simply moving to a directory and doing a git fetch and some other commands (npm install among others). The thing is that when the Jenkins job sshes to the remote server it connects fine, but then it gets stuck and I have to stop it. I've now simplified the script to just ssh to the server and run pwd, to make it as easy as possible, but it still connects and then gets stuck until I abort. What am I missing? Here is the simple pipeline script, and the output is in a screenshot:
pipeline {
    agent any
    stages {
        stage('Connect to server') {
            steps {
                sh "ssh -t -t jenkins@10.x.x.xx"
                sh "pwd"
            }
        }
        stage('branch status') {
            steps {
                sh "git status"
            }
        }
    }
}
Jenkins executes each sh step as a separate shell script: the content is written to a temporary file on the Jenkins node and only then executed. Each command runs in a separate session and is not aware of the previous one, so neither the ssh session nor changes to environment variables will persist between the two steps.
More importantly, though, you are forcing pseudo-terminal allocation with the -t flag. That is pretty much the opposite of what you want to achieve, i.e. running shell commands non-interactively. Simply
sh "ssh jenkins@10.x.x.xx pwd"
is enough for your example to work. Placing the commands on separate lines would not work in a regular shell script either, regardless of Jenkins. You still need the private key available on the node, though; otherwise the job will hang, waiting for you to provide the password interactively. Normally you will want to use the SSH Agent Plugin to provide the private key at runtime:
script {
    sshagent(["your-ssh-credentials"]) {
        sh "..."
    }
}
For executing longer command sequences, see What is the cleanest way to ssh and run multiple commands in Bash?
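As a minimal local illustration of the point about separate sessions: chaining with && keeps the commands in one shell, so a cd carries over to the next command. The same pattern applies inside a single ssh invocation (e.g. `ssh jenkins@10.x.x.xx 'cd /some/dir && git fetch'`):

```shell
# Each Jenkins `sh` step is a fresh shell, so a `cd` in one step does not
# survive into the next step. Chaining with && keeps everything in one
# session; here the cd still applies when pwd runs:
cd /tmp && pwd
```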

How to run a docker-compose instance in jenkins pipeline

I've set up a home-based CI server for working with a personal project. Below you can see what happens for the branch "staging". It works fine; however, the problems with such a pipeline config are:
1) The only way to stop the instance seems to be to abort the build in Jenkins, which leads to exit code 143 and the build being marked red instead of green
2) If the machine reboots, I have to trigger the build manually
3) I suppose there should be a better way of handling this?
Thanks
stage('Staging') {
    when {
        branch 'staging'
    }
    environment {
        NODE_ENV = 'production'
    }
    steps {
        sh 'docker-compose -f docker-compose/staging.yml build'
        sh 'docker-compose -f docker-compose/staging.yml up --abort-on-container-exit'
    }
    post {
        always {
            sh 'docker-compose -f docker-compose/staging.yml rm -f -s'
            sh 'docker-compose -f docker-compose/staging.yml down --rmi local --remove-orphans'
        }
    }
}
So, what's the goal here? Are you trying to deploy to staging? If so, what do you mean by that? If Jenkins is to launch a long-running process (say, a Docker container running a webserver), then the shell command line must be able to start it and then have its exit status tell the Jenkins pipeline whether the start was successful.
One option is to wrap the docker-compose call in a script that executes it, checks the result, and exits with the appropriate exit code. Another is to use yet another automation tool to help (e.g. Ansible).
The first question remains: what are you trying to get Jenkins to do, and how would that work on the command line? If you can model the command line, then you can encapsulate it in a script file and have Jenkins start it.
Jenkins pipeline code looks like Groovy and is much like Groovy. This can make us believe that adding complex logic to the pipeline is a good idea, but doing so turns Jenkins into our IDE, which is hard to debug and a trap I've fallen into several times.
A somewhat easier approach is to have some other tool that you can easily test on the command line, and then have Jenkins build the environment in which to run that command-line process. Jenkins then handles what it is good at:
scheduling jobs
determining on which nodes jobs run
running steps in parallel
making the output pretty or easily understood by us carbon-based life forms.
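The wrapper-script option mentioned above can be sketched roughly as follows. The function name and the `sleep` stand-in are illustrative only; this is a pattern, not a definitive implementation:

```shell
#!/bin/sh
# Sketch: start a long-running process detached, verify it actually came up,
# and report the result to Jenkins through the exit status. Here `sleep 3`
# stands in for something like `docker-compose -f docker-compose/staging.yml up -d`.
start_and_verify() {
    "$@" &                       # launch in the background
    pid=$!
    sleep 1                      # give it a moment to fail early
    if kill -0 "$pid" 2>/dev/null; then
        echo "started (pid $pid)"
    else
        echo "failed to start" >&2
        return 1
    fi
}
start_and_verify sleep 3
```

With this shape, the Jenkins sh step returns promptly and its exit code reflects whether the service came up, instead of blocking on a foreground `docker-compose up`.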
I am using parallel stages.
Here is a minimal example:
pipeline {
    agent any
    options {
        parallelsAlwaysFailFast() // https://stackoverflow.com/q/54698697/4480139
    }
    stages {
        stage('Parallel') {
            parallel {
                stage('docker-compose up') {
                    steps {
                        sh 'docker-compose up'
                    }
                }
                stage('test') {
                    steps {
                        sh 'sleep 10'
                        sh 'docker-compose down --remove-orphans'
                    }
                }
            }
        }
    }
    post {
        always {
            sh 'docker-compose down --remove-orphans'
        }
    }
}

Unable to change a directory inside a Docker container through a Jenkins declarative pipeline

I'm trying to change the current directory using the dir step documented here: https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#code-dir-code-change-current-directory
I've edited my pipeline to resemble something like this:
pipeline {
    agent { dockerfile true }
    stages {
        stage('Change working directory...') {
            steps {
                dir('/var/www/html/community-edition') {
                    sh 'pwd'
                }
            }
        }
    }
}
It doesn't change the directory at all; instead it tries to create that directory on the host and fails with java.io.IOException: Failed to mkdirs: /var/www/html/community-edition
Using sh 'cd /var/www/html/community-edition' doesn't seem to work either. How do I change the directory in the container? Someone else seems to have had the same issue but had to change his pipeline structure to change the directory, which doesn't sound like a reasonable fix. Isn't the step already being invoked in the container? https://issues.jenkins-ci.org/browse/JENKINS-46636
I had the same problem yesterday. It seems to be a bug that causes dir() not to change the directory when used inside a container. I got it to work by executing the cd and pwd commands in a single step, like this:
sh '(cd /var/www/html/community-edition && pwd)'
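For what it's worth, the parentheses run the cd in a subshell, so the directory change is scoped to the chained command and doesn't leak into the calling shell. A small local demonstration:

```shell
# A subshell's cd applies only inside the parentheses; the parent shell's
# working directory is untouched afterwards.
cd /
(cd /tmp && pwd)   # the subshell is in /tmp here
pwd                # the parent shell is still in /
```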
I had the same issue, and this worked for me when I had "ws" in the Jenkinsfile pipeline:
stage('prepare') {
    steps {
        ws('/var/jenkins_home/workspace/pipeline#script/desiredDir') {
            sh ''
        }
    }
}

permission denied when executing the jenkins sh pipeline step

I have some trouble with this situation:
every time I create a new pipeline job (entitled "pipeline"), the sh step won't work, even with simple commands like ls or pwd, and it returns this log:
sh: 1: /var/jenkins_home/workspace/pipeline#tmp/durable-34c21b81/script.sh: Permission denied
Any suggestions?
I was getting a similar permission denied error after following the Jenkins pipeline tutorial for a Node project.
./jenkins/test.sh: Permission denied
The original pipeline Test stage looked like the following and returned that error:
stage('Test') {
    steps {
        sh './jenkins/test.sh'
    }
}
I found the following post: https://stackoverflow.com/a/61956744/9109504 and modified the Test stage to the following:
stage('Test') {
    steps {
        sh "chmod +x -R ${env.WORKSPACE}"
        sh './jenkins/test.sh'
    }
}
That change fixed the permission denied error.
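A possibly more durable fix is to record the executable bit in git itself, so every fresh checkout already has +x and the chmod step becomes unnecessary. A sketch (the repo here is a throwaway; the path mirrors the question):

```shell
# Store the executable bit in the git index so it survives clone/checkout
# on the Jenkins node. Demonstrated in a temporary repository:
repo=$(mktemp -d)
cd "$repo" && git init -q
mkdir -p jenkins
printf '#!/bin/sh\necho ok\n' > jenkins/test.sh
git add jenkins/test.sh
git update-index --chmod=+x jenkins/test.sh
git ls-files --stage jenkins/test.sh    # mode 100755 = executable
```

After committing and pushing, the checkout on the Jenkins node gets the file with mode 755 and the sh step can run it directly.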
I guess you use
stage(name) {
    sh './runSomething'
}
Jenkins always uses the user jenkins for running scripts. There are some possibilities:
Jenkins is running as a different user; maybe you started it as some other user.
Something went wrong when installing Jenkins; check that you have a jenkins user.
From the terminal, you should give this file permission:
sudo chmod -R 777 ./test.sh
When you push the file it will keep these permissions under the hood, and this way Jenkins will be able to execute it.
We just need to add "sudo" before the path. I have tested it, and it works perfectly.
stages {
    stage('Hello') {
        steps {
            sh 'sudo /root/test.sh'
        }
    }
}
