Run java process from Jenkins on remote host

I am writing a Jenkins pipeline which, at the end, will trigger execution of a Java process on a remote host. Currently this last stage looks like:
stage('end') {
    sh '''
        ssh jenkins@xxx.xxx.xxx.xxx java -jar /opt/stat/stat.jar
    '''
}
The process starts successfully on the remote machine, but the Jenkins job never ends. Is there a flag to tell the job it must complete?

It seems your java command does not exit but keeps running, and that is probably the desired behavior. What about putting the process in the background on the remote machine?
stage('end') {
    sh '''
        ssh jenkins@xxx.xxx.xxx.xxx "java -jar /opt/stat/stat.jar &>/dev/null &"
    '''
}
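If the process should also survive the SSH session ending, a common variant is to wrap it in nohup (a sketch, reusing the paths from the question):
stage('end') {
    sh '''
        # nohup detaches the process from the SSH session entirely
        ssh jenkins@xxx.xxx.xxx.xxx "nohup java -jar /opt/stat/stat.jar >/dev/null 2>&1 &"
    '''
}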

Related

How to execute part of Jenkins pipeline stage outside of Docker container?

We have a containerized Jenkins pipeline, and for one of the stages we want part of the stage to be executed in the container and part on the Jenkins master (which is Windows in our case):
pipeline {
    agent {
        docker {
            label "<node-name>"
            image "<docker-image-path>"
        }
    }
    stages {
        stage('Testing') {
            steps {
                script {
                    // This below part will be executed on container
                    println "This below part will be executed on container"
                    sh '''pwd
                    hostname -i
                    '''
                    // Now want to execute below code on master which is Windows
                    println "Now want to execute below code on master which is Windows"
                    node('master') {
                        bat 'dir'
                    }
                }
            }
        }
    }
}
The part to be executed in the container runs successfully, but the code to be executed on the Windows Jenkins master fails with:
Cannot run program "docker" (in directory "C:\Jenkins\workspace\TestDocker"): CreateProcess error=2, The system cannot find the file specified
EDIT: When I have Docker installed on the Windows machine, the above error is not thrown, but the build hangs there forever.
Could you please help me with how I can execute code on the node or in the container on demand?
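One common way to do this (a sketch, not from the original thread; stage and label names are illustrative) is to declare agent none at the pipeline level and pick an agent per stage, so only the stages that need the container run in it:
pipeline {
    agent none
    stages {
        stage('In container') {
            // Runs inside the container on the Linux node
            agent {
                docker {
                    label "<node-name>"
                    image "<docker-image-path>"
                }
            }
            steps {
                sh 'hostname -i'
            }
        }
        stage('On master') {
            // Runs directly on the Windows master, outside Docker
            agent { label 'master' }
            steps {
                bat 'dir'
            }
        }
    }
}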

Host key verification failed on jenkins

Context: I am running a shell script on a remote machine through Jenkins, but while running I get a "Host key verification failed." error in the Jenkins log.
Code snippet:
#!/bin/sh
# Shell script for running the script from Jenkins
# Performance Engineering Team
triggerPerformanceTest(){
    echo "Starting the Jmeter script"
    ssh -tt -i Test.ppk ubuntu@testserver << EOF
cd apache-jmeter-3.1/bin/
JVM_ARGS="-Xms512m -Xmx25000m" ./jmeter.sh -n -t /home/ubuntu/JMeter/Test.jmx
exit
EOF
    echo "Test successfully executed"
}
triggerPerformanceTest
I can run the same command from my local machine through a code editor (refer to the attached screenshot).
Could someone help me resolve this issue? Note: I cannot access the Jenkins server, so I am not able to do anything there.
Your remote machine's host key is not known to Jenkins; it is not present in the ~/.ssh/known_hosts file.
This is a security measure to prevent man-in-the-middle attacks.
Someone has to add it for you, or you will not be able to ssh to the remote machine.
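If someone with shell access to the Jenkins machine can do it for you, ssh-keyscan is the usual way to add the key (a sketch; testserver stands in for your remote host):
# Run once on the Jenkins machine, as the user the builds run under.
# Appending the host's public key to known_hosts lets later ssh calls
# verify the host; compare the fingerprint out-of-band to be safe.
ssh-keyscan -t rsa,ecdsa,ed25519 testserver >> ~/.ssh/known_hosts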

How to run docker container in a remote machine

I am trying to run this Jenkins pipeline code via Docker. I am using an AWS EC2 instance (ec2-user) as the VM here. This code is working fine, but...
node {
    stage('SCM CHECKOUT') {
        git 'https://bitbucket.org/rajesh212/myapp.git'
    }
    stage('MVN BUILD') {
        def mvnHome = tool name: 'maven', type: 'maven'
        sh "${mvnHome}/bin/mvn clean package"
    }
    stage('DEPLOYMENT VIA DOCKER') {
        def customImage = docker.build("image:${env.BUILD_ID}")
        docker.image("image:${env.BUILD_ID}").withRun('-p 9090:8080') { sleep 10000 }
    }
}
If I do not add the sleep command, the job runs successfully, but my Docker container starts and stops immediately, i.e. I am not able to see the output. How do I solve this problem?
Also, I want to run this Docker image on a remote machine. How do I do that?
In order to run on a remote server, you must use the withServer command.
As for the container stopping, try changing the withRun call to withRun('-d -p 9090:8080').
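A minimal sketch of withServer from the Docker Pipeline plugin (the TCP endpoint and the credentials ID 'docker-server-creds' are placeholders for your own setup):
stage('DEPLOYMENT VIA DOCKER') {
    // Point the Docker client at the remote daemon for everything in this block
    docker.withServer('tcp://REMOTE_HOST:2376', 'docker-server-creds') {
        def customImage = docker.build("image:${env.BUILD_ID}")
        // run() starts the container detached and returns a handle,
        // so the pipeline does not block on it
        customImage.run('-p 9090:8080')
    }
}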
If you are using declarative pipelines, try this SSH command. As a prerequisite, you need to set up a key pair that allows Jenkins to SSH into the remote server. A dedicated SSH key pair for deployment is recommended for security reasons:
stage('Deploy to Production') {
    steps {
        sh 'ssh -i path/to/deploy_private_key user@DNS_REMOTE_SERVER "docker run -d REGISTRY/YOUR_DOCKER_IMAGE:TAG"'
    }
}
Use the -d parameter to run the container in detached mode.
Hope it helps.

how to run docker commands inside jenkins pipeline jobs

In my Manage Jenkins > Global Tool Configuration, I have already configured a tool called "docker" as follows:
name: docker
install automatically: CHECKED
docker version: latest
Then all I have in my Jenkinsfile is the following and nothing else:
node {
    DOCKER_HOME = tool "docker"
    sh """
        echo $DOCKER_HOME
        ls $DOCKER_HOME/bin/
        $DOCKER_HOME/bin/docker images
        $DOCKER_HOME/bin/docker ps -a
    """
}
I get an error like this "Cannot connect to the Docker daemon. Is the docker daemon running on this host?".
Following is the full console log:
Started by user Syed Rakib Al Hasan
[Pipeline] node
Running on master in /var/jenkins_home/workspace/helloDocker
[Pipeline] {
[Pipeline] tool
[Pipeline] sh
[helloDocker] Running shell script
+ echo /var/jenkins_home/tools/org.jenkinsci.plugins.docker.commons.tools.DockerTool/docker
/var/jenkins_home/tools/org.jenkinsci.plugins.docker.commons.tools.DockerTool/docker
+ ls /var/jenkins_home/tools/org.jenkinsci.plugins.docker.commons.tools.DockerTool/docker/bin/
docker
+ /var/jenkins_home/tools/org.jenkinsci.plugins.docker.commons.tools.DockerTool/docker/bin/docker images
Cannot connect to the Docker daemon. Is the docker daemon running on this host?
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE
How do I ensure that the Docker daemon/service is running before my pipeline reaches the line that runs Docker commands?
Is there any other native docker-build-step plugin way to achieve what I am doing here, like docker ps -a, docker images, or docker build -t?
Some assumptions: let's say my chosen node does not already have Docker/docker-engine installed or running on the host machine. That's the purpose of the tool command: to automatically install Docker on the node if it is not already there.
This Jenkins plugin is for the Docker client; I'd solve (work around) this by:
- setting up Jenkins slaves where the Docker daemon is reachable, and adding a label to them
- setting up a housekeeping job which fails if the Docker daemon is not reachable (so we can notify the infra team without the QA having to figure out and escalate the problem)
- assigning jobs which assume the Docker daemon to be reachable to this label (see the sketch after this list)
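A minimal sketch of that last step, assuming the label is called docker:
// Pin jobs that need the daemon to agents carrying the (hypothetical) label
node('docker') {
    sh 'docker ps -a'
}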
I hope it helps, and I'm curious if any of you have a better solution!

How to have all Jenkins slave tasks executed with nice?

We have a number of Jenkins jobs which may get executed on Jenkins slaves. Is it possible to globally set the nice level of Jenkins tasks, to make sure that all Jenkins tasks get executed at a higher nice level?
Yes, that's possible. The "trick" is to start the slave agent with the proper nice level already; all Jenkins processes running on that slave will inherit that.
Jenkins starts the slave agent via ssh, effectively running a command like
cd /path/to/slave/root/dir && java -jar slave.jar
On the Jenkins node config page, you can define a "Prefix Start Slave Command" and a "Suffix Start Slave Command" to have this run under nice. Set them as follows:
Prefix Start Slave Command: nice -n -10 sh -c '
Suffix Start Slave Command: '
With that, the slave startup command becomes
nice -n -10 sh -c 'cd "/path/to/slave/root/dir" && java -jar slave.jar'
This assumes that your login shell is a Bourne shell. For csh, you will need a different syntax. Also note that this may fail if your slave root path contains blanks.
I usually prefer to "Launch slave via execution of command on the Master" and invoke ssh myself from within a shell wrapper. Then you can select the cipher and client of your choice, and setting niceness can also be done without Prefix/Suffix kludges and without whitespace pitfalls.
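A sketch of such a wrapper (the host name and paths are placeholders; note that a negative niceness normally requires root):
#!/bin/sh
# Configured as the launch command on the node's config page;
# Jenkins talks to the agent over this command's stdin/stdout.
exec ssh build-host "cd /path/to/slave/root/dir && exec nice -n -10 java -jar slave.jar"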
