Set up Jenkins to monitor an external job

I read the part of the Jenkins wiki that covers setting up a remote job to be monitored by a Jenkins instance. However, the documentation is confusing as it doesn't tell me what to configure on the Jenkins machine or the remote machine (the one that does the job).
Further, the documentation mentions Java commands that can be fired directly and others that need a servlet container. Do I have to install a servlet container on the remote machine?
Maybe it's all there, but to me it reads like a mix of two different documents. Can you please clarify:
What do I need to do on the remote machine?
What do I need to do on the Jenkins machine?
Thank you.

In Jenkins, you need to create a job using the "Monitor an external job" option. Give this a name, for example "nightly-backup".
On the machine where the external job is running, you need Java installed and some basic Jenkins JAR files, so that the job results can be sent to Jenkins.
As the wiki page says, on some versions of Debian or Ubuntu you can do this with:
sudo apt-get install jenkins-external-tool-monitor
Otherwise, you have to copy a bunch of JARs manually — i.e. those listed on the wiki page — to your remote machine.
Once you have the JARs available on your remote machine, you can execute whichever command you like there, so long as you prefix it with some Jenkins information: where to find the Jenkins installation, the main Java JAR, and the job name:
JENKINS_HOME=http://my-jenkins/ java -jar jenkins-core-*.jar nightly-backup ./backup.sh --nightly /home
Where http://my-jenkins/ is the base URL to Jenkins, nightly-backup matches the name of the "Monitor an external job" you created in Jenkins, and ./backup.sh --nightly /home is the command you wish to run.
The output of this ./backup.sh command will show up in Jenkins automatically once it's complete.

It looks like this is now called "jenkins-external-job-monitor", so you'd type:
sudo apt-get install jenkins-external-job-monitor

Related

Jenkins call script after build success

I want Jenkins to call a shell script in another Docker container.
I don't know whether Jenkins can do this, or what the best way to do it is.
On one machine, I have these containers:

Service    | IP (Docker network) | Note
ERP system | 172.74.42.2         | Odoo 14 ERP system that works with many plugins. If a plugin gets new functions or bug fixes, the system needs to be restarted after the pull, and then the plugin updated. (I wrote a shell script that runs over SSH after commits are pushed.)
Jenkins    | 172.74.42.3         | Here I want to call the shell script after the build succeeds.

Jenkins is working, so the connection between GitHub and Jenkins is good.
I tried writing shell commands in the Jenkinsfile, but I think this is a clumsy way to do it...
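A minimal scripted-pipeline sketch of this idea, assuming the Jenkins container can reach the ERP container over SSH with key-based authentication already set up; the SSH user and script path below are hypothetical, and the IP is the one from the table above:

node {
    checkout scm

    // build/test steps; the ssh call below only runs if these succeed
    sh 'mvn -B clean package'

    // call the plugin-update script over SSH; 172.74.42.2 is the ERP container
    // from the table above, the user and script path are hypothetical
    sh 'ssh -o StrictHostKeyChecking=no deploy@172.74.42.2 /opt/scripts/update-plugins.sh'
}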

Pass binary from Jenkins host to agent

Can you pass a binary from a Jenkins host to an agent?
I've got Jenkins running in Kubernetes, and the terraform plugin installed on my Jenkins master with the binary located at /var/jenkins_home/tools/org.jenkinsci.plugins.terraform.TerraformInstallation/terraform/terraform
I would like to pass this to my Jenkins agent by configuring my pod template and mounting the host volume path /var/jenkins_home/tools/org.jenkinsci.plugins.terraform.TerraformInstallation/terraform/terraform to the agent's path /usr/bin/terraform
But this doesn't seem to work as expected.
When I exec into the agent and run terraform version, I get the error bash: terraform: command not found, indicating that it doesn't have the binary.
I can see a terraform directory mounted in /usr/bin, but without the binary. What I expect is for terraform to be installed on the agent. But my thinking might be incorrect here.
Is it possible to do this? Does anyone have any experience with this?
As David Maze mentioned, a binary from Jenkins needs to be manually installed on every node, which can be difficult to manage. However, you can set up Jenkins to run pipeline steps inside a container whose image contains the tools you need, which simplifies this kind of case.
Read more: execution-env-jenkins.
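For a Jenkins running in Kubernetes, a minimal scripted-pipeline sketch of that idea using the Kubernetes plugin; the image tag and container name below are assumptions, not from the question:

podTemplate(containers: [
    // public Terraform image, so the binary ships inside the container (tag is an assumption)
    containerTemplate(name: 'terraform', image: 'hashicorp/terraform:1.5', command: 'cat', ttyEnabled: true)
]) {
    node(POD_LABEL) {
        container('terraform') {
            checkout scm
            // the terraform binary is already on the PATH inside this container
            sh 'terraform version'
            sh 'terraform init -input=false && terraform plan'
        }
    }
}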
One alternative is to use the Slave Setup plugin. We use it to install and configure internal tools on nodes based on labels. A lot less hassle than Malgorata's (and our previous) manual-copy approach.
Not sure how well it works with Kubernetes, as that's not part of our configuration.

Steps to run Test framework in Docker and Jenkins

Background:
I am a newbie to docker.
I have 2 automation frameworks on my local PC - one for mobile and the other for a web application. I have integrated the test frameworks with Jenkins.
Both test frameworks have open Jar dependencies mentioned in Maven pom.xml.
Now I want my tests to run in a Docker container when I click "run" on the Jenkins job.
Can anyone please give me the steps to:
Configure Docker in this completely integrated framework
Push my dependencies into Docker
Integrate Jenkins and Docker
Run the web and mobile app tests in Docker when the Jenkins job is clicked
I'm not a Jenkins professional, but from my experience, there are many possible setups here:
Assumptions:
By "Automation Framework", I understand that there is some java module (built by maven, I believe for gradle it will be pretty much the same) that has some tests that in turn call various APIs that should exist "remotely". It can be HTTP calls, working with selenium servers and so forth.
Currently, your Jenkins job probably looks like this (it doesn't really matter whether it's an "old-school" step-by-step job definition or a Groovy script / pipeline):
Checkout from GIT
run mvn test
publish test results
If so, you need to prepare a docker image that will run your test suite (preferably with maven) to take advantage of surefire reports.
So you'll need to build this Docker image once (see the docker build command) and make it available in a private repository / Docker Hub, depending on what your organization prefers. Technically, for this Docker image you can consider a Java image as a base image, get Maven (download, unzip, and configure it), then issue the git pull/clone command. You might want to pass credentials as environment variables to the docker process itself (see the '-e' flag).
The main point here is that maven inside the docker image will run the build, so it will resolve the dependencies automatically (you might want to configure custom repositories if you have them in settings.xml of maven). This effectively answers the second question.
One subtle point is the test results, which should somehow be shown in Jenkins:
You might want to share the volume with surefire-results folder with the Jenkins "host machine" so that Jenkins's plugins that are supposed to show the results of tests will work. The same idea is applicable if you're using something like allure reports, spock reports and so forth.
Now when the image is ready, the integration with Jenkins might be as simple as running a docker run command and waiting until it's done. So now the Jenkins job will look like this (see the sketch after these two steps):
docker run pre-defined image -e <credentials for git>
show reports
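A minimal scripted sketch of that kind of job; the image name, credential id, and mounted report path below are placeholders, not something from the answer:

node {
    withCredentials([string(credentialsId: 'git-token', variable: 'GIT_TOKEN')]) {
        // run the pre-built test image; mount the surefire output directory
        // so the report plugins on the Jenkins host can read the results
        sh 'docker run --rm -e GIT_TOKEN=$GIT_TOKEN ' +
           '-v "$WORKSPACE/reports:/app/target/surefire-reports" ' +
           'my-registry/test-suite:latest'
    }
    // publish the surefire results produced inside the container
    junit 'reports/*.xml'
}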
This is one example of possible integration.
One slightly different option is running docker build as a job definition. This might be beneficial if for each build that image should be significantly different but it will make the build slower.
The following approach can be followed to achieve your goal:
Create a Dockerfile with all your setup as well as the dependencies (refer).
Install the Docker plugin on Jenkins to integrate Docker support (refer).
Use the Jenkinsfile approach to pull the Docker image (or build it from the Dockerfile) and run the tests within Docker.
Below is sample code, just for reference:
node {
    checkout scm
    docker.withRegistry('https://registry.example.com', 'credentials-id') {
        // build the image from the Dockerfile in the workspace
        def customImage = docker.build("my-image")
        // run the tests inside the freshly built container
        customImage.inside {
            // run inside the container
            sh 'run test'
        }
    }
}

Integrating Jenkins with Gitlab

I need to set up a build configuration in Jenkins so that whenever a build is triggered, I get my latest scripts from GitLab, copy them to the target systems, and run the script on the target.
I couldn't find any relevant info on integrating GitLab with Jenkins. Are there any specific plugins that I could use?
I am using Jenkins version 2.158
Step-by-step procedure for doing what you are looking for:
Add the location of the script from GitLab.
Run the script on the target machine.
When the job builds, you will get the code at the root (./) of the job's workspace. Copying and running the script on the target machine can be done with remote script execution. The following are the common cases for running a script on a remote machine:
Windows (Jenkins) to Windows - use psexec.exe
Windows (Jenkins) to Linux - use plink.exe, which is command-line PuTTY
Linux (Jenkins) to Linux - use scp and ssh
Linux (Jenkins) to Windows - use Ansible for Windows
E.g.:
$ scp script.sh remote_username@10.10.0.2:/remote/directory
$ ssh -t remote_username@10.10.0.2 /remote/directory/script.sh
All the best.
Integration between a Git repository manager (GitHub, GitLab, Bitbucket, etc.) and Jenkins involves the following steps:
A developer pushes some source code (Java, PHP, Node.js, etc.) to the Git repository manager.
The Git repository manager detects this event and notifies some public HTTP endpoint in your Jenkins. Currently a webhook is the most recommended way to implement this notification.
Jenkins receives the HTTP POST request (from Bitbucket, for example) and, using some plugin or configuration, tries to determine or extract the basic parameters such as branch name, commit author, commit message, technology, etc.
With the extracted parameters, Jenkins launches a preconfigured job. This job uses the previously extracted values to build, compile, zip, install, or do whatever is necessary to start up your application (a minimal sketch of such a job follows below).
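A minimal scripted-pipeline sketch of such a webhook-triggered job; the Maven build step and the target host/paths are placeholders, not something from the answer:

node {
    // the webhook/SCM configuration determines which branch and commit get checked out
    checkout scm

    // build or package the application (placeholder build tool)
    sh 'mvn -B clean package'

    // copy the result to the target machine and run the deployment script (placeholder host and paths)
    sh 'scp target/app.jar deploy@target-host:/opt/app/'
    sh 'ssh deploy@target-host /opt/app/restart.sh'
}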
If you want to implement this flow, check this post:
https://jrichardsz.github.io/devops/devops-with-git-and-jenkins-using-webhooks
Also, if you need it, I will gladly show you a basic integration using some Git repository manager and Jenkins. Just contact me.

Is it possible to integrate SonarQube, Jenkins and GitLab (all in dockers)?

Currently, I am working in a quality process so as to ensure that the code is acceptable. For that, I'm integrating Jenkins, SonarQube and GitLab, which are running in different servers (actually they are in different docker containers).
The idea is to check the code with SonarQube every time code is pushed to GitLab, and to block commits, merges, and so on, if SonarQube has not passed.
I have already integrated Jenkins with SonarQube, but Jenkins checks the code inside its workspace, so imagine a situation where a developer needs to push his changes from his laptop.
My conceptual question is simple: Is it possible to integrate these technologies in order to do this? And, if the question is yes, which steps are necessary?
PS: I don't need to see code, configuration files, and so on. I just need something like:
Configure SonarQube to work with Jenkins
Write a script to copy that file into that folder,
...
First, "in dockers" means each tool is in its own container.
They only need to see each other through the network, which is where a Docker Engine in Swarm mode comes in.
Second "configure Jenkins to work with SonarQube"... that is what I have done in my shop, and there isn't much to it.
Once the Jenkins SonarQube plugin is installed, and the address for the SonarQube server entered, you can configure your job and call sonar (for instance with maven: $SONAR_MAVEN_GOAL -Dsonar.host.url=$SONAR_HOST_URL)
The analysis done in the Jenkins workspace will then be published in the SonarQube server.
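A minimal scripted sketch of that job step, assuming the SonarQube server was registered in Jenkins under the name 'sonarqube' (the server name is a placeholder):

node {
    checkout scm
    // withSonarQubeEnv injects SONAR_HOST_URL and the auth token configured in Jenkins
    withSonarQubeEnv('sonarqube') {
        sh 'mvn -B clean verify sonar:sonar -Dsonar.host.url=$SONAR_HOST_URL'
    }
}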
A swarm server is the more modern version of this 2015 docker-compose.yml file from the marcelbirkner/docker-ci-tool-stack project.
The idea remains the same though: each element is isolated in its own container.
I haven't tried it myself, but https://gitlab.talanlabs.com/gabriel-allaigre/sonar-gitlab-plugin could be interesting in your setup.
