Update Jenkins Plugins via Artifactory

I want to update Jenkins plugins via Artifactory. My setup:
Create a remote repository named Jenkins-update.
Create a local repository named jenkins-update-center.
Fetch update-center.json from the Jenkins-update remote repository, change the URL from 'http://updates.jenkins-ci.org/' to my own URL 'https://artifacts.xxx.com/artifactory/Jenkins-update/' in update-center.json, then upload update-center.json to the local repository.
#!/bin/sh
# Fetch update-center.json through the Artifactory remote repository's cache
curl -L -o /tmp/update-center.json http://localhost:8081/artifactory/Jenkins-update-cache/update-center.json
# Point the plugin URLs at the Artifactory remote repository instead of the public site
sed -i 's#http://updates.jenkins-ci.org/#https://artifacts.xxx.com/artifactory/Jenkins-update/#g' /tmp/update-center.json
# Upload the modified file into the local repository
curl -L -u user:pass -T /tmp/update-center.json "http://localhost:8081/artifactory/jenkins-update-center/update-center.json"
Change the default update site in Jenkins from 'http://updates.jenkins-ci.org/' to 'https://artifacts.xxx.com/artifactory/jenkins-update-center/update-center.json'.
When I click the 'Check now' button, I get the error 'SHA-512 digest mismatch: expected=49a22dc23f739a76623d10128b6803f79e0489de3ded0f1d01f3dfba4557136c7f318baaf4749a7713ec4b3f56633f2ac3afc4703e87d423ede029d68f84c74d in update site 'default''.
What should I do to make Jenkins update plugins from Artifactory?
Thanks.

As soon as the content of update-center.json changes, you need to re-generate the "signature" section of this file.
For that you need to generate your own key pair (see more details in How to create a local mirror of public Jenkins update site?).
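For illustration, a minimal sketch of generating such a key pair with OpenSSL (file names and the certificate subject are placeholders; how the key is wired into your update-center signing tooling depends on which tool you use):
# Illustrative only: a private key plus a self-signed certificate for a custom update-center signer
openssl genrsa -out update-center.key 4096
openssl req -new -x509 -days 3650 -key update-center.key \
  -out update-center.crt -subj "/CN=my-update-center"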
You may also use the following approach:
There is probably a better way: have a sandbox Jenkins on a system that has access to the internet. You update that server through the UI, and you can then test the updated Jenkins thoroughly. When done, you just copy the war and hpi files over to your 'production' Jenkins. Now you even have a nice process and QA step in place.
Another way is to set up a transparent HTTPS proxy between your Jenkins and Artifactory server; in that case update-center.json does not change and signature verification should work fine.
With best regards,
Dmytro Gorbunov

As of 2023-01-10 there is a problem with making a mirror of the Jenkins plugins on Artifactory.
The Artifactory documentation describes only how to create a mirror: https://jfrog.com/knowledge-base/how-to-configure-artifactory-as-a-mirror-for-jenkins-plugins/
But this is not a complete solution, because it leads to a situation where every plugin has to be updated manually. With plugins that have a bunch of dependencies, that is a huge effort.
You also need to generate the file update-center.json.
There is an internal Jenkins tool to do this, https://github.com/jenkins-infra/update-center2, but its documentation is poor and contains vague statements like:
With a few modifications it could easily be used to generate your corporate update center as well.
without a clear description of what has to be done.
I tried to follow the steps and completely failed. The tool requires some special environment variables, which are also not documented, and so on.
So, in my experience, mirroring Jenkins plugins on Artifactory is practically not possible. And honestly, I would like to be wrong here.

Related

How to use a GitLab link for applying jenkins.yml file for the concept of Jenkins Configuration as Code

I have a local instance of Jenkins. I previously tried storing the jenkins.yml on my system and giving its path on http://localhost:8080/configuration-as-code. This worked, but I want to use a GitLab repository to store the jenkins.yml file.
I have already tried giving the GitLab link to my jenkins.yml in the path or URL textbox. Some weird things happened, like:
1. Jenkins broke or showed a huge error console
2. It reapplied the previous configuration (from the system path)
jenkins:
  systemMessage: "Hello, world"
Your problem as described: you want the job configuration to be saved in Git and, when a build is triggered, the job should get the current state of its configuration from there and then run the build.
Maybe there is a kind of plug-in that does this for you, but I am not aware of any. Maybe anyone?
My suggestion is to define a pipeline job and use a declarative pipeline. It is a file, normally named Jenkinsfile, that can be stored in Git. In the job you define the Git address, and when you trigger a build the file is fetched from Git and executed.
There are several flaws in this: the pipeline learning curve is not small, you are confronted with Groovy (not XML!), and your current XML file is barely useful.
Maybe someone shows up and tells us about a (new to me) plugin that solves your problem using the configuration XML file. On the other hand, pipelines are such a beautiful feature that I encourage you to give them a try.

configuring system properties for Jenkins service

Background
I have the following Jenkins config.
Ubuntu machine
Jenkins installed using apt-get, and is started as a service (service jenkins start).
To this point I have not made any modifications to Jenkins config.
We have several Ant projects for which I want to publish Javadocs using Jenkins.
After configuring the Javadoc plugin, I quickly hit this issue where only the Javadoc frames are displayed, without any content.
Some reading (here and here) told me that I need to configure Jenkins' Content Security Policy, and that this is done by modifying system properties passed to Jenkins.
However, despite digging around I have not found clear docs on how to pass these system properties to the Jenkins service. How do I do that?
Answering my own question.
To set system properties for the Jenkins service:
Steps
Stop Jenkins (service jenkins stop). You will need root privileges.
Edit the /etc/default/jenkins file.
Add an additional line for the JAVA_ARGS that you want to pass.
JAVA_ARGS="-Dhudson.model.DirectoryBrowserSupport.CSP=\"your CSP configuration here\""
Start Jenkins (service jenkins start).
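As a rough end-to-end sketch of these steps (the CSP value is only an example; adjust it to your own needs):
sudo service jenkins stop
# Append the property to /etc/default/jenkins (the quoted heredoc keeps the line literal)
sudo tee -a /etc/default/jenkins <<'EOF'
JAVA_ARGS="-Dhudson.model.DirectoryBrowserSupport.CSP=\"default-src 'self';\""
EOF
sudo service jenkins start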
Explanation
Look at /etc/init.d/jenkins for a line similar to:
NAME=jenkins
SCRIPTNAME=/etc/init.d/$NAME
[ -r /etc/default/$NAME ] && . /etc/default/$NAME
These tell us that the Jenkins daemon will look for a file named /etc/default/jenkins. If present, it sources that file (via the shell '.' command).
If you set $JAVA_ARGS in /etc/default/jenkins it will be substituted in the line below, located later in the /etc/init.d/jenkins file:
$SU -l $JENKINS_USER --shell=/bin/bash -c "$DAEMON $DAEMON_ARGS -- $JAVA $JAVA_ARGS -jar $JENKINS_WAR $JENKINS_ARGS" || return 2
Notes
Even after you do the above, the Javadoc may not load properly. Try doing a hard refresh (Ctrl-Shift-R on Chrome).
As detailed in the Jenkins docs (https://wiki.jenkins-ci.org/display/JENKINS/Configuring+Content+Security+Policy), there is a temporary way to do this as well. Read that page and try to understand the implications well.
Changing the Content Security Policy has serious implications especially if your Jenkins is public. It's worth the effort to understand just what policies you are modifying.

in jenkins, how to copy artifacts from another server?

I have another project from which I need to copy artifacts.
However, the problem I have is that it's on another server. Is there a way to do this with Copy Artifact, or will I have to go through code?
You can accomplish this by publishing your artifact and then using either file transfer or secure shell.
Here is info to read upon:
Jenkins Secure Shell Plugin
Jenkins FTP Plugin
The only other possibility is to modify the ant or maven project config file.
Here is a further reference along the same lines.
In the end I used wget to fetch the file, with fixed paths.
This link can help anyone not yet used to wget:
Using wget to recursively fetch a directory with arbitrary files in it
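For reference, a hedged sketch of such a recursive fetch (server, job and directory names are placeholders; add --user/--password if your Jenkins requires authentication):
# -r recursive, -np do not ascend to parent directories, -nH drop the host name,
# --cut-dirs=3 strip job/<jobname>/ws from the saved paths, -R skip the generated index pages
wget -r -np -nH --cut-dirs=3 -R "index.html*" \
  "http://<servername:port>/job/<jobname>/ws/<some/dir>/"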
For a long time I have used this Python script to download artifacts from Jenkins. It takes advantage of the JSON API layer available for any Jenkins job. The format of that API call is:
http://_YOUR_BUILD_HOST_/job/_JOBNAME_/lastSuccessfulBuild/api/json
Beware: the script depends on PyCurl.
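If you prefer plain shell over Python, a similar flow can be sketched with curl and jq (host, job name and credentials below are placeholders):
# List the artifacts of the last successful build via the JSON API
curl -s -u user:API_TOKEN "http://YOUR_BUILD_HOST/job/JOBNAME/lastSuccessfulBuild/api/json" \
  | jq -r '.artifacts[].relativePath'
# Download one of them through the artifact URL
curl -s -u user:API_TOKEN -O \
  "http://YOUR_BUILD_HOST/job/JOBNAME/lastSuccessfulBuild/artifact/path/to/file.jar"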
The Publish Over SSH plugin can also be used to copy files/artifacts from one server (local/Linux) to another. It also has a retries option in case of network issues; the number of retries and the timeout can be configured as well.

How to download a file from the jenkins job build folder

I have a Jenkins server running, and for a job I need to download a file which is in the jobs/builds/buildname folder.
How do I download that file from the Jenkins job?
If you use the workspace as suggested in the previous post, you can access it within a Pipeline:
sh "wget http://<servername:port>/job/<jobname>/ws/index.txt"
Or inside a script:
wget http://<servername:port>/job/<jobname>/ws/index.txt
Where index.txt is the file you want to download.
I rock a Unix-based development machine and a Unix-based Jenkins machine up in the cloud. This means I can use the scp command to download the remote file over an SSH connection. This is the anatomy of my scp commands:
scp -i <path/to/ssh.pem/file> <user>@<jenkins.remote.url>:<path/to/remote/file> <local/path/where/download/goes>
This works for directories too; for instance, I use this to download backups generated by the ThinBackup plugin.
You have already been given the answer for getting the file from the workspace:
http://<servername:port>/job/<jobname>/ws/filename.ext
Obviously replace the parts in <..> with values relevant to your setup, and make sure the anonymous user has read access to the workspace, or else you may have to log in.
The only other files you can access are those that were archived from previous job runs:
http://<servername:port>/job/<jobname>/<buildnumber>/artifact/filename.ext
Where <buildnumber> is the build number you see in the job's build history, or one of the permalinks provided by Eldad (such as lastStableBuild). But this will only give access to archived artifacts.
You cannot arbitrarily access files from Jenkins' filesystem through the web interface; it wouldn't be very secure if it let you.
The Jenkins job's build folder is meant for logs and plugin reports. You should not need to access it directly.
If you must, you can access it relative to the workspace: $WORKSPACE/../builds/$BUILD_ID/
You can also replace the $BUILD_ID with one of the links Jenkins creates (see the example after this list):
lastFailedBuild
lastStableBuild
lastSuccessfulBuild
lastUnstableBuild
lastUnsuccessfulBuild
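For example, assuming your Jenkins version still creates these symlinks inside the builds directory and the build runs on the same node, something like this could work from a build step (the target file name is illustrative):
# Copy the console log of the last successful build of this job
cp "$WORKSPACE/../builds/lastSuccessfulBuild/log" ./last-successful.log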
I hope this helps.
As others have pointed out, this path should work; I'd just like to highlight that "ws" is a directory in Jenkins:
http://<servername:port>/job/<your job>/ws/<your file>
Download the package lynx (a command-line browser):
$ apt-get install lynx
or
$ yum install lynx
then use the command
# lynx http://<servername:port>/job/<jobname>/ws/file
The app will ask you to allow cookies, and if authentication is required it will direct you to a login page, just like a browser.

How to install plugins in jenkins, with the help of jenkins remote access API?

I would like to know how I can install a plugin to Jenkins using the Jenkins remote access API.
I found a way to install plugins using the Jenkins CLI, but I need to know how to do the same using the API.
I tried using the jenkins-python library, but I did not find any way to install a plugin there.
Send (HTTP POST) the following XML data (with your plugin-id@version) to the Jenkins plugin manager. Check out my Jenkins install-plugin script on gist.
This HTTP POST request installs the Jenkins Git plugin 2.0:
curl -X POST -d '<jenkins><install plugin="git@2.0" /></jenkins>' --header 'Content-Type: text/xml' http://localhost:8080/pluginManager/installNecessaryPlugins
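If your Jenkins requires authentication and CSRF protection, roughly the same call can be made with a crumb (user name and API token are placeholders):
# Fetch a CSRF crumb, then POST the install request with it
CRUMB=$(curl -s -u admin:API_TOKEN 'http://localhost:8080/crumbIssuer/api/xml?xpath=concat(//crumbRequestField,":",//crumb)')
curl -X POST -u admin:API_TOKEN -H "$CRUMB" \
  -d '<jenkins><install plugin="git@2.0" /></jenkins>' \
  --header 'Content-Type: text/xml' http://localhost:8080/pluginManager/installNecessaryPlugins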
Some plugins are hard to update on the file system because others depend on them (credentials is one example). For such plugins it is only possible to update them using the web interface.
The Jenkins frontend has a page under 'Manage Jenkins' -> 'Manage Plugins'. Under the 'Advanced' tab is a form to 'uploadPlugin'. It allows web automation with curl; you might need to add authentication.
curl -i -F file=@pluginfilename.hpi http://jenkinshost/jenkins/pluginManager/uploadPlugin
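For example, with basic authentication (user name and API token are placeholders):
curl -i -u admin:API_TOKEN -F file=@pluginfilename.hpi http://jenkinshost/jenkins/pluginManager/uploadPlugin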
In addition to the methods already mentioned (I personally used the "curl uploadPlugin" one provided by @bbaassssiiee), consider that if you use pluginManager, Jenkins will try to load your plugin dynamically; but in case you need to restart Jenkins to initialize the plugin properly (this was my case), you should add:
curl -kX POST https://${JENKINS_URL}/safeRestart
If you copy the plugin directly to jenkins/plugins, the restart is mandatory for the plugin to be loaded.
As suggested by malenkiy_scot, we can create a job and use the Jenkins CLI. Here is the way I do it in my automation for installing plugins. Jenkins plugins are available from the Jenkins mirror at http://updates.jenkins-ci.org/latest. That link might not list anything, but you can download a plugin if you know its name. For example, to download the skype-notifier plugin you can fetch http://updates.jenkins-ci.org/latest/skype-notifier.hpi. The generic URL is http://updates.jenkins-ci.org/latest/<plugin-name>.hpi.
After downloading, the plugin should go into the "plugins" directory in the Jenkins home on the server. On a Linux machine this will most likely be /var/lib/jenkins/plugins. Simple example:
wget http://updates.jenkins-ci.org/latest/skype-notifier.hpi
mv skype-notifier.hpi /var/lib/jenkins/plugins
There are two things to note here:
If the plugin has any dependencies, those will not be installed by default. If you know which other plugins are required, they can be installed the same way. A bit of manual work is required here, but if the same set of plugins is needed repeatedly, the dependencies can be resolved just once and a script can be written to download and move them to the Jenkins home.
Downloaded plugins cannot be used right away; a reload of Jenkins is required.
After a lot of blood, sweat and tears, my suggested solution is:
Download the hpi files (plugin and dependencies) using plugin-installation-manager-tool (requires Java) or install-plugins.sh (requires bash only, but is officially deprecated, though still working as of 09/2021).
Note: both are also contained in the official Docker image (see also Offline Installations).
Then install all downloaded files via:
curl -i -F file=@plugin.hpi http://${JENKINS_URL}/pluginManager/uploadPlugin
Why?
POSTing to /pluginManager/installNecessaryPlugins always installs the latest version (known bug or feature?) and seems to install only the requested plugin, without proper dependency handling.
Simple example
Requires install-plugins.sh and its dependency jenkins-support from jenkinsci/docker.
You have to adapt install-plugins.sh line 27 to point to your jenkins-support file, e.g. use
. jenkins-support
if you have everything in one folder and execute it from there.
pluginFolder=$(mktemp -d)
# Download plugins
JENKINS_UC=https://updates.jenkins.io REF="${pluginFolder}" \
  install-plugins.sh \
  docker-workflow:1.26 docker-plugin:1.2.2
  # add more plugins here, pass a bash array or load them from a file
  # (see Real-life example below)
# Install all downloaded plugin files via HTTP
for pluginFile in "${pluginFolder}/plugins"/*; do
  curl -i -F "file=@${pluginFile}" "http://${JENKINS_URL}/pluginManager/uploadPlugin"
done
Real-life example
Taken from cloudogu/gitops-playground.
download-plugins.sh - loads all plugins declared in plugins.txt using install-plugins.sh to a directory passed as parameter.
init-jenkins.sh calls download-plugins.sh, then installs the plugins using jenkins-REST-client.sh
I do not think this is possible. However, as a workaround you may consider creating a job that installs plugins via the Jenkins CLI; you can then invoke that job via the API with appropriate parameters.
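A rough sketch of that workaround, assuming you have created a parameterized job named 'install-plugin' that runs the CLI (the job name, parameter and credentials are illustrative):
# Trigger the job remotely, passing the plugin to install as a parameter
curl -X POST -u admin:API_TOKEN \
  "http://localhost:8080/job/install-plugin/buildWithParameters?PLUGIN_ID=git"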
