Each finished build in Jenkins has a build.xml file (in work/jobs/...BuildName.../builds/...BuildNumber...) with a lot of information about the build. Is it possible to access that file using the REST API? I tried a lot of variations, but I could not find it.
Look in Jenkins itself for the documentation.
If you access the URL
http://SERVER/job/JOB/api/
you will see the documentation for the REST API, which can access all elements of your Jenkins instance (including parameters and logs from the builds).
I hope this helps.
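As a concrete sketch (server, job name, and build number below are placeholders), the information Jenkins keeps in build.xml is exposed as JSON at `/job/JOB/NUMBER/api/json`, and the optional `tree` query parameter restricts the response to the fields you need:

```python
import json
import urllib.request

def build_api_url(server, job, build_number, tree=None):
    """Return the REST API URL for one build's JSON description.

    The data Jenkins stores in build.xml is exposed here as JSON;
    the optional 'tree' parameter limits the response to named fields.
    """
    url = f"{server}/job/{job}/{build_number}/api/json"
    if tree:
        url += f"?tree={tree}"
    return url

def fetch_build_info(server, job, build_number):
    """Fetch and parse one build's description (network call)."""
    with urllib.request.urlopen(build_api_url(server, job, build_number)) as resp:
        return json.load(resp)

# URL for build #42 of job "my-job", asking only for result and duration:
print(build_api_url("http://SERVER", "my-job", 42, tree="result,duration"))
# http://SERVER/job/my-job/42/api/json?tree=result,duration
```

The same pattern works for `/job/JOB/api/json` (job-level data) and `/job/JOB/NUMBER/consoleText` (the build log).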
I'm automating the Jenkins installation along with plugin installation. As part of that, I want to get the suggested plugins from the Jenkins server itself, or install them via the API or CLI, instead of passing a list of plugins manually.
Example: I want to get the plugins from the page below into a file, or install them via the REST API or CLI.
Is it possible?
Note:
As of now, I have pulled some important plugin names from websites/GitHub accounts, and I just pass the plugin names manually by keeping them in a file.
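One possible approach (a sketch, not verified against every Jenkins version): the setup wizard's "suggested plugins" page is driven by a JSON file published by the Jenkins update site at https://updates.jenkins.io/current/platform-plugins.json, where the pre-ticked entries carry a `suggested` flag. Extracting them could look like:

```python
import json
import urllib.request

PLATFORM_PLUGINS_URL = "https://updates.jenkins.io/current/platform-plugins.json"

def suggested_plugins(categories):
    """Collect names of plugins flagged as suggested from the
    platform-plugins.json structure: a list of categories, each
    holding a list of plugin entries."""
    names = []
    for category in categories:
        for plugin in category.get("plugins", []):
            if plugin.get("suggested"):
                names.append(plugin["name"])
    return names

# Shape of the data, illustrated with a trimmed sample:
sample = [
    {"category": "Organization and Administration",
     "plugins": [{"name": "dashboard-view"},
                 {"name": "cloudbees-folder", "suggested": True}]},
]
print(suggested_plugins(sample))  # ['cloudbees-folder']

# For real use, fetch the live list instead of the sample:
# with urllib.request.urlopen(PLATFORM_PLUGINS_URL) as resp:
#     print("\n".join(suggested_plugins(json.load(resp))))
```

The resulting names can be written to the same file you pass today, so the rest of your automation stays unchanged.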
I have set up a CI project with Eclipse, Jenkins, Git, Maven, Sonar and Nexus. The idea is that files are generated from the Sonar analysis as part of the build, so that those files can be processed and sent to Elasticsearch. The problem is that I do not know how to dump the data that I see in SonarQube to local files and send those logs to Kibana.
The best way to access such data, if you need it for further processing, is to use SonarQube's web API. You can check the documentation on your SQ instance, or you can use this public instance as a reference: https://next.sonarqube.com/sonarqube/web_api
I am using Free Style Projects (in Jenkins) to schedule a regression test.
1. Get Source From BitBucket
2. Execute Windows Batch Command.
Earlier we were allowed to upload the jar files to Bitbucket, so we did not face any issues. Due to some changes in the process, we are no longer allowed to upload binaries, which prevents us from uploading jars to Bitbucket.
Now they have given us an Artifactory URL to set up for Maven, but we don't have any Maven projects.
It seems that Artifactory gets populated when it is hosted locally, but we want to use the shared Artifactory instance.
Can anyone explain the setup for a freestyle project with Artifactory hosted on another machine, when all we have is the URL?
Thanks
Here is the documentation:
https://www.jfrog.com/confluence/display/RTF/Jenkins+Artifactory+Plug-in
I recommend using a Maven project.
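If you only need to pull jars in a freestyle build step, a plain HTTP download from the shared Artifactory URL can be enough, without any Maven setup (a sketch; the repository name and artifact path are placeholders, and anonymous read access is assumed, otherwise add credentials):

```python
import urllib.request

def artifact_url(base_url, repo, path):
    """Artifactory serves files at BASE/REPOSITORY/PATH, so a plain
    HTTP GET works for anonymously readable repositories."""
    return f"{base_url.rstrip('/')}/{repo}/{path}"

def download_artifact(base_url, repo, path, dest):
    """Fetch one jar to a local file (network call)."""
    urllib.request.urlretrieve(artifact_url(base_url, repo, path), dest)

# Placeholder values only; adapt repository and path to your instance:
print(artifact_url("http://ARTIFACTORY_HOST/artifactory",
                   "libs-release-local",
                   "com/example/app/1.0/app-1.0.jar"))
```

The same URL works from a Windows batch command via curl or wget, which keeps the freestyle project unchanged apart from one extra build step.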
How can I determine what version of the declarative pipeline syntax is available to me when writing a Jenkinsfile if I do not have the permissions needed to list the plugins in the Jenkins server?
I am trying to write a Jenkinsfile using the declarative syntax, but cannot tell if the errors I encounter are because I am misinterpreting the documentation or if I am referencing documentation for a newer version than available on the box.
I have access to the Jenkins server to configure and run a build. However, I have no management or script permissions over Jenkins itself, so none of the options on How to get a list of installed jenkins plugins with name and version pair? worked for me.
Perhaps there is a way to make my Jenkinsfile echo the relevant version information as part of its execution? Or some easy syntax tests that (based on pass/failure) would prove which syntax version is available?
This might be helpful, depending on your version of Jenkins, but a note at the bottom says the feature was blocked more than a year ago, so there probably isn't a way to do it without admin access. You can use the REST API, the CLI, or the script console, but all of these require admin access.
https://support.cloudbees.com/hc/en-us/articles/218756317-How-can-non-admin-users-view-the-installed-plugins-
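One trick that needs no permissions at all (worth verifying on your instance): Jenkins reports its core version in the `X-Jenkins` HTTP response header, which is sent even to unauthenticated requests such as the login page. That does not reveal plugin versions, but knowing the core version narrows down which Pipeline syntax documentation applies:

```python
import urllib.request

def jenkins_core_version(server):
    """Read the core version from the X-Jenkins response header
    (network call; sent even to unauthenticated requests)."""
    with urllib.request.urlopen(f"{server}/login") as resp:
        return resp.headers.get("X-Jenkins")

def version_from_headers(headers):
    """Same extraction, separated out so it works on any mapping
    of captured response headers."""
    return headers.get("X-Jenkins")

# Example with a captured headers mapping:
print(version_from_headers({"X-Jenkins": "2.440.3"}))  # 2.440.3
```

The equivalent one-liner is `curl -sI http://SERVER/login | grep -i x-jenkins`.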
My development environment consists of a Git repository, the GitLab repository manager and Jenkins. During the build process, documentation is generated with Doxygen in HTML format.
Is it possible to store that documentation on the GitLab project's wiki? I know that HTML is not supported in Gollum, which is GitLab's wiki engine. Converting HTML to Markdown is not satisfactory because of internal links in the HTML files that point to other HTML files.
Should I store documentation in a separate wiki instead and only commit a link to GitLab project wiki?
I guess the answer depends on what you use your HTML documentation for and how much you distribute it.
First of all, which version of GitLab are you using?
If it is only for your developers to access (i.e. these are private documents), then on GitLab >= 8.1.2 there is a service called External Wiki that allows you to substitute the project's wiki link with any URL of your choice. Just set up a web server serving your HTML documentation, have your build server upload the newest version after each build, and call it a day.
If your documentation is something that you want to version control and distribute, then do so.
Whether you go for the first or the second option, it is still a good idea to keep the whole documentation in a separate Git repository, because you get compression for free, and using git pull is much better than using rsync to synchronise local or remote directories. Then it is just a matter of setting up a shell script or Git hooks to do all of it for you automatically, either at commit or build time.
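The build-time step of committing and pushing the generated HTML into a separate documentation repository can be sketched like this (paths and branch name are placeholders; assumes git is on PATH and the docs repository is already checked out into the output directory):

```python
import subprocess

def publish_docs(doc_dir, message="Update generated documentation", dry_run=False):
    """Commit and push freshly generated HTML docs from a checkout of
    the separate documentation repository; with dry_run=True the git
    commands are returned instead of executed (useful for testing)."""
    commands = [
        ["git", "-C", doc_dir, "add", "--all"],
        ["git", "-C", doc_dir, "commit", "-m", message],
        ["git", "-C", doc_dir, "push", "origin", "master"],
    ]
    if dry_run:
        return commands
    for cmd in commands:
        subprocess.run(cmd, check=True)
    return commands

# A dry run shows the commands a post-build hook would execute:
for cmd in publish_docs("/path/to/docs-repo", dry_run=True):
    print(" ".join(cmd))
```

Calling this from a Jenkins post-build step (after Doxygen runs) keeps the served documentation in sync with every build.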
Hope this helps!