Jenkins JSON API - Locate Build, Build Environment, and Build Trigger API data

I am working with the Jenkins JSON API.
I understand the format to retrieve API data in JSON:
<Jenkins_URL>/job/<job_name>/api/json
Within the job/<job_name>/configure UI we can configure/add build triggers, the build environment, and build data.
I want to be able to view the Build, Build Environment, and Build Triggers data through a JSON API.
Is it even possible to get that data? What are alternative ways to get all of the data that appears on the configure page of a job?

I think the most straightforward way is to access <Jenkins_URL>/job/<job_name>/config.xml.
Yes, it's not JSON, but you can be sure that this contains everything that was configured on the configuration page.
The XML file is the "native" serialized version of the job configuration. The JSON API will always require some additional glue that may or may not exist.
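For illustration, here is a minimal sketch of pulling both views with Python's requests library; the host, job name, and credentials are placeholders you would replace with your own:

import requests

JENKINS_URL = "http://localhost:8080"  # placeholder Jenkins host
JOB_NAME = "my-job"                    # placeholder job name
AUTH = ("user", "api-token")           # Jenkins username and API token

# JSON summary of the job (only the fields the JSON API chooses to expose)
summary = requests.get(f"{JENKINS_URL}/job/{JOB_NAME}/api/json", auth=AUTH)
print(summary.json()["name"])

# Raw config.xml: build steps, build environment, and triggers included
config = requests.get(f"{JENKINS_URL}/job/{JOB_NAME}/config.xml", auth=AUTH)
print(config.text)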

Related

How to keep Postman collections and tests in sync with swagger/open api specs and git in a CI flow

We are investigating whether we can incorporate Postman Test Runner and Newman into an API testing flow with our Jenkins CI server.
My question is this: once I import a swagger/Open API file into a Postman collection, how can I keep changes from multiple team members in sync?
For example, if a team member adds a new API endpoint to the swagger file, do we have to re-import the swagger file into a Postman collection, overwriting it? We'd like to keep using the swagger file as the single source of truth, so we want to keep it in sync with the Postman collection. Ideally we would update the swagger file, commit it to Git, and have its changes synced to Postman.
What about tests created in Postman? Is there a way to keep those checked into Git? Would we have to export the collection after each test change and check that into Git, and re-import changes to Postman collections after each git pull?
It looks like some of the online Postman features are built for sharing - the idea being that you make a change to the collection directly in the Postman client and it gets shared out to other Postman clients. If so, is there a hook that can be added to sync those changes to Git automatically?
To answer your questions in order:
Once I import a swagger/open api file into a postman collection, how can I keep changes from multiple team members in sync?
To keep your collections in sync everyone will need to sign in and use a team workspace. As of Postman 6.2 a single team workspace is now free.
https://www.getpostman.com/docs/v6/postman/workspaces/intro_to_workspaces
For example, if a team member adds a new api endpoint to the swagger file, do we have to re-import the swagger file into a postman collection overwriting it?
Depends on how you are generating the file. If it is being generated using a run-time tool (e.g., NSwag, Swashbuckle) then you'll most likely end up needing to overwrite the file. If you have a swagger.json your team is directly maintaining, you can probably modify the scripts Postman provides to keep your definitions in sync with Postman: http://blog.getpostman.com/2018/03/02/sync-your-specs/
What about tests created in postman? Is there a way to keep that checked into git?
Yes. You can export Postman collections which include your tests and check those into Git.
If so, is there a hook that can be added to sync those changes to git automatically?
Answered on SO here.
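Tying the last two answers together, here is a hedged sketch of pulling a collection (tests included) out of the Postman API so a CI job can commit it to Git; the API key and collection UID are hypothetical, and the https://api.getpostman.com/collections/{uid} endpoint with an X-Api-Key header is assumed from Postman's published API:

import json
import pathlib
import requests

POSTMAN_API_KEY = "your-api-key"   # hypothetical key generated in the Postman client
COLLECTION_UID = "1234-abcd"       # hypothetical collection UID

# Fetch the collection, tests included, from the Postman API.
resp = requests.get(
    f"https://api.getpostman.com/collections/{COLLECTION_UID}",
    headers={"X-Api-Key": POSTMAN_API_KEY},
)
resp.raise_for_status()

# Write it into the repository so it can be committed next to the swagger file
# and re-imported (or pushed back through the same API) after a git pull.
out = pathlib.Path("postman/collection.json")
out.parent.mkdir(parents=True, exist_ok=True)
out.write_text(json.dumps(resp.json(), indent=2))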

Jenkins job to read data from SQL DB

I'm new to Jenkins. I have a task where I need to create a Jenkins job to automate builds of certain projects. The build job parameters are going to be stored in a SQL database, so the job has to query the DB, load the data from it, and perform the build operation.
How can this be done? Examples would be greatly appreciated.
You have to transform the data from the available source into the format expected by the destination.
Here your source data lives in a database, and you want to use that data in Jenkins.
There might be numerous ways, but an efficient way of reading the data is the EnvInject Plugin.
If you can provide the data to the EnvInject plugin in properties-file format, it all becomes available as environment variables, and you can use those variables in the job's configuration.
The EnvInject Plugin can read this properties file from the Jenkins job workspace; you provide its location in the "Properties File Path" input.
To read the data from the source and make it available as a properties file, you can either write an executable that queries the database or, if your application provides an API, download the properties data from it.
Either way, it has to run before the SCM step; for this you have to use the Pre-SCM-Step.
Get the data and inject it in the pre-SCM step only, so that the data is available as environment variables.
This is one thought to give you the gist to start with; while implementing it you will likely find ways to adapt it to your own requirements.
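As a rough sketch of that idea (the table, column, and file names are hypothetical, and sqlite3 stands in for whatever database driver you actually use), a small script run in the pre-SCM step could dump the job parameters into a properties file for EnvInject:

import sqlite3  # stand-in for your real DB driver (pyodbc, psycopg2, etc.)

# Hypothetical table build_params(job, name, value) holding the job parameters.
conn = sqlite3.connect("builds.db")
rows = conn.execute(
    "SELECT name, value FROM build_params WHERE job = ?", ("my-job",)
)

# Write the rows in Java properties format; point EnvInject's
# "Properties File Path" at this file in the workspace.
with open("build.properties", "w") as props:
    for name, value in rows:
        props.write(f"{name}={value}\n")
conn.close()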

Jenkins API - buildWithParameters and fileParameter

So we're trying to use the Jenkins API to invoke a build with a file parameter. Basically we're trying to give it the files to build, in ZIP format. We have a ZIP plugin installed to unzip the build file, but we can't get that far.
Basically we're trying to use the buildWithParameters endpoint, but based on scouring info available online combined with a test I have set up in Postman, it would seem that buildWithParameters only works with parameters that exist on the query string.
If we use the build endpoint, we're able to submit the file successfully, both in a node app we built (using the request library) and in Postman. But if we roll back to buildWithParameters, using the same exact configuration, the file is not processed successfully by Jenkins (we get a file not found error).
Am I right about buildWithParameters only working with query string parameters?
To preempt why not just use the build endpoint: we need the build number back, which we get back from buildWithParameters but don't seem to get back with build.
To reiterate my question: is it possible to use the buildWithParameters endpoint and upload a file parameter? It would seem like it is, but we have not been able to get it to work.
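Not an answer to whether buildWithParameters can take a file, but for reference, a hedged sketch of the approach the question says does work - posting the zip to the plain build endpoint as a multipart file parameter - plus one common way to recover the build number by polling the queue item that the Location response header points at (host, job name, parameter name, and credentials are placeholders):

import json
import time
import requests

JENKINS_URL = "http://localhost:8080"  # placeholder
JOB_NAME = "my-job"                    # placeholder
AUTH = ("user", "api-token")           # Jenkins username and API token

# The multipart body carries the file under a field name ("file0" here) and a
# "json" field that maps the file parameter's name to that field.
with open("sources.zip", "rb") as zip_file:
    resp = requests.post(
        f"{JENKINS_URL}/job/{JOB_NAME}/build",
        auth=AUTH,
        files={"file0": zip_file},
        data={"json": json.dumps(
            {"parameter": [{"name": "sources.zip", "file": "file0"}]}
        )},
    )
resp.raise_for_status()

# The Location header points at the queue item; once the build leaves the
# queue, its JSON gains an "executable" entry carrying the build number.
queue_url = resp.headers["Location"].rstrip("/") + "/api/json"
build_number = None
while build_number is None:
    executable = requests.get(queue_url, auth=AUTH).json().get("executable") or {}
    build_number = executable.get("number")
    if build_number is None:
        time.sleep(1)
print("Started build", build_number)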

How to send additional data elements in Jenkins Notification Plugin?

I use Jenkins Pipeline jobs and invoke builds using their remote APIs.
I also use the Build Notification plugin to invoke my API once the build is complete, for further downstream automation. As per the plugin documentation, it provides a fixed set of data elements from the build. However, as part of the build, the job generates some data elements which I need to pass back to my API that the Notification plugin invokes (as part of the JSON payload). Can someone help me with how to pass additional data elements through this plugin, or suggest any better ways of doing it?
For example:
1. When the pipeline job is configured with a notification endpoint, the Jenkins config XML has the following entry:
<com.tikal.hudson.plugins.notification.HudsonNotificationProperty plugin="notification#1.11">
  <endpoints>
    <com.tikal.hudson.plugins.notification.Endpoint>
      <protocol>HTTP</protocol>
      <format>JSON</format>
      <url>http://localhost/api/postStatus</url>
      <event>finalized</event>
      <timeout>30000</timeout>
      <loglines>20</loglines>
    </com.tikal.hudson.plugins.notification.Endpoint>
  </endpoints>
</com.tikal.hudson.plugins.notification.HudsonNotificationProperty>
2. A pipeline script just builds an image, and the image ID has to be sent in the notification.
I did not find a perfect solution in the existing Jenkins Notification Plugin. However, the solution I used is to pass the data as part of the log text and parse that information on the other side.
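For example, a sketch of the receiving side, assuming the payload exposes the trailing console lines configured by <loglines> under a build "log" field (check what your plugin version actually sends): have the pipeline echo a marker line such as IMAGE_ID=sha256:... and parse it back out of the notification.

import re

def extract_image_id(payload: dict):
    """Pull an IMAGE_ID=... marker line out of the notification payload."""
    # Assumed field name: the last console lines under payload["build"]["log"].
    log_text = payload.get("build", {}).get("log", "")
    if isinstance(log_text, list):  # some versions may send the lines as a list
        log_text = "\n".join(log_text)
    match = re.search(r"^IMAGE_ID=(\S+)$", log_text, re.MULTILINE)
    return match.group(1) if match else None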

Swagger and Jenkins integration

So we have Swagger UI and a YAML file manually generated by a developer. The plan is to use Jenkins to validate our API endpoints (request and response schemas) against the Swagger schema. Is there a way to do that?
Please check Swagger Diff. This CLI tool shows breaking changes between two different Swagger JSON files:
http://swagger.io/using-swagger-to-detect-breaking-api-changes/
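Beyond diffing two spec versions, a common pattern is to have a Jenkins step call the deployed endpoints and check the response bodies against the schemas in the YAML. A hedged sketch with requests, PyYAML, and jsonschema follows; the spec path, endpoint URL, and UserList schema name are hypothetical, and OpenAPI schemas are only mostly JSON-Schema compatible:

import requests
import yaml
from jsonschema import validate

# Load the hand-maintained Swagger/OpenAPI YAML kept in the repo.
with open("api.yaml") as f:
    spec = yaml.safe_load(f)

# Hypothetical: the response schema for GET /users lives at components/schemas/UserList.
schema = spec["components"]["schemas"]["UserList"]

# Call the deployed endpoint; validate() raises (failing the Jenkins step)
# if the body does not match the schema.
resp = requests.get("https://staging.example.com/users")
resp.raise_for_status()
validate(instance=resp.json(), schema=schema)
print("Response matches the UserList schema")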
