In Jenkins, what's the difference between the Remote access API and the Pipeline REST API?

In Jenkins, we want to get Pipeline stage information through an API, e.g. whether a stage succeeded or failed. From another answer it seems we can achieve this through the Pipeline REST API Plugin.
My question is:
Can Jenkinsapi and Python-Jenkins achieve the same thing? It seems they're designed for core Jenkins rather than the Pipeline plugin, right? If that's the case, is there a similar Python library for the Pipeline plugin?
Thanks!

You have all the info to answer your own question in the documentation you linked. Just to put it together:
Jenkins has a REST API
The Pipeline REST API plugin injects new endpoints into the Jenkins REST API to give you access to pipeline-specific info
Python-Jenkins and jenkinsapi, which you mention above, are both wrappers around the Jenkins REST API that help you write scripts/apps targeting the Jenkins API in Python. Since those modules/libs are most probably based on the core API specification, they most probably don't provide specific methods to target the pipeline endpoints (but you can probably extend that quite easily). A sketch of querying a pipeline endpoint directly follows below.
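For example, here is a minimal sketch (hypothetical host, job name, build number, and credentials) that queries one of the plugin's endpoints with the plain requests library:

import requests

# Hypothetical values: adjust the host, job name, build number, and credentials.
JENKINS_URL = 'https://localhost'
JOB, BUILD = 'anyjob', 10

# The Pipeline REST API plugin adds /wfapi/describe to every pipeline run.
resp = requests.get(
    f'{JENKINS_URL}/job/{JOB}/{BUILD}/wfapi/describe',
    auth=('username', 'api_token'),  # user name + Jenkins API token
    verify=False)                    # only for self-signed test instances
resp.raise_for_status()

# The response describes each stage of the run, including name and status.
for stage in resp.json().get('stages', []):
    print(stage['name'], stage['status'])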

If you want to stay with Jenkinsapi, get_data(), defined in the JenkinsBase class, can be used to query the Pipeline REST API directly. I'm not sure whether that's recommended or not.
The following code works for me.
from jenkinsapi.jenkins import Jenkins
import requests

# Suppress warnings about the unverified TLS connection (ssl_verify=False).
requests.packages.urllib3.disable_warnings()

# GET /job/:job-name/:run-id/wfapi/describe
url = 'https://localhost/job/anyjob/10/wfapi/describe'

jenkins = Jenkins(
    'https://localhost',
    username='username',
    password='api_token',  # a Jenkins API token, not the account password
    ssl_verify=False,
    lazy=True)

print(jenkins.get_data(url))

Related

Using the REST API to trigger a specific stage within a YAML pipeline

Is there a way to execute a specific stage within a running YAML pipeline which uses an environment with approvals?
I have an on-prem deploy stage and an on-prem destroy stage; both have manual approvals.
What I would like to do is run the on-prem destroy stage in past builds using the REST API.
What I have achieved so far is getting the 10 most recent builds, in descending order, for a specific source branch, let's call it feature/on-prem-enterprise. I then do some parsing to find past builds that had a successful deployment but a failed, cancelled, or skipped destroy stage. Using these results from the timeline endpoint, I want to use the REST API to run/re-run the destroy stage in those builds.
We get into a situation where we have several deployments but nobody manually runs the destroy stage, and because this pipeline is shared amongst all developers for dev builds, it's very difficult to find those older builds manually.
If it cannot be achieved, another solution may be to compile this list of builds and send an email out, but I would prefer less manual intervention here.
Is there a way to execute a specific stage within a running YAML pipeline which uses an environment with approvals?
The answer is yes.
You could use the REST API Runs - Run Pipeline with the request body below, skipping the other stages so that only the stage you want runs:
POST https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=6.0-preview.1
Request Body:
{
    "stagesToSkip": ["Dev", "Test"]
}
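As a rough illustration, the same call could be made from Python with requests (hypothetical organization, project, pipeline id, and personal access token):

import requests

# Hypothetical values: fill in your organization, project, pipeline id, and PAT.
ORG, PROJECT, PIPELINE_ID = 'myorg', 'myproject', 42
PAT = 'personal-access-token'

url = (f'https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/'
       f'{PIPELINE_ID}/runs?api-version=6.0-preview.1')

# Azure DevOps accepts a PAT as the password of a basic-auth pair.
resp = requests.post(url, json={'stagesToSkip': ['Dev', 'Test']}, auth=('', PAT))
resp.raise_for_status()
print(resp.json()['state'])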
You can use the Stages - Update REST API method. This is part of the Build resource methods but works fine for YAML Pipelines as well.
PATCH https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/stages/{stageRefName}?api-version=7.1-preview.1
It sounds like you're already getting the buildId programmatically. The stageRefName is the name of the stage as defined in your YAML. Your URI will look something like:
https://dev.azure.com/myorg/myproject/_apis/build/builds/1234/stages/DestroyStageName?api-version=7.1-preview.1
In the request body you'll need:
{
    "forceRetryAllJobs": false,
    "state": 1
}
Here state 1 means "retry". forceRetryAllJobs may be unnecessary. There's an example implementation in PowerShell here.
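A minimal Python sketch of that PATCH call (hypothetical organization, project, build id, stage name, and personal access token):

import requests

# Hypothetical values: organization, project, build id, stage name, and PAT.
ORG, PROJECT, BUILD_ID, STAGE = 'myorg', 'myproject', 1234, 'DestroyStageName'
PAT = 'personal-access-token'

url = (f'https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds/'
       f'{BUILD_ID}/stages/{STAGE}?api-version=7.1-preview.1')

# state 1 asks Azure DevOps to retry the stage.
resp = requests.patch(url, json={'forceRetryAllJobs': False, 'state': 1},
                      auth=('', PAT))
resp.raise_for_status()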
If you're struggling to identify the appropriate API method to replicate something you do in the Azure DevOps GUI, opening your browser's developer tools and inspecting the requests in the network tab can often help you identify the call that's being used.

Jenkins Pipeline parameter: list of available git repositories

I'd like to create a pipeline where one of the parameters is a git repository, chosen from a list of available repositories. The list would be similar to the one you see when creating a multibranch pipeline with a Bitbucket repository.
Is there a plugin that can do this? Is it hard to make one for myself?
I should think the relatively new REST List Parameter plugin would work in these circumstances, as long as the repository host supports the plugin's authentication mechanisms. It's kind of an HTTP Request extension for the Build with Parameters plugin.
The plugin author (not me) is very responsive to positive feedback and useful enhancement requests.

How can a Jenkins input step be completed via APIs?

I have a Jenkins pipeline defined that includes an input step. A human can provide the input by clicking in the Jenkins UI and there is an HTTP endpoint to provide input as well.
Is it possible to provide the input via Groovy API calls? For instance, could a parallel step in the same pipeline provide the input values? Or, could a completely different build provide input values via Groovy code?
The reason I'd like to use Groovy is to keep the input providing entirely in the Jenkins system and avoid having to provide authentication credentials for the HTTP endpoint.
We had a similar problem (one pipeline should be able to trigger input steps in other pipelines).
This worked in the Jenkins script console and should work in a pipeline:
import org.jenkinsci.plugins.workflow.support.steps.input.*

// Look up the last build of the pipeline that is waiting on an input step.
def build = Jenkins.instance.getItemByFullName("TestInputPipeline").getLastBuild()
def action = build.getAction(InputAction.class)
// Approve the first pending input; parameter values could be passed in the map.
action.getExecutions().get(0).proceed([])
TestInputPipeline is the name of a test pipeline with a single input.
If your input has parameters, you will probably be able to provide them with the map in the proceed call.
This Input Step Plugin test code helped us: https://github.com/jenkinsci/pipeline-input-step-plugin/blob/master/src/test/java/org/jenkinsci/plugins/workflow/support/steps/input/InputStepRestartTest.java
JavaDoc can be found here: https://javadoc.jenkins.io/plugin/pipeline-input-step/index.html

Create a Pull request in bitbucket using Jenkins job

I am interested in creating a pull request on Bitbucket from a Groovy-based Jenkins pipeline job. The job pulls the code, makes some changes, and pushes them back to Bitbucket; after that, I want to raise a pull request.
Any help would be really appreciated.
Thanks
I'm looking for the same thing; unfortunately, I have only found a way to do this using the Bitbucket REST APIs.
It would be nice to have a plugin for this, but I could not find one.
This is the REST API:
https://docs.atlassian.com/bitbucket-server/rest/7.6.0/bitbucket-rest.html#idp291
/rest/api/1.0/projects/{projectKey}/repos/{repositorySlug}/pull-requests
There are more details here
How to create a pull request in a Bitbucket using api 1.0
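Until such a plugin exists, a rough Python sketch of that call (hypothetical project key, repository slug, branch names, and credentials; the payload shape follows the Bitbucket Server docs linked above):

import requests

# Hypothetical values: adjust to your Bitbucket Server instance and repository.
BASE = 'https://bitbucket.example.com'
PROJECT_KEY, REPO_SLUG = 'PROJ', 'my-repo'

payload = {
    'title': 'Automated changes from Jenkins',
    'fromRef': {
        'id': 'refs/heads/feature-branch',
        'repository': {'slug': REPO_SLUG, 'project': {'key': PROJECT_KEY}},
    },
    'toRef': {
        'id': 'refs/heads/master',
        'repository': {'slug': REPO_SLUG, 'project': {'key': PROJECT_KEY}},
    },
}

resp = requests.post(
    f'{BASE}/rest/api/1.0/projects/{PROJECT_KEY}/repos/{REPO_SLUG}/pull-requests',
    json=payload,
    auth=('username', 'password'))
resp.raise_for_status()
print('Created pull request', resp.json()['id'])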

Planned Support for Kubeflow Pipeline Uploads via API?

Is it possible to upload a Kubeflow Pipeline using an API call using the Kubeflow Pipelines Python SDK?
There is the following spec for making API calls: https://www.kubeflow.org/docs/pipelines/reference/api/kubeflow-pipeline-api-spec/
However, when I try uploading a pipeline using the route "/apis/v1beta1/pipelines/upload" in Postman, I get the following error:
There was a problem with your request. Error code 13
I am sure I need to add some Authorization headers, but before I go that hacky route, I wonder if there are plans to add this functionality to the Python SDK. Please let me know if you guys have an idea, or a better way to set up the request for uploading pipelines outside the Kubeflow Pipelines UI!
By now, it is indeed possible to upload a pipeline with the kfp Python SDK:
import kfp

# Connect to the Kubeflow Pipelines API (the in-cluster endpoint by default).
client = kfp.Client()
# Upload the compiled pipeline definition under the given display name.
client.upload_pipeline('path/to/pipeline.yaml', pipeline_name=name)
See the docs for a full API reference.
