Planned Support for Kubeflow Pipeline Uploads via API?

Is it possible to upload a Kubeflow Pipeline via an API call, ideally through the Kubeflow Pipelines Python SDK?
There is an API spec for making such calls: https://www.kubeflow.org/docs/pipelines/reference/api/kubeflow-pipeline-api-spec/
However, when I try uploading a pipeline using the route "/apis/v1beta1/pipelines/upload" in Postman, I get the following error:
There was a problem with your request. Error code 13
I am sure I need to add some Authorization headers, but before I go that hacky route, I wonder if there are plans to add this functionality to the Python SDK. Please let me know if you guys have an idea, or a better way to set up the request for uploading pipelines outside the Kubeflow Pipelines UI!
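For reference, the raw REST call can also be reproduced in Python. A minimal sketch, assuming the API server is reachable at localhost:8080 (host, file path, and pipeline name are placeholders; the multipart field name "uploadfile" is what the upload endpoint expects):
import requests
host = 'http://localhost:8080'  # placeholder: your Kubeflow Pipelines API endpoint
with open('path/to/pipeline.yaml', 'rb') as f:
    response = requests.post(
        f'{host}/apis/v1beta1/pipelines/upload',
        files={'uploadfile': f},         # multipart field read by the upload endpoint
        params={'name': 'my-pipeline'},  # display name for the uploaded pipeline
        # headers={'Authorization': 'Bearer <token>'},  # may be needed depending on your auth setup
    )
response.raise_for_status()
print(response.json())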

It's indeed possible by now to upload a pipeline with the kfp Python SDK:
import kfp
client = kfp.Client()
client.upload_pipeline('path/to/pipeline.yaml', pipeline_name='my-pipeline')
See the docs for a full API reference.
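If the client runs outside the cluster, it may not find the API server automatically; a minimal sketch, assuming the Kubeflow Pipelines API has been port-forwarded to localhost:8080 (the host is a placeholder):
import kfp
client = kfp.Client(host='http://localhost:8080')  # placeholder host
client.upload_pipeline('path/to/pipeline.yaml', pipeline_name='my-pipeline')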

Related

Bitbucket scriptrunner API endpoints to automate?

I'm looking for Bitbucket ScriptRunner API endpoints to automate enabling the merge checks available via this plugin in Bitbucket.
Is there any documentation or steps?
Otherwise, I use the APIs for other automations in Bitbucket here:
https://docs.atlassian.com/bitbucket-server/rest/7.2.3/bitbucket-git-rest.html
I tried this, but it doesn't seem to work:
requests.get('https://stash.com/rest/api/1.0/projects/project-x/repos/test/scriptrunner')
Thanks
Take a look at Script REST Endpoints https://scriptrunner.adaptavist.com/4.3.4/bitbucket/rest-endpoints.html
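Note that ScriptRunner's own REST endpoints live under a different base path than the core Bitbucket API, which would explain why the /rest/api/1.0/.../scriptrunner URL above returns nothing. A hedged sketch, assuming a custom ScriptRunner endpoint named myMergeChecks and basic-auth credentials (all placeholders):
import requests
# ScriptRunner custom REST endpoints are typically served under
# /rest/scriptrunner/latest/custom/<endpointName>, not /rest/api/1.0/...
response = requests.get(
    'https://stash.com/rest/scriptrunner/latest/custom/myMergeChecks',
    auth=('username', 'password'),  # placeholder credentials
)
response.raise_for_status()
print(response.json())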

Create a Pull request in bitbucket using Jenkins job

I am interested in creating a pull request on Bitbucket from a Jenkins Pipeline job written in Groovy. I am creating a Jenkins job which pulls the code, makes some changes, pushes the code back to Bitbucket, and then I want to raise a pull request.
Any help would be really appreciated.
Thanks
I'm looking for the same; unfortunately, I have only found a way to do this using the Bitbucket REST APIs.
It would be nice to have a plugin for this, but I could not find one.
This is the REST API:
https://docs.atlassian.com/bitbucket-server/rest/7.6.0/bitbucket-rest.html#idp291
/rest/api/1.0/projects/{projectKey}/repos/{repositorySlug}/pull-requests
There are more details here:
How to create a pull request in Bitbucket using API 1.0
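Putting that endpoint to work from Python, a minimal sketch of creating a pull request (host, project key, repository slug, branch names, and credentials are all placeholders):
import requests
# All names below are placeholders: adjust to your Bitbucket Server instance.
payload = {
    'title': 'Automated pull request',
    'description': 'Created from a Jenkins job',
    'fromRef': {
        'id': 'refs/heads/feature-branch',
        'repository': {'slug': 'test', 'project': {'key': 'PROJECT-X'}},
    },
    'toRef': {
        'id': 'refs/heads/master',
        'repository': {'slug': 'test', 'project': {'key': 'PROJECT-X'}},
    },
}
response = requests.post(
    'https://stash.com/rest/api/1.0/projects/PROJECT-X/repos/test/pull-requests',
    json=payload,
    auth=('username', 'password'),
)
response.raise_for_status()
print(response.json()['links']['self'][0]['href'])  # URL of the new pull request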

In Jenkins what's the differences between Remote access API and Pipeline REST API

In Jenkins, we want to get Pipeline stage information through the API, e.g. whether a stage succeeded or failed. From another answer it seems we can achieve this through the Pipeline REST API Plugin.
My question is:
Can Jenkinsapi and Python-Jenkins achieve the same thing? It seems they're designed for bare-metal Jenkins rather than the Pipeline plugin, right? If that's the case, is there a similar Python library for the Pipeline plugin?
Thanks!
You have all the info to answer your own question in the documentation you linked. Just to put it together:
Jenkins has a REST API
The Pipeline REST API plugin injects new endpoints into the Jenkins REST API to give you access to various pieces of pipeline information
Python-Jenkins and Jenkinsapi, which you mentioned above, are both wrappers around the Jenkins REST API that help you develop scripts/apps targeting the Jenkins API in Python. Since those modules/libs are most probably based on the core API specification, they most probably don't provide specific methods to target the pipeline endpoints (but you can probably extend them quite easily).
If you want to stay with Jenkinsapi, get_data(), defined in the JenkinsBase class, can be used to query the Pipeline REST API directly. I'm not sure whether that's recommended, though.
The following code works for me.
from jenkinsapi.jenkins import Jenkins
import requests

requests.packages.urllib3.disable_warnings()

# GET /job/:job-name/:run-id/wfapi/describe
url = 'https://localhost/job/anyjob/10/wfapi/describe'

jenkins = Jenkins(
    'https://localhost',
    username='username',
    password='api_token',
    ssl_verify=False,
    lazy=True)

print(jenkins.get_data(url))
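The same endpoint can also be queried without Jenkinsapi at all, since it is just part of the Jenkins REST API. A minimal sketch using plain requests (host, job name, run id, and credentials are placeholders):
import requests
requests.packages.urllib3.disable_warnings()
response = requests.get(
    'https://localhost/job/anyjob/10/wfapi/describe',
    auth=('username', 'api_token'),  # Jenkins accepts username + API token via basic auth
    verify=False,
)
response.raise_for_status()
print(response.json())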

Is there a way to track usage of a global shared library in Jenkins?

Context:
At my work most developers are free to write their own Jenkinsfile for their own team's projects.
As the Jenkins admin, I provide developers with a global shared library.
Most projects are using either v1 or v2 or v3 or another version of this library, using the idiom library("theSharedLib@v#").
Question: Is there a way for me to find out which Jenkinsfile is using which version of the shared library, without having to actually look into all those Jenkinsfiles (50+ files in as many git repos)?
What I would like best is some mechanism that writes up (into a file on the Jenkins master or into a DB) which project/Jenkinsfile is using which version at the time the library is loaded.
A possible solution would be to add some code to every function inside the library that actually does this reporting. I could then see which function is used by whom. Any better solution?
I wrote https://github.com/CiscoDevNet/es-logger to gather information such as this from Jenkins. It has a plugin that will run a regex against the console log of a completed job and can then post events to elastic search.
Jenkins helpfully posts library loads at the start of the log, such as:
Loading library sharedLib@version
So a simple regex like
"^Loading library\s+(?P<library_name>.*?)@(?P<library_version>.*?)\s*$"
added to the console_log_events plugin would generate an event in Elasticsearch for each usage and each version.
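As a quick sanity check, the pattern can be exercised directly in Python; a small sketch with a made-up log line:
import re
# Pattern from above; the log line below is a made-up example.
pattern = re.compile(r'^Loading library\s+(?P<library_name>.*?)@(?P<library_version>.*?)\s*$')
match = pattern.match('Loading library theSharedLib@v2')
if match:
    print(match.group('library_name'), match.group('library_version'))  # -> theSharedLib v2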

How do I integrate a swagger file with aws_apigateway

I want to use aws_apigateway with a Swagger file defining the API. How do I code this using the AWS CDK, in either Python or TypeScript?
There is a workaround here, but so far (22/6/2022) native support is still on the CDK roadmap (issue ref).
The workaround involves some manual steps and an initial Swagger extraction from the CDK init; the extracted spec can then be fed back in.
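One concrete way to wire a spec file in, sketched in Python under the assumption that CDK v2's SpecRestApi construct fits the use case (the stack name and the openapi.yaml path are placeholders):
from aws_cdk import Stack
from aws_cdk import aws_apigateway as apigw
from constructs import Construct

class SwaggerApiStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # SpecRestApi builds the REST API from an OpenAPI/Swagger definition
        # instead of CDK-modelled resources; 'openapi.yaml' is a placeholder path.
        apigw.SpecRestApi(
            self, 'SwaggerApi',
            api_definition=apigw.ApiDefinition.from_asset('openapi.yaml'),
        )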
