Trigger "perform maven release" of a jenkins job from another job - jenkins

I'm looking for ways to trigger a "Perform Maven Release" job from another Jenkins job. It could be a REST API or a plugin that can do it. I saw posts about the "Parameterized Trigger" plugin being able to do this, but I can't see how, so I need real examples of how to try it.
Thanks!

This task has been open in the Jenkins Jira since July 2015 with no movement yet.
Since this is the case, I suggest using an HTTP POST to accomplish this task. To do this, you will need to do the following:
1. Install the HTTP Request Plugin.
2. Create an httpUser (or use an existing one) with the appropriate Matrix permissions, then grab its API token: Jenkins -> People -> httpUser -> Configure -> API Token -> Show API Token...
3. Jenkins -> Manage Jenkins -> Configure System -> HTTP Request -> Basic/Digest Authentication -> Add -> create a Global HTTP Authentication Key with the information from step 2.
4. Create a "parent" job that will trigger other Jenkins job(s) via the M2-Release-Plugin and configure it as follows:
5. This build is parameterized:
   - releaseVersion (Text Parameter)
   - developmentVersion (Text Parameter)
   - (add other parameters as required; look in the doSubmit method for details)
6. Build -> Add build step -> HTTP Request:
   - URL (should have this format) = http://JenkinsServerName/job/JenkinsJobName/m2release/submit
   - HTTP mode = POST
   - Advanced...
   - Authorization -> Authenticate = select the Authenticate option created in step 3
   - Headers -> Custom headers -> Add
     - Header = Content-type
     - Value = application/x-www-form-urlencoded
   - Body -> Pass build params to URL? = Yes
   - Request body = (your parameters from step 5 and a json parameter object with any additional parameters required)
   - Response body in console? = Yes
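For reference, the POST that the HTTP Request build step sends can be reproduced outside Jenkins with a short Python sketch. The server name, job name, user, and token are placeholders; the exact field names accepted inside the json parameter are an assumption, so check the doSubmit method for the real set, and depending on your Jenkins security settings a CSRF crumb may also be required:

```python
import base64
import urllib.parse
import urllib.request

def m2release_request(jenkins_url, job, release_version, development_version,
                      user, api_token):
    """Build the POST for the m2release submit endpoint (names are placeholders)."""
    url = "%s/job/%s/m2release/submit" % (jenkins_url, job)
    # Field names inside 'json' are assumptions; check doSubmit for the exact set.
    body = urllib.parse.urlencode({
        "releaseVersion": release_version,
        "developmentVersion": development_version,
        "json": '{"releaseVersion": "%s", "developmentVersion": "%s"}'
                % (release_version, development_version),
    }).encode("utf-8")
    auth = base64.b64encode(("%s:%s" % (user, api_token)).encode()).decode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/x-www-form-urlencoded",
            "Authorization": "Basic " + auth,
        },
        method="POST",
    )

req = m2release_request("http://JenkinsServerName", "JenkinsJobName",
                        "1.2.3", "1.2.4-SNAPSHOT", "httpUser", "apiToken")
# urllib.request.urlopen(req) would actually trigger the release
```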
These are the steps I followed to have one Jenkins job trigger an m2release on another job in my environment. Hopefully this helps others and should I lose my notes or memory, I can refer to this post as well.

Related

Gitlab Webhooks (2 Webhooks on the same repo)

So here's the thing:
We have two webhooks set up on the same repository in GitLab.
Webhook number 1 is set to URL http://jenkins.local/project/job1 (builds the job from the master branch).
Webhook number 2 is set to URL http://jenkins.local/project/job2 (builds the job from branch "1").
The issue we're trying to overcome is that whenever a merge request is opened, both of those webhooks are triggered.
Is there a way to configure the webhooks to fire only when a merge request is made into the master / 1 branch?
I haven't found such a setting in Settings -> Integrations.
Webhook settings info
Currently, the option to restrict webhooks per branch is only available for Push events; for Merge Request events there is no way to restrict/filter.
You have to do the filtering (i.e., decide which job gets fired) in your Jenkins jobs instead. With the GitLab plugin, for example, it looks like this:
Job1:
triggers {
    gitlabPush {
        buildOnMergeRequestEvents(true)
        targetBranchRegex('master')
    }
}
Job2:
triggers {
    gitlabPush {
        buildOnMergeRequestEvents(true)
        targetBranchRegex('branch1')
    }
}

Bitbucket Jenkins plugin constructs wrong push URL

We use Bitbucket server and want to trigger a Jenkins build whenever something is pushed to Bitbucket.
I tried to set up everything according to this page:
https://wiki.jenkins.io/display/JENKINS/BitBucket+Plugin
So I created a Post Webhook in Bitbucket, pointing at the Jenkins Bitbucket plugin's endpoint.
Bitbucket successfully notifies the plugin when a push occurs. According to the Jenkins logs, the plugin then iterates over all jobs where "Build when a change is pushed to BitBucket" is checked, and tries to match that job's repo URL to the URL of the push that occurred.
So, if the repo URL is
https://jira.mycompany.com/stash/scm/PROJ/project.git, the plugin tries to match it against
https://jira.mycompany.com/stash/PROJ/project, which obviously fails.
As per official info from Atlassian, Bitbucket cannot be prevented from inserting the "/scm/" part in the path.
The corresponding code in the Bitbucket Jenkins plugin is in class com.cloudbees.jenkins.plugins.BitbucketPayloadProcessor:
private void processWebhookPayloadBitBucketServer(JSONObject payload) {
    JSONObject repo = payload.getJSONObject("repository");
    String user = payload.getJSONObject("actor").getString("username");
    String url = "";
    if (repo.getJSONObject("links").getJSONArray("self").size() != 0) {
        try {
            URL pushHref = new URL(repo.getJSONObject("links").getJSONArray("self").getJSONObject(0).getString("href"));
            url = pushHref.toString().replaceFirst(new String("projects.*"), new String(repo.getString("fullName").toLowerCase()));
            String scm = repo.has("scmId") ? repo.getString("scmId") : "git";
            probe.triggerMatchingJobs(user, url, scm, payload.toString());
        } catch (MalformedURLException e) {
            LOGGER.log(Level.WARNING, String.format("URL %s is malformed", url), e);
        }
    }
}
In the JSON payload that Bitbucket sends to the plugin, the actual checkout URL doesn't appear, only the link to the repository's Bitbucket page. The above method from the plugin appears to construct the checkout URL from that URL by removing everything after and including projects/ and adding the "full name" of the repo, resulting in the above wrong URL.
Is this a bug in the Jenkins plugin? If so, how can the plugin work for anyone?
I found the reason for the failure.
The issue is that the Bitbucket plugin for Jenkins does account for the /scm part in the path, but only if it's the first part after the host name.
If your Bitbucket server instance is configured not under its own domain but under a path of another service, matching the checkout URLs will fail.
Example:
https://bitbucket.foobar.com/scm/PROJ/myproject.git will work,
https://jira.foobar.com/stash/scm/PROJ/myproject.git will not work.
Someone who also had this problem has already created a fix for the plugin, the pull request for which is pending: JENKINS-49177: Now removing first occurrence of /scm
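The mismatch and the proposed fix can be illustrated with a small sketch. The URLs are the made-up examples from above, and the matching logic is simplified for illustration, not the plugin's actual comparison code:

```python
import re

def plugin_url_from_self_link(self_href, full_name):
    """What the plugin derives from the webhook payload's 'self' link:
    everything from 'projects' onward is replaced by the lowercased fullName."""
    return re.sub(r"projects.*", full_name.lower(), self_href, count=1)

def normalized_job_url(repo_url):
    """The fix from the pull request: drop the *first* '/scm/' wherever it
    appears in the path (not only directly after the host), plus '.git'."""
    url = re.sub(r"/scm/", "/", repo_url, count=1)
    return url[:-4] if url.endswith(".git") else url

derived = plugin_url_from_self_link(
    "https://jira.foobar.com/stash/projects/PROJ/repos/myproject/browse",
    "PROJ/myproject")
# derived == "https://jira.foobar.com/stash/proj/myproject"

job = normalized_job_url("https://jira.foobar.com/stash/scm/PROJ/myproject.git")
# job.lower() == derived, so a Bitbucket instance hosted under a sub-path
# ("/stash") can now be matched as well
```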

TFS Rest API - Passing parameters to Xaml Builds

We are looking to queue XAML builds programmatically, using the TFS REST API for that.
To queue a build, we are posting to the Url,
https://{tfsinstance}/DefaultCollection/{teamproject}/_apis/build/builds?api-version=2.0
as explained in the documentation https://www.visualstudio.com/en-us/docs/integrate/api/build/builds#queue-a-build
The body of the request looks like
{
"definition": { "id":7556 },
"parameters": "{\"ExistingBuildNumber\":\"ReachClient.2146.8\"}"
}
For XAML builds, the build does get triggered by our call, but the parameters we are passing are not passed on to the queued build. Parameters get passed to the new-style TFS builds fine.
Any ideas on how to resolve this? The TFS REST API documentation doesn't mention a different mechanism for passing parameters to XAML builds.
We are using an on-premise TFS server (TFS 2017 Update 2).
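For reference, the REST call described above can be reproduced with a short Python sketch. The collection URL, project name, and personal access token are placeholders; note that "parameters" is itself a JSON-encoded string inside the JSON body, exactly as in the request body shown above:

```python
import base64
import json
import urllib.request

def queue_build_request(collection_url, project, definition_id, parameters, pat):
    """Build the POST for the TFS 2.0 queue-build endpoint (URL/PAT are placeholders)."""
    url = "%s/%s/_apis/build/builds?api-version=2.0" % (collection_url, project)
    body = json.dumps({
        "definition": {"id": definition_id},
        # 'parameters' is a JSON string *inside* the JSON body,
        # hence the double encoding
        "parameters": json.dumps(parameters),
    }).encode("utf-8")
    auth = base64.b64encode((":%s" % pat).encode()).decode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + auth,
        },
        method="POST",
    )

req = queue_build_request("https://tfs.example.com/DefaultCollection", "MyProject",
                          7556, {"ExistingBuildNumber": "ReachClient.2146.8"},
                          "my-pat")
# urllib.request.urlopen(req) would queue the build
```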
The legacy XAML build system is different from the vNext build system: you cannot pass all arguments using the REST API. If you only need the build definition id, reason, and priority, you can use the old REST API call.
However, for a XAML build you can pass variables to the build directly in the build definition when queuing the build.
As a workaround, either use TfsBuild.exe to queue XAML builds, where you can pass an additional argument like this:
&$tfsBuild.Exe start "url" project definition /requestedFor:"$buildRequestedFor" /queue
Or use the TFS SDK's IBuildRequest.ProcessParameters property, an XML-formatted string representing all the process parameters for the build. A code snippet:
var buildClient = new BuildHttpClient(new Uri(collectionURL), new VssCredentials(true));
var res = await buildClient.QueueBuildAsync(new Build
{
    Definition = new DefinitionReference
    {
        Id = targetBuild.Id
    },
    Project = targetBuild.Project,
    SourceVersion = ChangeSetNumber,
    Parameters = buildArg
});
return res.Id.ToString();
For more details on using the TFS SDK, please refer to this blog.

Xcode Server Integration - Access Post Object from Before-Integration Trigger

I'm using Xcode Server 5 for continuous integration. I created a bot that I can trigger via a POST request to https://server.mycompany.com:20343/api/bots/_some_bot_id_/integrations; the body of this POST request looks like { myVariable: "hello", buildBranch: "feature1" }. Is there a way I can access the "myVariable" or "buildBranch" values from within any of the trigger scripts that I have added Before and After Integration?
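For context, the POST described can be issued like this (a sketch only; the server, bot id, and body fields come straight from the question, and whether Xcode Server exposes the extra body fields to trigger scripts is exactly what is being asked):

```python
import json
import urllib.request

def integration_request(server, bot_id, body):
    """Build the POST that kicks off an integration for the given bot."""
    url = "https://%s:20343/api/bots/%s/integrations" % (server, bot_id)
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = integration_request("server.mycompany.com", "_some_bot_id_",
                          {"myVariable": "hello", "buildBranch": "feature1"})
# urllib.request.urlopen(req) would start the integration
```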

Jenkins - Next Execution plugin - How to get details through REST API

In my Jenkins instance I have installed a new plugin to see next-execution details.
Plugin: https://wiki.jenkins-ci.org/display/JENKINS/Next+Executions
I can see it on the Jenkins dashboard successfully, but how can I access its details through the REST API, the way we do for all other stuff in Jenkins?
I am using Java to access Jenkins via the REST API.
Thanks!
UPDATE (2016-09-20): the REST API is supported from release 1.0.12:
<jenkinsurl>/view/<viewname>/widgets/<n>/api/json?pretty=true
see detail for the ticket JENKINS-36210
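A minimal sketch of calling that endpoint from Python (the Jenkins URL, view name, widget index, user, and API token are all placeholders):

```python
import base64
import urllib.request

def widget_api_request(jenkins_url, view_name, widget_index, user, api_token):
    """Build the GET for a view widget's JSON API (all arguments are placeholders)."""
    url = "%s/view/%s/widgets/%d/api/json?pretty=true" % (
        jenkins_url, view_name, widget_index)
    auth = base64.b64encode(("%s:%s" % (user, api_token)).encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": "Basic " + auth})

req = widget_api_request("http://jenkins.local", "All", 0, "user", "apiToken")
# json.load(urllib.request.urlopen(req)) would return the widget's data
```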
Below is left for reference
Though the REST API didn't exist at the time, I share this HTML-parsing Python code sample for reference.
It uses lxml to parse the dashboard page and build a list of results; the key code segment:
import urllib2
import lxml.html

def next_executions(url, CTX):
    status = []
    html = urllib2.urlopen(url, context=CTX).read()
    # BeautifulSoup4 would be nicer than lxml, but it is not installed by default
    html2 = lxml.html.fromstring(html)
    div = html2.get_element_by_id("next-exec")  # key id !!
    result = lxml.html.tostring(div)
    tree = lxml.html.fromstring(result)  # ugly, but it works
    trs = tree.xpath('/html/body/div/div/table/tr')
    for tr in trs:
        tds = tr.xpath("td")
        url = tds[0].xpath("a/@href")[0]
        jobname = tds[0].text_content()
        datetime = tds[1].text_content()
        status.append((datetime, jobname, url))
    return status
See the full script at https://gist.github.com/larrycai/6828c959f57105ca93239ca6aa4fc6fa
