Can Jenkins detect when a new build is available on a Bamboo server?
What I want is to create a Jenkins job that checks a Bamboo server for a new build. I want this job to run once per hour.
Then, other tests that I have on that Jenkins server will rely on that check passing in order for them to kick off.
If this is possible, what is the usual way of doing this? The Bamboo server is internal and does not require authentication to view build status or fetch build resources.
If there is no plugin for this, I do see an RSS feed at this URI: /rss/createAllBuildsRssFeed.action?feedType=rssAll&buildKey=RELEASE . What method would other Jenkins administrators use to read this feed?

I figured out the answer myself. I wrote a Gradle unit test, run from Jenkins, that reads the RSS feed from Bamboo.
The real way to do it, though, which doesn't answer my original question, is to add a post-build hook to either Subversion or Bamboo that sends an HTTP GET request to Jenkins, notifying a job to run.
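For reference, here is a minimal sketch of the RSS-polling approach in Python (not the original Gradle test; the Bamboo hostname, the state-file name, and the assumption that the newest build is the first feed item are placeholders):

    # Minimal sketch: fetch the Bamboo RSS feed and fail the Jenkins job
    # if no new build has appeared since the last run.
    # Hostname and state-file path are placeholders.
    import urllib.request
    import xml.etree.ElementTree as ET
    from pathlib import Path

    BAMBOO_FEED = ("http://bamboo.example.internal"
                   "/rss/createAllBuildsRssFeed.action?feedType=rssAll&buildKey=RELEASE")
    STATE_FILE = Path("last_seen_build.txt")  # kept in the Jenkins workspace

    def latest_build_title():
        with urllib.request.urlopen(BAMBOO_FEED, timeout=30) as resp:
            root = ET.fromstring(resp.read())
        # RSS 2.0 layout: channel/item entries; assumes the newest item comes first
        item = root.find("./channel/item")
        return item.findtext("title") if item is not None else None

    def main():
        title = latest_build_title()
        last_seen = STATE_FILE.read_text().strip() if STATE_FILE.exists() else ""
        if not title or title == last_seen:
            raise SystemExit("No new Bamboo build found")  # non-zero exit fails the job
        STATE_FILE.write_text(title)
        print("New build detected:", title)

    if __name__ == "__main__":
        main()

Scheduled hourly with a cron trigger, a failing run then blocks the downstream test jobs, which is the behaviour asked about above.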

Related

Post commit deliver hook for RTC

I have a working Jenkins setup that can pull source code from RTC (Jazz server), build it, etc. I can run this Jenkins job on demand or schedule it to run at an interval. However, I am now exploring how I can run this job only on detecting a change in the repository (e.g. a new change set is delivered). I don't want to use the polling mechanism that Jenkins provides; I want RTC's post-commit process to call my Jenkins job remotely.
Can anyone please guide me on how to do this? Thank you.

Can Jenkins be used with a Python project?

I'm developing a web application using Python/Django. I want a CI service that can automatically pull the latest code from my GitHub repository, run some tests, and then deploy. I'm not familiar with CI; after searching for a while, Jenkins seems to be a good solution. Can Jenkins be used for this?
Jenkins can be used with any project.
Regarding pulling the latest code, add the Jenkins GitHub plugin in order to be able to check "Build when a change is pushed to GitHub" under "Build Triggers".
That will launch your job on any new pushed commit on the GitHub repo.
From there, a Jenkins job can execute any command that you would run on the command line, provided the agent on which the job will be scheduled and executed has the necessary tools in its PATH (here, Python).
An alternative (which does not involve Jenkins) is to set up a webhook and a listener on your server that detects the "push event" sent by said webhook.
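A minimal sketch of such a listener, using only the Python standard library (the port, the branch name, and the deploy script it launches are assumptions for illustration):

    # Sketch of a push-event listener that works without Jenkins.
    # Port, URL, branch and the script it runs are illustrative assumptions.
    import json
    import subprocess
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class PushHookHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            payload = json.loads(self.rfile.read(length) or b"{}")
            # GitHub push payloads carry the branch in the "ref" field
            if payload.get("ref") == "refs/heads/main":
                subprocess.Popen(["./run_tests_and_deploy.sh"])  # hypothetical script
            self.send_response(204)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), PushHookHandler).serve_forever()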

Accessing the BitBucket webhook's payload in Jenkins jobs

I'm using webhooks on Bitbucket to trigger builds on Jenkins when a push event occurs; for this purpose I'm using the Bitbucket plugin.
My Jenkins pipeline consists of multiple cross-depending tasks, e.g.:
Main pipeline (triggered task)
1) build docker images
2) run tests
3) do something
The build is triggered when expected, but the tasks are failing because they rely on a specific branch that I need to provide. Unfortunately, I don't know how to access the webhook's payload that has all the information I need.
The alternative would be using the Poll SCM option in Jenkins, but I prefer to build on demand and not periodically.
From https://wiki.jenkins-ci.org/display/JENKINS/BitBucket+Plugin, they say:
Since 1.1.5 Bitbucket automatically injects the payload received by Bitbucket into the build. You can catch the payload to process it accordingly through the environmental variable $BITBUCKET_PAYLOAD.
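For example, a build step could hand that variable to a small Python script to extract the pushed branch; the exact JSON layout depends on the Bitbucket version, so the field path below is an assumption:

    # Sketch: read the injected payload and extract the pushed branch.
    # The nested field path matches Bitbucket Cloud push events; treat it
    # as an assumption and adjust it for your Bitbucket version.
    import json
    import os

    payload = json.loads(os.environ.get("BITBUCKET_PAYLOAD", "{}"))
    changes = payload.get("push", {}).get("changes", [])
    branch = changes[0]["new"]["name"] if changes else None
    print("Pushed branch:", branch)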

Run a Jenkins job on another Jenkins instance from a Jenkins job

I want to create a Jenkins job that starts other Jenkins jobs. That would be quite easy, because the Jenkins Template Project Plugin allows us to create a build step of type "use builders from another project". However, what makes my situation harder is that I have to start Jenkins jobs on other machines. Is there any standard way to do that?
In case you only want to trigger a new build of the job, you have plenty of ways to accomplish it.
You can use the remote access API and send a request from the source job to build the target job:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Or you can use https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Remote+Trigger+Plugin
which is handy for handling server details and other configuration. You should ensure SSH keys are shared by both servers.
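As a rough illustration of the first option, a build step in the source job could trigger the remote job with a few lines of Python; the Jenkins URL, job name, and token below are placeholders:

    # Sketch: trigger a job on another Jenkins instance via the remote access API.
    # URL, job name and token are placeholders.
    import urllib.request

    JENKINS_URL = "https://other-jenkins.example.com"
    JOB_NAME = "downstream-job"
    TOKEN = "build-token-from-job-config"

    req = urllib.request.Request(
        f"{JENKINS_URL}/job/{JOB_NAME}/build?token={TOKEN}",
        method="POST",
    )
    # If the target Jenkins requires authentication, add a basic-auth header
    # with a user API token here (assumption: token-based auth is enabled).
    with urllib.request.urlopen(req, timeout=30) as resp:
        print("Triggered, HTTP status:", resp.status)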

How to get the URL of a pipeline job in Jenkins

We are setting up a continuous delivery pipeline in Jenkins, using the build pipeline plugin.
Our deployment step uses a proprietary deploy tool (triggered by an HTTP request from Jenkins), but we need an additional Jenkins step for acceptance tests on the then-deployed project. So our deploy tool will need to trigger the last pipeline step.
The Jenkins setup for this is obvious:
For a Manually Triggered downstream build step, to add a build step that will wait for a manual trigger:
Select the Build Pipeline Plugin, Manually Execute Downstream Project check-box.
Enter the name(s) of the downstream projects in the Downstream Project Names field. (N.B. multiple projects can be specified by using a comma, like "abc, def".)
Source: Build Pipeline Plugin
The problem is: I can't seem to find a way to trigger this downstream build through a URL.
In fact I'd need the URL in the deploy job, so I can send it to the deploy tool as a callback URL. Can anybody help?
If I understand correctly, you want to use the remote access API, which to my knowledge is no different between a general project and a pipeline one.
Take a look here:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Submitting jobs
Jobs without parameters
You merely need to perform an HTTP POST on JENKINS_URL/job/JOBNAME/build?token=TOKEN where TOKEN is set up in the job configuration.
As stated above by @rafal S:
Read a file that lists the project names for which a build has to be triggered, then do a curl HTTP POST to JENKINS_URL/job/${JOBNAME from the file}/build?token=TOKEN inside a for loop, where the loop iterates over all project names read from the file.
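A hedged Python equivalent of that loop (the file name, Jenkins URL, and token are placeholders):

    # Sketch: read project names from a file and POST a build trigger for each.
    # File name, Jenkins URL and token are placeholders.
    import urllib.request

    JENKINS_URL = "https://jenkins.example.com"
    TOKEN = "TOKEN"  # set up in each job's configuration

    with open("projects.txt") as fh:
        jobs = [line.strip() for line in fh if line.strip()]

    for job in jobs:
        req = urllib.request.Request(
            f"{JENKINS_URL}/job/{job}/build?token={TOKEN}",
            method="POST",
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            print(job, "->", resp.status)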
