I have a job that uses the Copy Artifact Plugin to upload a .ipa file to TestFlight. I would like to run this job only by hand, not trigger it automatically. The job is configured with a build-selector parameter so that I can start the upload from one of a handful of similar jobs.
Is there an easy way (possibly with a plugin or script) to get the URL to the specific job that provided the artifact being uploaded?
Essentially I want to take the $BUILD_URL value from the upstream job so that I can include it in the TestFlight build notes.
(I am not sure if it directly pertains to what I want, but "Get Jenkins upstream jobs" seems to suggest that Groovy scripting might be the way to go. I also found a post on the Jenkins forums, http://jenkins-ci.361315.n4.nabble.com/Getting-upstream-job-s-build-number-td3167291.html, that looked promising, but it does not seem to apply to my scenario of manually triggered builds.)
Unless you are triggering the downstream job with "Trigger parameterized build...", in which case you could pass along the predefined parameter UPSTREAM_BUILD_URL=$BUILD_URL, I think you would have to store the BUILD_URL with the artifacts.
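For example, a rough sketch of the store-it-with-the-artifacts approach (the file name build-info.txt is just an illustration):

# Upstream job, Execute Shell step: record this build's URL next to the .ipa
echo "UPSTREAM_BUILD_URL=$BUILD_URL" > build-info.txt
# ...then archive build-info.txt along with the .ipa

# Downstream job, after the Copy Artifact step has run:
. ./build-info.txt
echo "Built by: $UPSTREAM_BUILD_URL"   # include this in the TestFlight notes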
Related
I have a job that is triggered as a post-build action for dozens of other jobs. It essentially organizes and processes the artifacts of those upstream jobs (using the Copy Artifact Plugin), and publishes the reformatted logs, along with the originals, as artifacts of its own.
I want the build summary pages of an upstream job to have a link to that downstream job. From what I gather, this is not an intended use case. Conventional wisdom seems to be that if we want a link to a downstream job, we should run it as a sub-project within the Build step of the upstream job. But if I do that, I don't have the artifacts to pass to the downstream job. Catch-22.
Is there something (even something really hacky and nasty) I can do to make this work? People want to get the processed artifacts directly from the build page.
One way (and I think the only way) to do this would be to call the Jenkins API from the downstream job to put a link to itself in the upstream job's description. But this seemed like more work than it was worth, so we just didn't do anything, and we're all fine.
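For reference, a rough sketch of what that API call could look like, writing into the upstream build's description via the submitDescription endpoint (UPSTREAM_JOB and UPSTREAM_BUILD are assumed parameters you would have to pass in yourself; a CSRF crumb may also be needed depending on your security settings):

# Shell step in the downstream job: link back to this build from the
# upstream build's description (UPSTREAM_JOB/UPSTREAM_BUILD are parameters)
curl -u user:apitoken -X POST \
  "$JENKINS_URL/job/$UPSTREAM_JOB/$UPSTREAM_BUILD/submitDescription" \
  --data-urlencode "description=Processed artifacts: $BUILD_URL"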
Does anyone know if it's possible to add a Jenkins pipeline build into a Jenkins Docker image? For example, I may have a Jenkinsfile that defines my pipeline in Groovy, and would like to ADD that into my image when building from the Jenkins image.
Something like:
FROM jenkins:latest
ADD ./jobs/Jenkinsfile-pipeline-example $JENKINS_HOME/${someplace}
And have that pipeline ready to go when I run it.
Thanks.
It's a lot cleaner to keep the Jenkinsfile in the repository it builds instead. This way, as your repositories develop, you can change the build process without needing to rebuild and redeploy your Jenkins instance every time (less work, and less CI downtime). Also, having the Jenkinsfile in source control allows simpler decoupling.
If you have any questions about extending Jenkins on Docker further to handle building NodeJS, Ruby, or something else, I go into how to do all that in an article.
You can create any job in Jenkins by passing in an XML file that describes the job. See https://support.cloudbees.com/hc/en-us/articles/220857567-How-to-create-a-job-using-the-REST-API-and-cURL
The way I've done this is to manually create the job I want in Jenkins, then append /config.xml to the job's URL; that shows you the XML content needed to generate the pipeline job. Save that XML and you can deliver it to your newly deployed Jenkins instance.
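A rough sketch of that round trip with cURL (the job name my-pipeline and the credentials are placeholders):

# Pull the XML of the hand-built job...
curl -u user:apitoken "$JENKINS_URL/job/my-pipeline/config.xml" -o config.xml

# ...and replay it into the new Jenkins instance as a fresh job
curl -u user:apitoken -X POST \
  -H "Content-Type: application/xml" \
  --data-binary @config.xml \
  "$JENKINS_URL/createItem?name=my-pipeline"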
I use a system similar to this to generate several hundred jobs based on our external build specifications.
I want to create a Jenkins job that starts other Jenkins jobs. That would be quite easy, because the Jenkins Template Project Plugin allows us to create a build step of type "use builders from another project". However, what makes my situation harder is that I have to start Jenkins jobs on other machines. Is there any standard way to do that?
If you only want to trigger a new build of a job, you have plenty of ways to accomplish it.
You can use the remote access API and send a request to build the target job from the source job:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Or you can use https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Remote+Trigger+Plugin,
which is handy for handling server details and other settings. You should ensure SSH keys are shared between both servers.
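A minimal sketch of the remote access API route, assuming the target job has "Trigger builds remotely" enabled with a token (the host and job names are placeholders):

# Plain trigger on the other machine's Jenkins
curl -X POST "https://other-jenkins.example.com/job/JOBNAME/build?token=TOKEN"

# With parameters, use buildWithParameters instead
curl -X POST "https://other-jenkins.example.com/job/JOBNAME/buildWithParameters?token=TOKEN&TARGET=staging"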
We are setting up a continuous delivery pipeline in Jenkins, using the build pipeline plugin.
Our deployment step uses a proprietary deploy tool (triggered by an HTTP request from Jenkins), but we need an additional Jenkins step for acceptance tests on the deployed project. So our deploy tool will need to trigger the last pipeline step.
The Jenkins setup for this is obvious:
For a manually triggered downstream build step, to add a build step that will wait for a manual trigger:
Select the Build Pipeline Plugin, Manually Execute Downstream Project check-box.
Enter the name(s) of the downstream projects in the Downstream Project Names field. (N.B. multiple projects can be specified by using a comma, like "abc, def".)
Source: Build Pipeline Plugin
The problem is: I can't seem to find a way to trigger this downstream build through a URL.
In fact I'd need the URL in the deploy job, so I can send it to the deploy tool as a callback URL. Can anybody help?
If I understand correctly, you want to use the remote access API, which to my knowledge is no different for a pipeline project than for a general one.
Take a look here:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Submitting jobs
Jobs without parameters
You merely need to perform an HTTP POST on JENKINS_URL/job/JOBNAME/build?token=TOKEN, where TOKEN is set up in the job configuration.
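For the deploy tool's callback, that boils down to a single request it can fire when the deployment finishes (the job name and token here are placeholders):

curl -X POST "$JENKINS_URL/job/acceptance-tests/build?token=TOKEN"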
As stated above by @rafal S, do the following:
Read a file that lists the project names for which a build has to be triggered, then do a curl HTTP POST on JENKINS_URL/job/${JOBNAME from the file}/build?token=TOKEN inside a loop that iterates over all the project names from the file, as in the sketch below.
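A small sketch of that loop (projects.txt is an assumed file with one job name per line; a while-read loop is used so job names survive word-splitting):

# Trigger each job named in projects.txt
while read -r job; do
  curl -X POST "$JENKINS_URL/job/${job}/build?token=TOKEN"
done < projects.txt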
Can Jenkins detect when a new build is available on a Bamboo server?
What I want is to create a Jenkins job that checks a Bamboo server for a new build. I want this job to run once per hour.
Then, other tests that I have on that Jenkins server will rely on that check passing in order for them to kick off.
If this is possible, what is the usual way of doing this? The Bamboo server is internal and does not need authentication to see status of builds or get build resources.
If there is no plugin for this, I do see an RSS feed at this URI: /rss/createAllBuildsRssFeed.action?feedType=rssAll&buildKey=RELEASE . What method would other Jenkins administrators use to read this feed?
I figured out the answer myself. I wrote a Gradle unit test to run in Jenkins that can read the RSS feed in Bamboo.
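For anyone who wants the same check without Gradle, a rough shell equivalent, assuming xmllint is installed on the build node and bamboo.example.com stands in for your server:

# Pull the newest entry's title from Bamboo's RSS feed; a periodic
# Jenkins job can compare it against the last value it saw
curl -s "http://bamboo.example.com/rss/createAllBuildsRssFeed.action?feedType=rssAll&buildKey=RELEASE" \
  | xmllint --xpath 'string(//item[1]/title)' -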
The real way to do it, though (which didn't answer my question), is to add a post-build hook to either Subversion or Bamboo that sends an HTTP GET request to Jenkins, which notifies a job to run.