Is there a way I can identify the trigger for the current build during execution? What I want is to identify whether the trigger was an SCM change, a cron trigger, or a user trigger. I have multiple triggers defined for a job and want to use the trigger type as a parameter in the shell execution script.
You can use the REST API to get this info; here's an example:
http://jenkins.yourdomain.com/job/job_name/build_number/api/json?tree=actions[causes[shortDescription]]&pretty=true
returns
{
  "actions" : [
    {
      "causes" : [
        {
          "shortDescription" : "Started by an SCM change"
        }
      ]
    },
    { },
    { },
    { },
    { },
    { },
    { }
  ]
}
One solution is to use the Run Condition Plugin which can run a different shell script depending on the trigger type. It is not the perfect solution, but it will do what you want.
You can also do it with a Groovy script. Check out my answer to Jenkins Groovy: What triggered the build.
You can get the Cause object and then check which subtype of Cause it is:
http://javadoc.jenkins-ci.org/hudson/model/Cause.html
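For illustration, here is a minimal, untested sketch (scripted pipeline / system Groovy) that branches on the concrete Cause subtype; note that currentBuild.rawBuild usually requires script approval when running sandboxed:

// Sketch only: inspect the causes of the running build and act on the subtype.
def causes = currentBuild.rawBuild.getCauses()
causes.each { cause ->
    if (cause instanceof hudson.triggers.SCMTrigger.SCMTriggerCause) {
        echo 'Triggered by an SCM change'
    } else if (cause instanceof hudson.triggers.TimerTrigger.TimerTriggerCause) {
        echo 'Triggered by a cron/timer schedule'
    } else if (cause instanceof hudson.model.Cause.UserIdCause) {
        echo "Started manually by ${cause.getUserId()}"
    } else {
        echo "Other cause: ${cause.getShortDescription()}"
    }
}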
At http(s)://(your-jenkins-server)/jenkins/job/(job-name)/(job-number), under the "Build Artifacts" and "Changes" sections (if you have them), there is a small cause icon; the text next to it states what caused the build.
I've been trying to construct multiple jobs from a list and everything seems to be working as expected. But as soon as I execute the first build (which works correctly) the parameters in the job disappear. This is how I've constructed the pipelineJob for the project.
import javaposse.jobdsl.dsl.DslFactory

def repositories = [
    [
        id         : 'jenkins-test',
        name       : 'jenkins-test',
        displayName: 'Jenkins Test',
        repo       : 'ssh://<JENKINS_BASE_URL>/<PROJECT_SLUG>/jenkins-test.git'
    ]
]

DslFactory dslFactory = this as DslFactory

repositories.each { repository ->
    pipelineJob(repository.name) {
        parameters {
            stringParam("BRANCH", "master", "")
        }
        logRotator {
            numToKeep(30)
        }
        authenticationToken('<TOKEN_MATCHES_WITH_THE_BITBUCKET_POST_RECEIVE_HOOK>')
        displayName(repository.displayName)
        description("Builds deploy pipelines for ${repository.displayName}")
        definition {
            cpsScm {
                scm {
                    git {
                        branch('${BRANCH}')
                        remote {
                            url(repository.repo)
                            credentials('<CREDENTIAL_NAME>')
                        }
                        extensions {
                            localBranch('${BRANCH}')
                            wipeOutWorkspace()
                            cloneOptions {
                                noTags(false)
                            }
                        }
                    }
                }
                scriptPath('Jenkinsfile')
            }
        }
    }
}
After running the above script, all the required jobs are created successfully. But then once I build any job, the parameters disappear.
After that, when I run the seed job again, the job starts showing the parameters again. I'm having a hard time figuring out where the problem is.
I've tried many things but nothing works. Would appreciate any help. Thanks.
This comment helped me to figure out a similar issue with my .groovy file:
I had called the parameters property twice (once at the start of the node and then again to set other parameters in an if block), so the latter call overwrote the initial parameters.
BTW, as per the comments in the linked ticket, it is an issue with both scripted and declarative pipelines.
Fixed by providing all job parameters in each parameters call - for the case with ifs.
Though I don't see repeated calls in the code you've provided, please check the full groovy files for your jobs and add all parameters to every parameters {} block.
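For reference, a minimal illustration (scripted pipeline, untested; SOME_CONDITION is a made-up example) of the overwrite pitfall described in that comment: every properties()/parameters call replaces the previous set, so each call has to list every parameter.

// First call defines BRANCH.
properties([
    parameters([
        string(name: 'BRANCH', defaultValue: 'master', description: '')
    ])
])

if (env.SOME_CONDITION == 'true') {
    // This call must repeat BRANCH as well, otherwise it disappears after the build.
    properties([
        parameters([
            string(name: 'BRANCH', defaultValue: 'master', description: ''),
            string(name: 'EXTRA', defaultValue: '', description: 'extra parameter')
        ])
    ])
}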
I have a multi-branch pipeline job set to build by Jenkinsfile every minute if new changes are available from the git repo. I have a step that deploys the artifact to an environment if the branch name is of a certain format. I would like to be able to configure the environment on a per-branch basis without having to edit Jenkinsfile every time I create a new such branch. Here is a rough sketch of my Jenkinsfile:
pipeline {
    agent any
    parameters {
        string(description: "DB name", name: "dbName")
    }
    stages {
        stage("Deploy") {
            steps {
                deployTo "${params.dbName}"
            }
        }
    }
}
Is there a Jenkins plugin that will let me define a default value for the dbName parameter per branch on the job configuration page? Ideally, the values could be reordered to set priority, the plugin would stop checking for matches after the first one, and matching could be exact or regex.
If there isn't such a plugin currently, please point me to the closest open-source one you can think of. I can use it as a basis for coding a custom plugin.
A possible plugin you could use as a starting point for a custom plugin is the Dynamic Parameter Plugin
Here is a workaround:
Using the Jenkins Config File Provider plugin, create a JSON config file with the parameters defined per branch. Example:
{
  "develop": {
    "dbName": "test_db",
    "param2": "value"
  },
  "master": {
    "dbName": "prod_db",
    "param2": "value1"
  },
  "test_branch_1": {
    "dbName": "zss_db",
    "param2": "value2"
  },
  "default": {
    "dbName": "prod_db",
    "param2": "value3"
  }
}
In your Jenkinsfile:
final commit_data = checkout(scm)
BRANCH = commit_data['GIT_BRANCH']

configFileProvider([configFile(fileId: '{Your config file id}', variable: 'BRANCH_SETTINGS')]) {
    def config = readJSON file: "$BRANCH_SETTINGS"
    def branch_config = config."${BRANCH}"
    if (branch_config) {
        echo "using config for branch ${BRANCH}"
    } else {
        branch_config = config.default
    }
    echo branch_config.'dbName'
}
You can then use branch_config.'dbName', branch_config.'param2', etc. You can even set it to a global variable and use it throughout your pipeline.
The config file can easily be edited via the Jenkins UI (provided by the plugin) to provision for new branches/params in the future. This doesn't need access to any non-sandbox methods.
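For example, here is a hedged sketch (reusing the placeholder fileId from above; the rest is illustrative) of keeping the branch settings in a script-level variable so later stages can reuse them:

// Sketch only: BRANCH_CONFIG is a script-level variable visible to later stages.
def BRANCH_CONFIG

node {
    final commit_data = checkout(scm)
    def branch = commit_data['GIT_BRANCH']
    configFileProvider([configFile(fileId: '{Your config file id}', variable: 'BRANCH_SETTINGS')]) {
        def config = readJSON file: "$BRANCH_SETTINGS"
        BRANCH_CONFIG = config."${branch}" ?: config.default
    }

    stage('Deploy') {
        echo "Deploying to ${BRANCH_CONFIG.dbName}"   // reuse anywhere later in the pipeline
    }
}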
Not really an answer to your question, but possibly a workaround...
I don't know what the rest of your parameter list looks like, but if it is a static list, you could potentially have your static list with a "use Default" option as the first one.
When the job is run, if the value is "use Default", then gather the default from a file stored in the SCM branch and use that.
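A rough sketch of that idea (the .db-default file name and the "use Default" choice value are made up for illustration):

// Hypothetical: dbName is a choice parameter whose first option is "use Default",
// and .db-default is a small file committed to each branch.
def dbName = params.dbName
if (dbName == 'use Default') {
    dbName = readFile('.db-default').trim()
}
echo "Using database: ${dbName}"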
I need to add the next build time scheduled in a build email notification after a build in Jenkins.
The trigger can be "Build periodically" or "Poll SCM", or anything with schedule time.
I know the trigger info is in the config.xml file, e.g.:
<triggers>
  <hudson.triggers.SCMTrigger>
    <spec>8 */2 * * 1-5</spec>
    <ignorePostCommitHooks>false</ignorePostCommitHooks>
  </hudson.triggers.SCMTrigger>
</triggers>
and I also know how to get the trigger type and spec with custom scripting from the config.xml file, and calculate the next build time.
I wonder if Jenkins has an API to expose this information out of the box. I have searched, but not found anything.
I realise you probably no longer need help with this, but I just had to solve the same problem, so here is a script you can use in the Jenkins console to output all trigger configurations:
#!groovy
Jenkins.instance.getAllItems().each { it ->
    if (!(it instanceof jenkins.triggers.SCMTriggerItem)) {
        return
    }
    def itTrigger = (jenkins.triggers.SCMTriggerItem) it
    def triggers = itTrigger.getSCMTrigger()
    println("Job ${it.name}:")
    triggers.each { t ->
        println("\t${t.getSpec()}")
        println("\t${t.isIgnorePostCommitHooks()}")
    }
}
This will output all your jobs that use SCM configuration, along with their specification (cron-like expression regarding when to run) and whether post-commit hooks are set to be ignored.
You can modify this script to get the data as JSON like this:
#!groovy
import groovy.json.*

def result = [:]
Jenkins.instance.getAllItems().each { it ->
    if (!(it instanceof jenkins.triggers.SCMTriggerItem)) {
        return
    }
    def itTrigger = (jenkins.triggers.SCMTriggerItem) it
    def triggers = itTrigger.getSCMTrigger()
    triggers.each { t ->
        def builder = new JsonBuilder()
        result[it.name] = builder {
            spec "${t.getSpec()}"
            ignorePostCommitHooks "${t.isIgnorePostCommitHooks()}"
        }
    }
}
return new JsonBuilder(result).toPrettyString()
And then you can use the Jenkins Script Console web API to get this from an HTTP client.
For example, in curl, you can do this by saving your script as a text file and then running:
curl --data-urlencode "script=$(<./script.groovy)" <YOUR SERVER>/scriptText
If Jenkins is using basic authentication, you can supply that with the -u <USERNAME>:<PASSWORD> argument.
Ultimately, the request will result in something like this:
{
  "Build Project 1": {
    "spec": "H/30 * * * *",
    "ignorePostCommitHooks": "false"
  },
  "Test Something": {
    "spec": "#hourly",
    "ignorePostCommitHooks": "false"
  },
  "Deploy ABC": {
    "spec": "H/20 * * * *",
    "ignorePostCommitHooks": "false"
  }
}
You should be able to tailor these examples to fit your specific use case. It seems you won't need to access this remotely but just from a job, but I also included the remoting part as it might come in handy for someone else.
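As for the original ask of calculating the next build time, here is a possible (untested) sketch using Jenkins' own cron parser; it assumes the spec does not use the 'H' hash syntax, which requires the two-argument CronTab constructor:

import hudson.scheduler.CronTab

def spec = '8 */2 * * 1-5'                         // spec read from the trigger, as above
def cron = new CronTab(spec)
def next = cron.ceil(System.currentTimeMillis())   // Calendar of the next scheduled run
println next.getTime()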
I'm using the Build Flow plugin to perform parallel builds. I need to pass a choice parameter (branch_name) from the parent job to the child jobs. I'm unsure how to get this working. The choice parameter has multiple branch names. How can I do this?
Here's a sample of the code:
// Here's where I set the variable for the choice parameter (branch_name)
branch_name = ${branch_name}
// Here's where I call the variable to pass to the other jobs
parallel (
    { build("build1", branch_name: params["branch_name"]) },
    { build("build2", branch_name: params["branch_name"]) },
    { build("build3", branch_name: params["branch_name"]) },
    { build("build4", branch_name: params["branch_name"]) }
)
What am I doing wrong? Please help. Thx.
I assume you got this working? For future Googlers, I believe it would be as follows:
// Get the parameter of the parent (flow) job - it should be available as an environment variable
def branch = build.environment.get("PARENT_PARAM_NAME")

parallel (
    // pass to child job
    { build("build1", CHILD_PARAM_NAME: branch) },
    // repeat as required
)
I can find out just about everything about my Jenkins server via the Remote API, but not the list of currently running jobs.
These,
http://my-jenkins/computer/api/json
or
http://my-jenkins/computer/(master)/api/json
would seem like the most logical choices, but they say nothing (other than the count of jobs) about which jobs are actually running.
There is often confusion between jobs and builds in Jenkins, especially since jobs are often referred to as 'build jobs'.
Jobs (or 'build jobs' or 'projects') contain configuration that describes what to run and how to run it.
Builds are executions of a job. A build contains information about the start and end time, the status, logging, etc.
See https://wiki.jenkins-ci.org/display/JENKINS/Building+a+software+project for more information.
If you want the jobs that are currently building (i.e. have one or more running builds), the fastest way is to use the REST API with XPath to filter on colors that end with _anime, like this:
http://jenkins.example.com/api/xml?tree=jobs[name,url,color]&xpath=/hudson/job[ends-with(color/text(),%22_anime%22)]&wrapper=jobs
will give you something like:
<jobs>
  <job>
    <name>PRE_DB</name>
    <url>http://jenkins.example.com/job/my_first_job/</url>
    <color>blue_anime</color>
  </job>
  <job>
    <name>SDD_Seller_Dashboard</name>
    <url>http://jenkins.example.com/job/my_second_job/</url>
    <color>blue_anime</color>
  </job>
</jobs>
Jenkins uses the color field to indicate the status of the job, where the _anime suffix indicates that the job is currently building.
Unfortunately, this won't give you any information on the actual running build. Multiple instances of the job may be running at the same time, and the running build is not always the last one started.
If you want to list all the running builds, you can also use the REST API to get a fast answer, like this:
http://jenkins.example.com/computer/api/xml?tree=computer[executors[currentExecutable[url]],oneOffExecutors[currentExecutable[url]]]&xpath=//url&wrapper=builds
which will give you something like:
<builds>
  <url>http://jenkins.example.com/job/my_first_job/1412/</url>
  <url>http://jenkins.example.com/job/my_first_job/1414/</url>
  <url>http://jenkins.example.com/job/my_second_job/13126/</url>
</builds>
Here you see a list of all the currently running builds. You will need to parse the URL to separate the job name from the build number. Notice how my_first_job has two builds that are currently running.
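A small Groovy sketch of that parsing step (it assumes a simple, non-foldered job path like the URLs above):

def url = 'http://jenkins.example.com/job/my_first_job/1412/'
def matcher = (url =~ '/job/([^/]+)/(\\d+)/?$')
if (matcher.find()) {
    def jobName = matcher.group(1)             // my_first_job
    def buildNumber = matcher.group(2) as int  // 1412
    println "${jobName} #${buildNumber}"
}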
I have a view defined using the View Job Filters Plugin that filters to just the currently running jobs; you can then use /api/json on the view page to see only the jobs that are running. I also have views for aborted, unstable, etc.
UPDATE
Select Edit View → Job Filters → Add Job Filter ▼ → Build Statuses Filter
Build Statuses: ☑ Currently Building
Match Type: Exclude Unmatched - ...
It's a bit of a hack, but I think you can infer which jobs are currently running by looking at the color key in the job objects when you do a GET at /jenkins/api/json?pretty=true. If the 'ball' icon for a given job in Jenkins is animated, we know it's running.
Have a look at the array of job objects in the JSON response:
{
  ...
  "jobs" : [
    {
      "name" : "Test Job 1",
      "url" : "http://localhost:8000/jenkins/job/Test%20Job%201/",
      "color" : "blue"
    },
    {
      "name" : "Test Job 2",
      "url" : "http://localhost:8000/jenkins/job/Test%20Job%202/",
      "color" : "blue_anime"
    }
    ...
  ]
}
In this case "color" : "blue_anime" indicates that the job is currently running, and "color" : "blue" indicates that the job is not running.
Hope this helps.
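For what it's worth, a quick Groovy sketch of that filtering (the base URL is a placeholder and authentication is not handled):

import groovy.json.JsonSlurper

def base = 'http://localhost:8000/jenkins'
def json = new JsonSlurper().parse(new URL("${base}/api/json?tree=jobs[name,color]"))

// Jobs whose ball icon is animated, i.e. currently building.
def building = json.jobs.findAll { it.color?.endsWith('_anime') }
building.each { println it.name }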
Marshal the output and filter for "building" : true from the following call to the JSON API on a job, using tree to filter out the extraneous stuff (hope this helps):
http://jenkins.<myCompany>.com/job/<myJob>/api/json?pretty=true&depth=2&tree=builds[builtOn,changeSet,duration,timestamp,id,building,actions[causes[userId]]]
will give you something like:
{
  "builds" : [
    {
      "actions" : [
        { },
        {
          "causes" : [
            {
              "userId" : "cheeseinvert"
            }
          ]
        },
        { },
        { },
        { },
        { }
      ],
      "building" : true,
      "duration" : 0,
      "id" : "2013-05-07_13-20-49",
      "timestamp" : 1367958049745,
      "builtOn" : "serverA",
      "changeSet" : {
      }
    }, ...
You can do this with the Jenkins tree API, using an endpoint like this:
http://<host>/api/json?tree=jobs[name,lastBuild[building,timestamp]]
You can see what attributes from lastBuild you can use if you access <job-endpoint>/lastBuild/api/json.
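A possible sketch of consuming that endpoint (untested; the host is a placeholder) and keeping only the jobs whose last build is currently building:

import groovy.json.JsonSlurper

def host = 'http://jenkins.example.com'
def json = new JsonSlurper().parse(new URL("${host}/api/json?tree=jobs[name,lastBuild[building,timestamp]]"))

json.jobs.findAll { it.lastBuild?.building }.each {
    println "${it.name} building since ${new Date(it.lastBuild.timestamp as long)}"
}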
I had a similar problem where some pipeline builds get stuck in the building state after I restart Jenkins (pipeline jobs are supposed to be durable and resume, but most of the time they get stuck indefinitely).
These builds do not use an executor so the only way to find them is to open every job.
All of the other answers seem to work when the project is considered building, i.e. when the last build is building, but they ignore older builds that are still building.
The following query works for me and gives me all the currently running builds, i.e.: they do not have a result.
http://localhost:8080/api/xml?tree=jobs[name,builds[fullDisplayName,id,number,timestamp,duration,result]]&xpath=/hudson/job/build[count(result)=0]&wrapper=builds
Nothing worked properly for me. I copied and modified code from python-jenkins. Since the master node name changed, it was throwing an exception, and I didn't want to rely on a plugin.
import re
from urllib.parse import urlparse

import jenkins  # python-jenkins; `server` below is assumed to be a jenkins.Jenkins(...) connection


def get_running_builds():
    builds = []
    nodes = server.get_nodes()
    for node in nodes:
        # the name returned is not the name to look up when
        # dealing with master :/
        if node['name'] == 'Built-In Node':
            continue
        if node['name'] == 'master':
            node_name = '(master)'
        else:
            node_name = node['name']
        try:
            info = server.get_node_info(node_name, depth=2)
        except jenkins.JenkinsException as e:
            # Jenkins may 500 on depth >0. If the node info comes back
            # at depth 0, treat it as a node not running any jobs.
            if ('[500]' in str(e) and
                    server.get_node_info(node_name, depth=0)):
                continue
            else:
                raise
        for executor in info['executors']:
            executable = executor['currentExecutable']
            if executable and 'number' in executable:
                executor_number = executor['number']
                build_number = executable['number']
                url = executable['url']
                m = re.search(r'/job/([^/]+)/.*', urlparse(url).path)
                job_name = m.group(1)
                builds.append({'name': executable['fullDisplayName'],
                               'number': build_number,
                               'url': url,
                               'node': node_name,
                               'executor': executor_number,
                               'timestamp': executable['timestamp']})
    return builds
The timestamp is given in milliseconds.