Is it possible to set a variable group scope using the DevOps CLI or via REST - azure-devops-rest-api

I am able to add/modify DevOps release definitions through a combination of CLI and CLI REST methods. The release definition object does not include (as far as I can tell) a property that controls the variable group scope. The release definition itself takes an array of variable group IDs, but there is also the scope of the variable group within the context of the release definition. Where is that?
Is there support for accessing the variable group scope property in the CLI or CLI REST interface? The image below shows the interface from the Azure portal. Selecting the ellipsis (...) you can "change scope", where a list of stages is displayed. You then save the scope selection and then save the release definition.
I captured Fiddler output, but the POST body was huge and not very helpful; I didn't see anything related to a list of scopes. But obviously this can be done, I'm just not sure how to do so via the CLI or REST.
Edit: Here is a view of the script. There is no "scope", which should contain a list of environment names, anywhere in the release definition that I can see. Each environment name (aka stage) contains a number of variable groups associated with that environment.
$sourcedefinition = getreleasedefinitionrequestwithpat $reldefid $personalAccesstoken $org $project | select -Last 1
Write-Host "Root VariableGroups: " $sourcedefinition.variableGroups
$result = @()
# search each stage in the pipeline
foreach ($item in $sourcedefinition.environments)
{
    Write-Host ""
    Write-Host "environment name: " $item.name
    Write-Host "environment variable groups: " $item.variableGroups
}
To help clarify, the scope I seek cannot be in the environments collection as this is specific to each element (stage). The scope is set at the release definition level for a given variable group (again refer to the image).

I use this API to get the definition of my release and find that the values of variableGroups in ReleaseDefinition and in ReleaseDefinitionEnvironment are different when the scopes are different.
So I think that if we want to change the scope via the REST API, we just need to change the variableGroups values and update the definition. We can use this API to update the definition.
Edit:
For example, to change my scope from Release to Stage, I use the API like below:
PUT https://vsrm.dev.azure.com/{organization}/{project}/_apis/release/definitions?api-version=6.1-preview.4
Request body (I get this from the Get Definitions API response body and make some changes before using it):
{
  "source": "userInterface",
  "revision": 6,
  ...
  "lastRelease": {
    "id": 1,
    ...
  },
  ...
  "variables": {},
  "variableGroups": [],
  "environments": [
    {
      "name": "Stage 1",
      ...
      "variables": {},
      "variableGroups": [
        4
      ],
      ...
    }
  ],
  ...
}
Note:
Please use your own, newer revision.
The id value in lastRelease is your release definitionId.
Specify the stage name in the environments name field.
The variableGroups value under environments is the ID of the variable group whose scope you want to change.
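If you want to script the whole change rather than hand-edit the body, below is a minimal sketch of the same GET-modify-PUT flow using the two Definitions APIs mentioned above, written as a standalone Groovy script. The organization, project, definition ID, stage name ("Stage 1") and variable group ID (4) are placeholders for this example, and the PAT is assumed to be in an environment variable.
import groovy.json.JsonSlurper
import groovy.json.JsonOutput

def org     = 'myorg'      // placeholder
def project = 'myproject'  // placeholder
def defId   = 1            // placeholder
def pat     = System.getenv('AZURE_DEVOPS_PAT')
def auth    = 'Basic ' + ":${pat}".bytes.encodeBase64().toString()
def base    = "https://vsrm.dev.azure.com/${org}/${project}/_apis/release/definitions"

// GET the current definition first, so the revision sent back matches what the service expects
def getConn = new URL("${base}/${defId}?api-version=6.1-preview.4").openConnection()
getConn.setRequestProperty('Authorization', auth)
def definition = new JsonSlurper().parse(getConn.inputStream)

// Move variable group 4 from release scope into the "Stage 1" environment
definition.variableGroups = (definition.variableGroups ?: []).findAll { it != 4 }
def stage = definition.environments.find { it.name == 'Stage 1' }
stage.variableGroups = (stage.variableGroups ?: []) + 4

// PUT the whole definition back to apply the scope change
def putConn = new URL("${base}?api-version=6.1-preview.4").openConnection()
putConn.requestMethod = 'PUT'
putConn.doOutput = true
putConn.setRequestProperty('Authorization', auth)
putConn.setRequestProperty('Content-Type', 'application/json')
putConn.outputStream.withWriter('UTF-8') { it << JsonOutput.toJson(definition) }
println "Update returned HTTP ${putConn.responseCode}"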

Related

Jenkins - Not getting desired options while using Active Choices and Active Choices Reactive Parameter

We have a requirement to select an Environment from the Jenkins UI to run the feature files.
Different Environment options: QA, UAT, PROD
Based on the Environment selected from the drop-down, all the available tenants, i.e. the different servers associated with that Environment, should be visible.
To achieve this, I've used an "Active Choices Parameter" for Environment.
Name: "Environment"
Groovy Script: return ['QA','UAT']
Fallback Script: return['error']
Choice Type: Single Select
To select the server or tenant on the basis of Environment selection, I've used "Active Choices Reactive Parameter"
Name: Tenants
Groovy Script:
if (Environment.equals("QA")) {
    return ['http://node-1.nginx.portal.daa-1.can.qa.aws.abc.net/login':'CAN','http://node-1.nginx.portal.daa-1.wan.qa.aws.abc.net/login':'WAN']
} else if (Environment.equals("UAT")) {
    return ['https://can.uat.daa.app/login':'CANUAT','https://blic.uat.daa.app/login':'BLIC']
} else if (Environment.equals("PROD")) {
    return ['http://node-1.nginx.portal.daa-1.can.qa.aws.abc.net/login':'CANPROD','http://node-1.nginx.portal.daa-1.blic.qa.aws.abc.net/login':'BLIC']
} else {
    return ["Unknown"]
}
Fallback Script: return['error']
Choice Type: Single Select
After applying and saving this configuration, I'm getting ERROR in the Tenants drop-down.
It seems I'm making a minor mistake but I'm unable to catch it.
Getting ERROR in the drop-down of the Tenants option
Environment should be prefaced with a '$' (as in $Environment) in the groovy script to signify it is a variable.
And did you specify the 'Referenced parameters' as 'Environment' in the Active Choices Reactive Parameter settings?:
referencedParameters: 'Environment',
You can test out your groovy script in the script console at http://your-jenkins-server/script
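For reference, here is a minimal sketch of the complete Tenants script (reusing the URLs from the question), assuming Environment is listed under Referenced parameters so the plugin makes it available to the script:
// Active Choices Reactive Parameter "Tenants" - Environment is injected by the
// plugin because it is declared under Referenced parameters
if (Environment.equals("QA")) {
    return ['http://node-1.nginx.portal.daa-1.can.qa.aws.abc.net/login':'CAN',
            'http://node-1.nginx.portal.daa-1.wan.qa.aws.abc.net/login':'WAN']
} else if (Environment.equals("UAT")) {
    return ['https://can.uat.daa.app/login':'CANUAT',
            'https://blic.uat.daa.app/login':'BLIC']
} else if (Environment.equals("PROD")) {
    return ['http://node-1.nginx.portal.daa-1.can.qa.aws.abc.net/login':'CANPROD',
            'http://node-1.nginx.portal.daa-1.blic.qa.aws.abc.net/login':'BLIC']
} else {
    return ['Unknown']
}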

Using global shared libraries in Jenkins to define parameter options

I am trying to use a global class that I've defined in a shared library to help organise job parameters. It's not working, and I'm not even sure if it is possible.
My job looks something like this:
pipelineJob('My-Job') {
    definition {
        // Job definition goes here
    }
    parameters {
        choiceParam('awsAccount', awsAccount.ALL)
    }
}
In a file in /vars/awsAccount.groovy I have the following code:
class awsAccount implements Serializable {
    final String SANDPIT = "sandpit"
    final String DEV = "dev"
    final String PROD = "prod"
    static String[] ALL = [SANDPIT, DEV, PROD]
}
Global pipeline libraries are configured to load implicitly from my repository's master branch.
When attempting to update the DSL scripts I receive the error:
ERROR: (myJob.groovy, line 67) No such property: awsAccount for class: javaposse.jobdsl.dsl.helpers.BuildParametersContext
Why does it not find the class, and is it even possible to use shared library classes like this in a pipeline job?
Disclaimer: I know this works using a Jenkinsfile. Unfortunately, I have not tested it using Declarative Pipelines - but there are no answers yet, so it may be worth a try.
Regarding your first question: there are several reasons why a class from your shared lib might not be found, starting from the library import, the library syntax, etc. But they definitely work for DSL. To be more precise about it, additional information would be needed. But be sure that:
You have your Groovy class definition using exactly the directory structure described in the documentation (https://www.jenkins.io/doc/book/pipeline/shared-libraries/)
You give a name to the shared lib in Jenkins as you configure it and make sure it is exactly the name you use in the import
You use the import as described in the documentation (under Using Libraries)
Regarding your second question (the one that names this SO question): yes, you can build job parameters from information in your shared lib. At least, using Jenkinsfiles. You can even define properties to be included in the pipeline. I got it working with a tricky syntax due to various problems.
Again, I am using Jenkinsfile and this is what worked for me:
In my shared-lib class, I added a static function that introduces the build parameters. Notice the input parameters that function needs and its usage:
class awsAccount implements Serializable {
    //
    static giveMeParameters (script) {
        return [
            // Some params
            script.string(defaultValue: '', description: 'A default parameter', name: 'textParm'),
            script.booleanParam(defaultValue: false, description: 'If set to True, do whatever you need - otherwise, do not do it', name: 'boolOption'),
        ]
    }
}
To introduce those parameters in the pipeline, you need to place the returned value of the function into the parameters array:
properties (
    parameters (
        awsAccount.giveMeParameters (this)
    )
)
Again, notice the syntax when calling the function. Similar to this, you can also define functions in the shared-lib that return properties and use them in multiple jobs (disableConcurrentBuilds, buildDiscarder, etc)
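Putting it together, a minimal Jenkinsfile sketch could look like the following. The library name my-shared-lib is an assumption and must match the name configured under Global Pipeline Libraries, and the parameter names come from the giveMeParameters example above.
// Jenkinsfile (scripted) - load the shared library and register the parameters it defines
@Library('my-shared-lib') _   // assumption: use the library name configured in Jenkins

properties([
    parameters(
        awsAccount.giveMeParameters(this)
    )
])

node {
    stage('Show parameters') {
        // textParm and boolOption are the names returned by giveMeParameters above
        echo "textParm = ${params.textParm}, boolOption = ${params.boolOption}"
    }
}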

How to pass the value entered in the text box of an Active Choices Reactive Parameter to the job

When I'm in "Build with Parameters" before starting the job, I want to pass the value entered in a text box to the job. I'm using an Active Choices and an Active Choices Reactive Parameter like this:
This is the Groovy script which I then use to run the job and show the output, but I'm getting NULL from the echo command.
node {
    def commit = params.val
    stage ('Pulling code from Bitbucket') {
        git branch: 'master',
            credentialsId: '2bbc73c4-254e-45bd-85f4-6a169699310c',
            url: 'git@bitbucket.org:repo/test.git'
        sh (""" echo ${commit}""")
    }
}
What is the correct way to pass a parameter into the build?
From your output, you have defined a parameter named ID1 that references another parameter named OPTIONS. The correct way to reference these parameters is params.ID1 and params.OPTIONS. I can't see a parameter named val that could be addressed by params.val.
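In the question's scripted pipeline that would look roughly like this; ID1 and OPTIONS are the parameter names from the screenshot, so substitute your own:
node {
    // Read the Active Choices values by the names they were given in the job configuration
    def commit = params.ID1      // the reactive (text box) parameter
    def option = params.OPTIONS  // the parameter it references
    stage('Show parameters') {
        echo "commit = ${commit}, option = ${option}"
    }
}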

Using Key Value Pair in Jenkins

This is my requirement for parameters in Jenkins:
1. The user selects one of 3 values from a dropdown: DEV, QA, PROD
2. Upon selection I need to return a single value as a parameter, like this:
If DEV is selected, return "Development http://dev.com 1"
If QA is selected, return "QA http://qa.com 2"
If PROD is selected, return "Production http://prod.com 3"
3. Once the value is returned in a variable, I will use that variable's value in the next 'Windows batch command' step.
Where and how can I define the key/values? I tried to use the Extended Choice Parameter plugin, but I'm not sure how to do this.
I have managed to get a key/value dropdown select parameter working with the Active Choices Plugin. It's not as complicated as the other answer here, and it's actually buried in the comments on the plugin page itself.
To get a key/value pair select dropdown list parameter in Jenkins (i.e. show a human-readable value, but pass a different key to the build), you simply need to use a map rather than a list when writing your Groovy script. The map key is what the parameter will be set to if the user selects this option. The map value is what will actually be displayed to the user in the dropdown list.
For example, the script return ['Key1':'Display 1', 'Key2':'Display 2', 'Key3':'Display 3'] will display a dropdown containing Display 1, Display 2 and Display 3 to the user. However, the build parameter will actually be set to Key1, Key2 or Key3 depending on what is selected.
For this specific question, here are the steps.
Ensure you have the Active Choices Plugin installed.
Open the configuration of your Jenkins job, select This project is parameterised.
Click Add Parameter and select Active Choices Parameter.
Name your parameter and click the Groovy Script check box.
In Groovy Script enter the content: return ['Development http://dev.com 1':'DEV', 'QA http://qa.com 2':'QA', 'Production http://prod.com 3':'PROD'] For this example the user will see a dropdown with 3 options: DEV, QA and PROD. The value passed for this parameter will be Development http://dev.com 1, etc. Now, having a parameter with spaces and URLs may cause issues depending on how you use it later in the build, but the concept is really what I'm trying to illustrate.
This can be achieved using the Active Choices plugin; see the link and image below for reference.
Plugin reference: https://wiki.jenkins-ci.org/display/JENKINS/Active+Choices+Plugin
Another method without any plugins
This can also be achieved using a shell script.
Add a shell script build step with the script below, which will return your values. Let's say the dropdown parameter name is "env".
if [ "$env" == "DEV" ]
then
    url="Development http://dev.com 1"
elif [ "$env" == "QA" ]
then
    url="QA http://qa.com 2"
elif [ "$env" == "PROD" ]
then
    url="Production http://prod.com 3"
fi
The $url variable will then hold the expected value, which can be used in your next build steps.
Shell Script Reference: http://www.tutorialspoint.com/unix/if-elif-statement.htm
You can do the mapping in a groovy script. If you have a parameter named InputParam, you can map it to a new parameter called OutParam in a System Groovy Script like so:
import hudson.model.*

def parameterMap = [:]
parameterMap.put('DEV', 'Development http://dev.com 1')
parameterMap.put('QA', 'QA http://qa.com 2')
parameterMap.put('PROD', 'Production http://prod.com 3')

def buildMap = build.getBuildVariables()
def inputValue = buildMap['InputParam']
buildMap['OutParam'] = parameterMap[inputValue]
setBuildParameters(buildMap)

def setBuildParameters(map) {
    def npl = new ArrayList<StringParameterValue>()
    for (e in map) {
        npl.add(new StringParameterValue(e.key.toString(), e.value.toString()))
    }
    def newPa = null
    def oldPa = build.getAction(ParametersAction.class)
    if (oldPa != null) {
        build.actions.remove(oldPa)
        newPa = oldPa.createUpdated(npl)
    } else {
        newPa = new ParametersAction(npl)
    }
    build.actions.add(newPa)
}
Choose Execute system Groovy script as the first build action. You can then access the output param as an environment variable in the Windows shell, e.g.
ECHO %OUTPARAM%

From Jenkins, how do I get a list of the currently running jobs in JSON?

I can find out just about everything about my Jenkins server via the Remote API, but not the list of currently running jobs.
This,
http://my-jenkins/computer/api/json
or
http://my-jenkins/computer/(master)/api/json
would seem like the most logical choices, but they say nothing (other than the count of jobs) about which jobs are actually running.
There is often confusion between jobs and builds in Jenkins, especially since jobs are often referred to as 'build jobs'.
Jobs (or 'build jobs' or 'projects') contain configuration that describes what to run and how to run it.
Builds are executions of a job. A build contains information about the start and end time, the status, logging, etc.
See https://wiki.jenkins-ci.org/display/JENKINS/Building+a+software+project for more information.
If you want the jobs that are currently building (i.e. have one or more running builds), the fastest way is to use the REST API with XPath to filter on colors that end with _anime, like this:
http://jenkins.example.com/api/xml?tree=jobs[name,url,color]&xpath=/hudson/job[ends-with(color/text(),%22_anime%22)]&wrapper=jobs
will give you something like:
<jobs>
  <job>
    <name>PRE_DB</name>
    <url>http://jenkins.example.com/job/my_first_job/</url>
    <color>blue_anime</color>
  </job>
  <job>
    <name>SDD_Seller_Dashboard</name>
    <url>http://jenkins.example.com/job/my_second_job/</url>
    <color>blue_anime</color>
  </job>
</jobs>
Jenkins uses the color field to indicate the status of the job, where the _anime suffix indicates that the job is currently building.
Unfortunately, this won't give you any information on the actual running build. Multiple instances of the job may be running at the same time, and the running build is not always the last one started.
If you want to list all the running builds, you can also use the REST API to get a fast answer, like this:
http://jenkins.example.com/computer/api/xml?tree=computer[executors[currentExecutable[url]],oneOffExecutors[currentExecutable[url]]]&xpath=//url&wrapper=builds
will give you something like:
<builds>
  <url>http://jenkins.example.com/job/my_first_job/1412/</url>
  <url>http://jenkins.example.com/job/my_first_job/1414/</url>
  <url>http://jenkins.example.com/job/my_second_job/13126/</url>
</builds>
Here you see a list of all the currently running builds. You will need to parse the URL to separate the job name from the build number. Notice how my_first_job has two builds that are currently running.
I have a view defined using the View Job Filters Plugin that shows just the currently running jobs; you can then use /api/json on the view page to see just the jobs that are running. I also have views for aborted, unstable, etc.
UPDATE
Select Edit View → Job Filters → Add Job Filter ▼ → Build Statuses Filter
Build Statuses: ☑ Currently Building
Match Type: Exclude Unmatched - ...
It's a bit of a hack, but I think you can infer which jobs are currently running by looking at the color key in the job objects when you do a GET at /jenkins/api/json?pretty=true. If the 'ball' icon for a given job in Jenkins is animated, we know it's running.
Have a look at the array of job objects in the JSON response:
{
  ...
  "jobs" : [
    {
      "name" : "Test Job 1",
      "url" : "http://localhost:8000/jenkins/job/Test%20Job%201/",
      "color" : "blue"
    },
    {
      "name" : "Test Job 2",
      "url" : "http://localhost:8000/jenkins/job/Test%20Job%202/",
      "color" : "blue_anime"
    }
    ...
  ]
}
In this case "color" : "blue_anime" indicates that the job is currently running, and "color" : "blue" indicates that the job is not running.
Hope this helps.
Marshal the output and filter for "building" : true from the following call to the JSON API on a job, using tree to filter out the extraneous stuff (hope this helps):
http://jenkins.<myCompany>.com/job/<myJob>/api/json?pretty=true&depth=2&tree=builds[builtOn,changeSet,duration,timestamp,id,building,actions[causes[userId]]]
will give you something like:
{
  "builds" : [
    {
      "actions" : [
        { },
        {
          "causes" : [
            {
              "userId" : "cheeseinvert"
            }
          ]
        },
        { },
        { },
        { },
        { }
      ],
      "building" : true,
      "duration" : 0,
      "id" : "2013-05-07_13-20-49",
      "timestamp" : 1367958049745,
      "builtOn" : "serverA",
      "changeSet" : { }
    }, ...
You can do this with the Jenkins tree API, using an endpoint like this:
http://<host>/api/json?tree=jobs[name,lastBuild[building,timestamp]]
You can see what attributes from lastBuild you can use if you access <job-endpoint>/lastBuild/api/json.
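For example, here is a small standalone Groovy sketch that calls that endpoint and prints the jobs whose last build is still running; the host and credentials are placeholders:
import groovy.json.JsonSlurper

def host = 'http://jenkins.example.com'                                 // placeholder
def auth = 'Basic ' + 'user:apitoken'.bytes.encodeBase64().toString()   // placeholder credentials

def conn = new URL("${host}/api/json?tree=jobs[name,lastBuild[building,timestamp]]").openConnection()
conn.setRequestProperty('Authorization', auth)
def root = new JsonSlurper().parse(conn.inputStream)

// A job counts as "currently building" when its lastBuild reports building == true
root.jobs.findAll { it.lastBuild?.building }.each { job ->
    println "${job.name} - started ${new Date(job.lastBuild.timestamp)}"
}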
I had a similar problem where some pipeline builds get stuck in the building state after I restart Jenkins (pipeline jobs are supposed to be durable and resume, but most of the time they get stuck indefinitely).
These builds do not use an executor, so the only way to find them is to open every job.
All of the other answers seem to work when the project is considered building, i.e. the last build is building. But they ignore past builds that are still building.
The following query works for me and gives me all the currently running builds, i.e. those that do not have a result yet.
http://localhost:8080/api/xml?tree=jobs[name,builds[fullDisplayName,id,number,timestamp,duration,result]]&xpath=/hudson/job/build[count(result)=0]&wrapper=builds
Nothing worked properly for me. I copied and modified code from python-jenkins. Since the master node name changed, it was giving an exception. I didn't want to rely on a plugin.
import re
from urllib.parse import urlparse

import jenkins  # python-jenkins

# server is a python-jenkins connection, e.g.:
# server = jenkins.Jenkins('http://jenkins.example.com', username='user', password='apitoken')

def get_running_builds():
    builds = []
    nodes = server.get_nodes()
    for node in nodes:
        # the name returned is not the name to look up when
        # dealing with master :/
        if node['name'] == 'Built-In Node':
            continue
        if node['name'] == 'master':
            node_name = '(master)'
        else:
            node_name = node['name']
        try:
            info = server.get_node_info(node_name, depth=2)
        except jenkins.JenkinsException as e:
            # Jenkins may 500 on depth >0. If the node info comes back
            # at depth 0, treat it as a node not running any jobs.
            if ('[500]' in str(e) and
                    server.get_node_info(node_name, depth=0)):
                continue
            else:
                raise
        for executor in info['executors']:
            executable = executor['currentExecutable']
            if executable and 'number' in executable:
                # print(f'{executable}')
                executor_number = executor['number']
                build_number = executable['number']
                url = executable['url']
                m = re.search(r'/job/([^/]+)/.*', urlparse(url).path)
                job_name = m.group(1)
                builds.append({'name': executable['fullDisplayName'],
                               'number': build_number,
                               'url': url,
                               'node': node_name,
                               'executor': executor_number,
                               'timestamp': executable['timestamp']})
    return builds
timestamp gives the time in milliseconds.
