Given a Jenkins build job with different promotion jobs (i.e., that promote builds to different environments), how can one trigger a specific promotion job for a specific build using the Jenkins API?
Combined answers from different sources to come up with this:
$Username = "Username"
$APItoken = '12345'
$Credential = "$($Username):$($APItoken)"
$EncodedCredential = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($Credential))
$BasicAuthValue = "Basic $EncodedCredential"
$Headers = @{
    Authorization = $BasicAuthValue
}
# "jobname", "buildnumber" and "PromoteToEnvironment" in the URL below are placeholders for your job name, build number and promotion name
Write-Output "Promoting build $LatestBuildNumber to Environment..."
Invoke-WebRequest -Uri "http://jenkins.prd.company.com/job/jobname/buildnumber/promotion/forcePromotion?name=PromoteToEnvironment" -Headers $Headers
I know this is an old thread, but just to help the community.
Shell Solution using CURL:
user_name="jenkins_user"
user_token="token"
promotion_name="Test_Promote"
jenkins_url="http://build-server.com"
JOB_NAME="job_name"
JOB_NO="job-no"
url="--silent -u $user_name:$user_token $jenkins_url/job/$JOB_NAME/$JOB_NO/promotion/forcePromotion?name=$promotion_name"
curl $url
How to generate a Jenkins user token: https://jenkins.io/blog/2018/07/02/new-api-token-system/
My code is in a TFS repository, but for some reason a few files are in SharePoint/MS Teams. How can we clone code from both sources in the build definition?
The Get Sources task is the default that clones the specified TFS repository; is there a way to add or edit this task to also pull the files from SharePoint?
You cannot edit the Get Sources task to clone code from SharePoint.
However, you can use a PowerShell task to download the files from SharePoint.
For example, add a PowerShell task to your pipeline to run the inline scripts below:
Using WebClient
$SharePointFile = "https://the.server/path/to/the/file.txt"
$Path = "$(Build.SourcesDirectory)\file.txt"
#User Information
$Username = "userName"
$Password = "password"
#Download Files
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.Networkcredential($UserName, $Password)
$client.DownloadFile($SharePointFile, $Path)
$client.Dispose()
Using Invoke-WebRequest
$User = "userName"
$PWord = ConvertTo-SecureString -String "password" -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $PWord
$url = 'https://the.server/path/to/the/file.txt'
$outfile = "$(Build.SourcesDirectory)\file.txt"
Invoke-WebRequest -Uri $url -OutFile $outfile -Credential $Credential
Either script above will download the file from your SharePoint server to the source code folder $(Build.SourcesDirectory) on the agent machine (e.g. C:\agent\_work\1\s).
You can also use the SharePoint PnP PowerShell module to download the files in a PowerShell task. See the example in this blog.
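For reference, here is a minimal sketch of the PnP approach (assumptions: the legacy SharePointPnPPowerShellOnline module is installed on the agent, and the site URL, file path, and credentials below are placeholders you must replace):
#Assumption: SharePointPnPPowerShellOnline module is available on the agent; values below are placeholders
$SiteUrl = "https://tenant.sharepoint.com/sites/TeamSite"
$Password = ConvertTo-SecureString -String "password" -AsPlainText -Force
$Cred = New-Object System.Management.Automation.PSCredential ("userName", $Password)
#Connect to the site and download a single file into the build sources folder
Connect-PnPOnline -Url $SiteUrl -Credentials $Cred
Get-PnPFile -Url "/sites/TeamSite/Shared Documents/file.txt" -Path "$(Build.SourcesDirectory)" -FileName "file.txt" -AsFile -Force
Disconnect-PnPOnline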
This question is a continuation of my other question - How to schedule an on-premise Azure DevOps build to run every 5 minutes?
I cannot figure out how to script the schedule for a build. What API am I supposed to use?
EDIT 1
I want to emphasize - I do not want to queue the build myself every 5 minutes. I want to script the schedule of the build. So, I am at the Definition Update REST API - https://learn.microsoft.com/en-us/rest/api/azure/devops/build/definitions/update?view=azure-devops-rest-5.1 and still do not understand how to update the build definition's schedule. The advice to open Fiddler and reverse engineer the API makes me think this is not documented. Does it mean that whatever I implement based on the traffic analysis may break on the next release?
EDIT 2
Using the proposed solution works. Here is my code, based on the provided answer. I had to change two things:
The body should be a scalar object, not an array. So, I convert $BuildDefinition rather than @($BuildDefinition) to JSON.
I use Windows authentication, because we have an on-premise Azure DevOps server.
# Replace the triggers section and PUT the updated definition back (Windows authentication for the on-premise server)
$BuildDefinition | Add-Member triggers $triggers -Force
$json = ConvertTo-Json $BuildDefinition -Depth 99
$url = $BuildDefinition.url -replace '(.+)\?.+',"`$1?api-version=5.0"
Invoke-RestMethod -Uri $url -Method Put -Body $json -ContentType "application/json" -UseDefaultCredentials
However, the build definition object must be obtained through the GET API, not the LIST API. The latter returns a reduced version of the build definition, which cannot be used to update it.
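For reference, a minimal sketch of fetching the full definition through the GET API (the collection URL, project name and definition ID are placeholders; Windows authentication as above):
#Placeholders: replace with your collection URL, project and definition ID
$collectionUrl = "https://server/DefaultCollection"
$project = "ProjectName"
$definitionId = 183
#Definitions - Get returns the full definition that can be PUT back; the list endpoint (no ID) returns only a reduced reference
$defUrl = "$collectionUrl/$project/_apis/build/definitions/$($definitionId)?api-version=5.0"
$BuildDefinition = Invoke-RestMethod -Uri $defUrl -Method Get -UseDefaultCredentials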
EDIT 3
It is very important to specify the branch using the full ref notation, i.e. refs/heads/master instead of just master. Using the short name appears to work - the schedules are created and the branch filter looks correct - but the trigger never fires. The problem is that the GUI gives no indication that anything is wrong.
If you mean setting the build schedule using the REST API, then you can use Definitions - Update.
You can also press F12 in the browser to track the API calls when setting the schedule from the UI.
Back to your requirement:
How to schedule an on-premise Azure DevOps build to run every 5 minutes?
Just as you mentioned, currently on-premise Azure DevOps Server does not support schedules in YAML, and the UI for defining time-based build triggers isn't flexible enough.
So we cannot achieve this with the built-in features alone.
However, we can call the queue build REST API to queue the build every 5 minutes. There are two ways to do that:
Write a script to call the queue build REST API, then run it periodically on a client machine with Windows Task Scheduler (see the Task Scheduler sketch after the example script below). Reference the blogs below:
How to schedule a Batch File to run automatically in Windows
Run a task every x-minutes with Windows Task Scheduler
Hard-code the loop in the script and run it from a console on any client that can access the Azure DevOps Server (the PowerShell script below works for me):
Example:
Param(
[string]$collectionurl = "https://server/DefaultCollection",
[string]$projectName = "ProjectName",
[string]$BuildDefinitionId = "11",
[string]$user = "username",
[string]$token = "password/PAT"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
function CreateJsonBody
{
    $value = @"
{
  "definition": {
    "id": $BuildDefinitionId
  }
}
"@
    return $value
}
$json = CreateJsonBody
$uri = "$($collectionurl)/$($projectName)/_apis/build/builds?api-version=5.1"
$EndTime = Get-Date
while($true) {
    $EndTime = $EndTime.AddMinutes(5)
    ###Queue build###
    $result = Invoke-RestMethod -Uri $uri -Method Post -Body $json -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
    Start-Sleep -Seconds $( [int]( New-TimeSpan -End $EndTime ).TotalSeconds )
}
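For the first option, here is a minimal sketch of registering such a script with Windows Task Scheduler (assumptions: the queue-build part of the script above is saved as C:\scripts\QueueBuild.ps1 without the while loop, and the ScheduledTasks module that ships with Windows 8/Server 2012 and later is available):
# Assumption: the queue-build call above is saved (without the while loop) as C:\scripts\QueueBuild.ps1
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -File C:\scripts\QueueBuild.ps1"
# Repeat every 5 minutes, starting now
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5) -RepetitionDuration ([TimeSpan]::MaxValue)
Register-ScheduledTask -TaskName "QueueBuildEvery5Minutes" -Action $action -Trigger $trigger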
UPDATE1:
To update the build definition with the schedule trigger enabled, we need to append the trigger attributes to the request body.
GET the build definition by calling the REST API and use the response as the request body.
Append the triggers attribute to that request body:
"triggers": [
{
"schedules": [
{
"branchFilters": [
"+refs/heads/master"
],
"timeZoneId": "UTC",
"startHours": 5,
"startMinutes": 20,
"daysToBuild": 31,
"scheduleJobId": "5e8e3663-2d1c-482e-bb4d-91f804755010",
"scheduleOnlyWithChanges": true
}
],
"triggerType": "schedule"
}
]
UPDATE2:
Well, you can use the PowerShell script below to enable/update the build schedule trigger by updating the build definition:
Param(
[string]$collectionurl = "https://server/DefaultCollection",
[string]$project = "projectname",
[string]$definitionid = "183",
[string]$user = "username",
[string]$token = "password/PAT"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
$ErrorActionPreference = 'SilentlyContinue'
#Get response of the build definition
$defurl = "$collectionurl/$project/_apis/build/definitions/$($definitionid)?api-version=5.1"
$definition = Invoke-RestMethod -Uri $defurl -Method Get -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
#Set trigger array
$triggers = '
[{
    "schedules": [
        {
            "branchFilters": [
                "+refs/heads/master"
            ],
            "timeZoneId": "UTC",
            "startHours": 9,
            "startMinutes": 40,
            "daysToBuild": 31,
            "scheduleOnlyWithChanges": true
        }
    ],
    "triggerType": "schedule"
}]'
cls
#Add a trigger block to the response body
$definition | Add-Member -NotePropertyName "triggers" -NotePropertyValue (Convertfrom-Json $triggers) -Force
Remove-TypeData System.Array # Remove the redundant ETS-supplied .Count and values property
#Convert the response body to Json
$json = @($definition) | ConvertTo-Json -Depth 99
#Update build definition
$updatedef = Invoke-RestMethod -Uri $defurl -Method Put -Body $json -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
Write-Host ($updatedef.triggers | ConvertTo-Json -Depth 99)
I'm using the "Post to Slack" task as one of my build steps in TFS 2018 and I'm wondering how to access variables relating to that commit. I would like to include them as part of the Message field (something like "Commit: $(CommitMessage) link to changeset $(ChangesetLink)" but those variables don't exist). Here is where I need to reference the variables in TFS:
This document: link describes how to access build variables but it doesn't mention anything relating to the commit. I would like to access the commit message, associated commit changesets and the link to the changeset(s) associated with the commit. Does anyone know how to do this or know where I can find documentation for it? Thank you
Cruiser is right, there are no such predefined variables in TFS. You can retrieve the needed information via the REST API, then set corresponding variables using the Logging Commands.
Create a PowerShell script to set the variables (reference the sample below; you can also use the OAuth token to access the REST API), then commit and push the script into TFS.
Add a PowerShell task before the "Post to Slack" task in your definition to run the PS script.
Use the variables $(commitID), $(CommitMessage) and $(commitUrl) in the "Post to Slack" task.
Note: for Git it's a commit; for TFVC it's a changeset (see the TFVC sketch after the script below).
You can use below script to set the variables:
Param(
[string]$collectionurl = "http://server:8080/tfs/DefaultCollection",
[string]$repoid = "389e8215-1fb2-4fdc-bd04-ebc8a8a4410e",
[string]$user = "username",
[string]$token = "password"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
$searchCriteria = "$" + "top=1"
$baseUrl = "$collectionurl/_apis/git/repositories/$repoid/commits?$searchCriteria"
$response = (Invoke-RestMethod -Uri $baseUrl -Method Get -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)})
#Retrieve values
$commitID = $response.value.commitID
$CommitMessage = $response.value.comment
$commitUrl = $response.value.remoteUrl
#Set variables
Write-Host "##vso[task.setvariable variable=commitID]$commitID"
Write-Host "##vso[task.setvariable variable=CommitMessage]$CommitMessage"
Write-Host "##vso[task.setvariable variable=commitUrl]$commitUrl"
UPDATE:
You can use this REST API to get the repository ID:
GET http://server:8080/tfs/DefaultCollection/{ProjectName}/_apis/git/repositories
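For example, a quick sketch to look up the ID by repository name (the project and repository name here are placeholders):
#Placeholders: replace the project and repository name with your values
$repoListUrl = "$collectionurl/ProjectName/_apis/git/repositories"
$repos = Invoke-RestMethod -Uri $repoListUrl -Method Get -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
($repos.value | Where-Object { $_.name -eq "MyRepo" }).id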
There are hundreds of builds left in our build definition indefinitely, regardless of the retention settings. I want to delete builds with scripts, which I am trying to run from a remote PC. Our TFS server is 2015.2.
tfsbuild destroy /collection:http://tfsserver:8080/tfs/ProjectCollection /dateRange:01/01/2017~31/12/2017 /buildDefinition:teamProject\Builddefintion
The output shows: No builds found for build specification, even though there are many builds that meet the criteria. Any help is appreciated. Thanks!
TFSBuild delete/destroy is only available for XAML builds, and you need to delete first, then destroy.
For vNext builds, you can try to delete them with the REST API (Delete a build):
DELETE http://server:8080/tfs/DefaultCollection/ProjectName/_apis/build/builds/{build Id}?api-version=2.0
You can use the PowerShell script below to delete all the builds that completed in the year 2017 for a specific build definition:
Param(
[string]$collectionurl = "http://server:8080/tfs/Collection",
[string]$projectName = "ProjectName",
[string]$builddefinitionID = "56",
[string]$keepForever = "true",
[string]$user = "username",
[string]$token = "password"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
#Get builds list which completed in the year 2017
$buildsUrl = "$($collectionurl)/$projectName/_apis/build/builds?definitions=$builddefinitionID&statusFilter=completed&api-version=2.0"
$builds = (Invoke-RestMethod -Uri $buildsUrl -Method Get -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}).value | where({$_.finishTime -like '*2017*'})
#Delete the builds
foreach ($build in $builds.id)
{
    $deleteurl = "$($collectionurl)/$projectName/_apis/build/builds/$build"+"?api-version=2.0"
    $result = (Invoke-RestMethod -Uri $deleteurl -Method Delete -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)})
    Write-Host "Deleted build with ID: $build"
}
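One caveat (an assumption, based on the retain-indefinitely behavior covered in the next answer): builds marked "Retain indefinitely" may be rejected by the delete call, so you can clear the keepForever flag with a PATCH before deleting:
#Assumption: clear the retention lock before deleting; $build and the auth header are the same as in the loop above
$patchUrl = "$($collectionurl)/$projectName/_apis/build/builds/$build"+"?api-version=2.0"
$patchBody = @{ keepForever = 'false' } | ConvertTo-Json
Invoke-RestMethod -Uri $patchUrl -Method Patch -Body $patchBody -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}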
I'm looking for a way to set the retain indefinitely field of a build when it completes.
Maybe using a PowerShell script as a build step?
For the benefit of searchers using vNext builds, you can use a PowerShell script to do this.
You will need to enable 'Allow scripts to access OAuth token' in the build options.
$buildId = $env:BUILD_BUILDID # the currently running build
"Build ID: $buildId"
$keepforever = @{
    keepforever='true'
}
$jsonKeepForever = $keepforever | ConvertTo-Json -Depth 100
$uriForBuildUpdate = "$($env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)$($env:SYSTEM_TEAMPROJECTID)/_apis/build/builds/" + $buildID + "?api-version=2.0"
$result = Invoke-RestMethod -Uri $uriForBuildUpdate -Method Patch -Body $jsonKeepForever -ContentType "application/json" -Headers @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
Write-Verbose "Result: $result" -Verbose
"********************************"
"Build set to retain indefinitely"
"********************************"
Check out the "Build Updating Tasks" extension. It contains a Build Retention task that does exactly what you need. You do need to be on Update 3 of TFS if I'm not mistaken.