I'm using Cake as my build tool for TFS 2017 Update 2 and trying to implement the traditional Git Flow. In this flow there are a few automatic merges: every time changes get into master, those changes need to be propagated to the develop branch.
Using Cake I can run a PowerShell script or use LibGit2Sharp to accomplish the automatic merge for the best-case scenarios. But what about when the merge has conflicts? Do I need to fail the whole build because the merge process fails?
TFS certainly has something to deal with merges, and that is the pull request.
Question
Is there any tool or add-in for Cake that allows me to create a pull request during the execution of a build step?
I don't think there is an add-in available for creating a pull request, but since you can run PowerShell, you can use the TFS REST API to create the pull request:
https://www.visualstudio.com/en-us/docs/integrate/api/git/pull-requests/pull-requests
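For example, here is a minimal sketch of that REST call in plain C#; the collection URL, project, repository name, api-version and the use of a personal access token are placeholders/assumptions rather than details from the original answer.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

class CreatePullRequest
{
    static void Main()
    {
        // Personal access token for the account that creates the pull request.
        var pat = Environment.GetEnvironmentVariable("TFS_PAT");

        // Placeholders: collection, project and repository need to match your TFS instance.
        var url = "https://{instance}/{collection-name}/{project-name}/_apis/git/repositories/{repository-name}/pullrequests?api-version=3.0";

        var body = @"{
            ""sourceRefName"": ""refs/heads/release/1.0.0"",
            ""targetRefName"": ""refs/heads/develop"",
            ""title"": ""[Auto Merge from Release]"",
            ""description"": ""Automated merge back to develop""
        }";

        using (var client = new HttpClient())
        {
            // A PAT is sent as basic auth with an empty user name.
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            var response = client.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json")).Result;
            Console.WriteLine((int)response.StatusCode + " " + response.ReasonPhrase);
        }
    }
}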
It was recently announced that there is a VSTS CLI:
https://blogs.msdn.microsoft.com/devops/2017/11/15/introducing-the-new-cli-for-vsts/
Which includes the ability to create a pull request:
https://learn.microsoft.com/en-gb/cli/vsts/get-started?view=vsts-cli-latest#create-a-pull-request
I don't think it would be particularly hard to create a Cake addin which wraps this tool and exposes the functionality through a set of aliases.
In the meantime, you could shell out to this tool using the Process aliases that already exist in Cake, as sketched below.
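For example, a rough sketch of shelling out to the VSTS CLI from a Cake task via the StartProcess alias; the exact "vsts code pr create" flag names are recalled from the vsts-cli docs and should be checked against your installed version.
Task("Create-PullRequest")
    .Does(() =>
{
    // Shell out to the vsts CLI; assumes it is installed and already logged in on the agent.
    var exitCode = StartProcess("vsts",
        "code pr create" +
        " --source-branch release/1.0.0" +
        " --target-branch develop" +
        " --title \"[Auto Merge from Release]\"");

    if (exitCode != 0)
    {
        throw new Exception("Creating the pull request failed.");
    }
});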
Finally, I spent some time creating the package:
Nuget: https://www.nuget.org/packages/Cake.Tfs.AutoMerge
GitHub: https://github.com/mabreuortega/Cake.Tfs
For now you can use it like this:
Task("Merge")
.Does(c => {
CreateAutoMergePullRequest(
new AutoMergeSettings
{
// Required
CollectionUri = "https://{instance}/{collection-name}",
ProjectName = "project-name",
RepositoryName = "repository-name",
SourceBranch = "refs/heads/release/1.0.0",
TargetBranch = "refs/heads/develop",
Title = "[Auto Merge from Release]",
// Optional
Description = "Brief description of the changes about to get merge",
// Control
DeleteSourceBranch = false,
SquashMerge = false,
OverridePolicies = true,
AutoComplete = true,
AutoApprove = true
});
});
For any suggestions, please use the GitHub issue tracker.
Hope this helps!
Related
First off the setup in question:
A Jenkins instance with several build nodes and an on-prem Azure DevOps Server containing the Git repositories.
The repo in question is too large to build on every push for all branches and all devs, so a small workaround was put in place:
The production branches have polling enabled twice a day (because of the testing duration, which is handled downstream, more builds would not help with quality).
All other branches have their automated building suppressed. Developers can still start builds/deployments/unit tests manually if they so choose.
The Jenkinsfile is parameterized for which platforms to build; on prod* branches all platforms are true, on all other branches false.
This helps because otherwise the initial build of a feature branch would always build/deploy all platforms, which would put too much load on the server infrastructure.
I added a service endpoint for Jenkins in Azure DevOps and added a build validation .yml. This basically works: when I call the source branch of the pull request with the merge commit ID, I pass along a parameter isPullRequestBuild which contains the ID of the PR.
Snippet of the .yml:
- task: JenkinsQueueJob@2
  inputs:
    serverEndpoint: 'MyServerEndpoint'
    jobName: 'MyJob'
    isMultibranchJob: true
    captureConsole: true
    capturePipeline: true
    isParameterizedJob: true
    multibranchPipelineBranch: $(System.PullRequest.SourceBranch)
    jobParameters: |
      stepsToPerform=Build
      runUnittest=true
      pullRequestID=$(System.PullRequest.PullRequestId)
Snippet of the Jenkinsfile:
def isPullRequest = false
if ( params.pullRequestID?.trim() )
{
isPullRequest = true
//do stuff to change how the pipeline should react.
}
In the Jenkinsfile I check whether the parameter is non-empty, and if so I reset the platforms to build to basically all of them and run the unit tests.
The problem is: if the branch has never been built, Jenkins does not yet know the parameter on the first run, so it is ignored; nothing is built and the job returns 0 because "nothing had to be done".
Is there any way to only run the Jenkins build if it hasn't run already?
Or is it possible to tell from the remote call whether this was the build with ID 1?
The only other option would be to call Jenkins via the web API and check for the last successful build, but in that case I would have to store the token somewhere in source control.
Am I missing something obvious here? I don't want feature-branch builds that do nothing to be triggered more than once, because devs could lose useful information about the builds/deployments they started.
Any ideas appreciated
To whom it may concern with similar problems:
In the end I used the following workaround:
The Jenkins endpoint is called via a user that is only used for automated builds. So, in case this user triggered the build, I set everything up to run a pull request validation, even if it is the first build. Along the lines of:
def causes = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
if (causes != null)
{
    def buildCauses = readJSON text: causes.toString()
    buildCauses.each { buildCause ->
        if (buildCause['userId'] == "theNameOfMyBuildUser")
        {
            triggeredByAzureDevops = true
        }
    }
}
getBuildCauses must be approved by a Jenkins admin (in-process script approval) for this to work.
We are using the BuildHttpClient to programmatically create a copy of a build definition, update the variables in memory, and then save the updated object as a new definition.
I'm using Microsoft.TeamFoundation.Build2.WebApi.BuildHttpClient 16.141. The TFS version is 2017 Update 3 (REST API 3.x).
This is a similar question to https://serverfault.com/questions/799607/tfs-buildhttpclient-updatedefinition-c-example but I'm trying to stay within using the BuildHttpClient libraries and not go directly to the RestAPIs.
The problem is that the Steps list is always null, along with other properties, even though they are present in the build definition.
UPDATE: Posted as an answer below.
After looking at Daniel Frost's attempt below, we started looking at older versions of the NuGet package. Surprisingly, the supported version 15.131.1 does not support this, but we found out that version="15.112.0-preview" does.
After rolling back all of our DLLs to match that version, the steps were cloned when saving the new copy of the build.
All of the code examples we used work when you are using this package. We were unable to get Daniel's example working, but the version of the DLL was the issue.
We need to create a GitHub issue and report it to MS.
First Attempt - GetDefinitionAsync:
VssConnection connection = new VssConnection(DefinitionTypesDTO.serverUrl, new VssCredentials());
BuildHttpClient bdClient = connection.GetClient<BuildHttpClient>();
Task<BuildDefinition> resultDef = bdClient.GetDefinitionAsync(DefinitionTypesDTO.teamProjectName, buildID);
resultDef.Wait();
BuildDefinition updatedDefinition = UpdateBuildDefinitionValues(resultDef.Result, dr, defName);
updatedTask = bdClient.CreateDefinitionAsync(updatedDefinition, DefinitionTypesDTO.teamProjectName);
The update works on the variables and we can save the updated definition back to TFS but there are not any tasks in the newly created build definition. When we look at the object that is returned from GetDefinitionAsync we see that the Steps list is empty. It looks like GetDefinitionAsync just doesn't get the full object.
Second Attempt - Specific Revision:
int rev = 9;
Task<BuildDefinition> resultDef = bdClient.GetDefinitionAsync(DefinitionTypesDTO.teamProjectName, buildID, revision: rev);
resultDef.Wait();
BuildDefinition updatedDefinition = UpdateBuildDefinitionValues(resultDef.Result, dr, defName);
Based on SteveSims' post, we thought we were not getting the correct revision, so we added the revision to the request. I see the same issue with the correct revision. As in SteveSims' post, I can open the definition URL in a browser and see that the tasks are in the JSON, but the BuildDefinition object is not populated with them.
Third Attempt - GetFullDefinition:
So then I thought to try GetFullDefinition; maybe that's what "Full" means. Of course, without any documentation on these libraries, I have no idea.
var task2 = bdClient.GetFullDefinitionsAsync(DefinitionTypesDTO.teamProjectName, "MyBuildDefName","$/","TfsVersionControl");
task2.Wait();
Still no luck, the Steps list is always null even though we have steps in the build definition.
Fourth Attempt - Save As Template
var task2 = bdClient.GetTemplateAsync(DefinitionTypesDTO.teamProjectName, "1_Batch_Dev");
task2.Wait();
I tried saving the build definition off as a template (in the web UI I chose "Save as Template"), but there were still no steps.
Fifth Attempt: Using the URL as mentioned in SteveSims post:
Finally I said OK, I'll try the solution SteveSims used: using WebClient to get the object from the URL.
var client = new WebClient();
client.UseDefaultCredentials = true;
var json = client.DownloadString(LastDefinitionUrl);
//Convert the JSON to an actual builddefinition
BuildDefinition result = JsonConvert.DeserializeObject<BuildDefinition>(json);
This also didn't work: the build definition steps are null. Even though I can see the steps when looking at the JSON (var json), the object is not loaded with them.
I've seen this post, which seems to add the Steps to the base definition. I've tried it, but honestly I'm having trouble understanding how he modified the BuildDefinition object when referencing it via NuGet:
https://dennisdel.com/blog/getting-build-steps-with-visual-studio-team-services-.net-api/
After looking at Daniel Frost's attempt below, we started looking at older versions of the NuGet package. Surprisingly, the supported version 15.131.1 does not support this, but we found out that version="15.112.0-preview" does.
After rolling back all of our DLLs to match that version, the steps were cloned when saving the new copy of the build.
All of the code examples above work when you are using this package. We were unable to get Daniel's example working, but we didn't try hard as we had working code.
We need to create a GitHub issue for this.
Found this in my code, which works.
Use this package, not sure if it could have an impact (joke).
...packages\Microsoft.TeamFoundationServer.Client.15.112.1\lib\net45\Microsoft.TeamFoundation.Build2.WebApi.dll
private Microsoft.TeamFoundation.Build.WebApi.BuildDefinition GetBuildDefinition(string projectName, string buildDefinitionName)
{
    var buildDefinitionReferences = _buildHttpClient.GetFullDefinitionsAsync(projectName, "*", null, null, DefinitionQueryOrder.DefinitionNameAscending, top: 1000).Result;

    return buildDefinitionReferences.SingleOrDefault(x => x.Name == buildDefinitionName && x.DefinitionQuality != DefinitionQuality.Draft);
}
With the newer clients Steps will always be empty. In newer api-versions (which are used by the newer clients) the steps have moved to Phases. If you use GetDefinitions or GetFullDefinitions and look in
definition.Process.Phases[0].Steps
you'll find them. (GetDefinitions gets shallow references so the process won't be included.)
The Steps collection still exists for compatibility reasons (we don't want apps to crash with stuff like MethodNotFoundExceptions) but it won't be populated.
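A minimal sketch of reading the steps through the phases; the DesignerProcess type as the concrete process for classic (designer) definitions is my assumption about the newer client packages, and if your package version doesn't expose it, the dynamic approach in the next answer achieves the same thing.
using System;
using Microsoft.TeamFoundation.Build.WebApi;

// Sketch: read the steps from the first phase of a classic (designer) definition.
// DesignerProcess is assumed to be the concrete type of definition.Process here.
static void PrintSteps(BuildHttpClient buildClient, string teamProject, int definitionId)
{
    BuildDefinition definition = buildClient.GetDefinitionAsync(teamProject, definitionId).Result;

    var designerProcess = definition.Process as DesignerProcess;
    if (designerProcess != null)
    {
        foreach (var step in designerProcess.Phases[0].Steps)
        {
            Console.WriteLine(step.DisplayName);
        }
    }
}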
I was having this problem: I was able to get Phases[0] information at runtime, but could not get it at design time. I solved the problem using the dynamic type.
dynamic process = buildDefTemplate.Process;
foreach (BuildDefinitionStep tempStep in process.Phases[0].Steps)
{
// do some work here
}
Now it is working!
With Microsoft.TeamFoundationServer.Client version 16.170.0, I can get the build steps through process.Phases[0].Steps only with process and step being dynamic, as whitecore stated above:
var definitions = buildClient.GetFullDefinitionsAsync(project: project.Name);

foreach (var definition in definitions.Result)
{
    Console.WriteLine(string.Format("\n {0} - {1}:", definition.Id, definition.Name));

    dynamic process = definition.Process;
    foreach (dynamic step in process.Phases[0].Steps)
    {
        Console.WriteLine(step.DisplayName);
    }
}
TFS 2012
VS 2012
I have a large number of build definitions that are derived from a single template.
I also have a Master Build to queue all those builds and pass arguments if I need to do so.
That part was done using TFS Community Extension: QueueBuilds.
Problem:
Is there a way to access the build definitions, loop through them (I can get up to here by myself), change their ProcessParameters, and save them?
You can use the Microsoft.TeamFoundation.Build.Client and related assemblies to update the process parameters via PowerShell or C#. The one tricky part is the parameters are stored in XML so you have to deserialize, make your changes, then serialize again to set them.
This likely won't run without some tweaking but here are some snippets from one of my scripts that will help:
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Build.Client')
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Build.Workflow')

$projectCollection = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($TFSUri)
$buildServer = $projectCollection.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])
$buildDefinition = $buildServer.GetBuildDefinition($TeamProjectName, $BuildName);
...
# The parameters are stored as XML, so deserialize them into a dictionary first
$parameters = [Microsoft.TeamFoundation.Build.Workflow.WorkflowHelpers]::DeserializeProcessParameters($buildDefinition.ProcessParameters)
$msBuildArguments = $parameters.Get_Item("MSBuildArguments")
$msBuildArguments = "$msBuildArguments /p:ImportParametersFilesOverride=true"
$parameters.Set_Item("MSBuildArguments", $msBuildArguments)
$parameters.Add("GetVersion", "c$TFSChangeSetNumber")
# Serialize the modified parameters back onto the definition and save it
$buildDefinition.ProcessParameters = [Microsoft.TeamFoundation.Build.Workflow.WorkflowHelpers]::SerializeProcessParameters($parameters)
$buildDefinition.Save()
When you set up a Jenkins job various test result plugins will show regressions if the latest build is worse than the previous one.
We have many jobs for many projects on our Jenkins and we wanted to avoid having a 'job per branch' set up. So currently we are using a parameterized build to build eg different development branches using a single job.
But that means when I build a new branch any regressions are measured against the previous build, which may be for a different branch. What I really want is to measure regressions in a feature branch against the latest build of the master branch.
I thought we should probably set up a separate 'master' build alongside the parameterized 'branches' build. But I still can't see how I would compare results between jobs. Is there any plugin that can help?
UPDATE
I have started experimenting in the Script Console to see if I could write a post-build script... I have managed to get the latest build of the master branch in my parameterized job... I can't work out how to get to the test results from the build object, though.
The data I need is available in JSON at
http://<jenkins server>/job/<job name>/<build number>/testReport/api/json?pretty=true
...if I could just get at this data structure it would be great!
I tried using JsonSlurper to load the json via HTTP but I get 403, I guess because my script has no auth session.
I guess I could load the xml test results from disk and parse them in my script, it just seems a bit stupid when Jenkins has already done this.
I eventually managed to achieve everything I wanted using a Groovy script in the Groovy Postbuild Plugin.
I did a lot of exploring using the script console (http://<jenkins>/script); the Jenkins API class docs are also handy.
Everyone's use case is going to be a bit different, as you have to dig down into the build plugins to get the info you need, but here are some bits of my code which may help.
First get the build you want:
def getProject(projectName) {
// in a postbuild action use `manager.hudson`
// in the script web console use `Jenkins.instance`
def project = manager.hudson.getItemByFullName(projectName)
if (!project) {
throw new RuntimeException("Project not found: $projectName")
}
project
}
// CloudBees folder plugin is supported, you can use natural paths:
project = getProject('MyFolder/TestJob')
build = project.getLastCompletedBuild()
The main test results (jUnit etc) seem to be available directly on the build as:
result = build.getTestResultAction()
// eg
failedTestNames = result.getFailedTests().collect{ test ->
test.getFullName()
}
To get more specialised results from, e.g., the Violations plugin or Cobertura code coverage, you have to look for a specific build action.
// have a look what's available:
build.getActions()
You'll see a list of stuff like:
[hudson.plugins.git.GitTagAction@2b4b8a1c,
hudson.scm.SCMRevisionState$None@40d6dce2,
hudson.tasks.junit.TestResultAction@39c99826,
jenkins.plugins.show_build_parameters.ShowParametersBuildAction@4291d1a5]
These are instances; the part in front of the @ sign is the class name, so I used that to write this method for getting a specific action:
def final VIOLATIONS_ACTION = hudson.plugins.violations.ViolationsBuildAction
def final COVERAGE_ACTION = hudson.plugins.cobertura.CoberturaBuildAction
def getAction(build, actionCls) {
def action = build.getActions().findResult { act ->
actionCls.isInstance(act) ? act : null
}
if (!action) {
throw new RuntimeException("Action not found in ${build.getFullDisplayName()}: ${actionCls.getSimpleName()}")
}
action
}
violations = getAction(build, VIOLATIONS_ACTION)
// you have to explore a bit more to find what you're interested in:
pylint_count = violations?.getReport()?.getViolations()?."pylint"
coverage = getAction(build, COVERAGE_ACTION)?.getResults()
// if you println it looks like a map but it's really an Enum of Ratio objects
// convert to something nicer to work with:
coverage_map = coverage.collectEntries { key, val -> [key.name(), val.getPercentageFloat()] }
With these building blocks I was able to put together a post-build script which compared the results of two 'unrelated' build jobs, then used the Groovy Postbuild plugin's helper methods to set the build status.
Hope this helps someone else.
The product I work on comprises 3-4 separate (non-dependent) TFS builds.
I would like to create a single TFS build which queues the other 3-4 builds from within the process template AND, critically, passes process parameters to them. This build would wait for them all to complete and return an overall success/failure.
So my questions are:
Can this be achieved by any existing 'standard' Workflow activities (my manager has had bad experiences with custom workflow activities)?
If not, I am able to 'shell out' to PowerShell. Can I achieve what I want from within PowerShell (accessing the API)?
Maybe using TFSBuild.exe? But I can't find a way of passing the custom process parameters I need.
Any assistance or guidance would be appreciated.
UPDATE
The following PowerShell script will queue the build, but I'm still at a loss as to how to pass my custom process parameters :-(
function Get-BuildServer
{
    param($serverName = $(throw 'please specify a TFS server name'))

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Client")
    $tfs = [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory]::GetServer($serverName)
    return $tfs.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])
}
$buildserver = Get-BuildServer "http://tfsserver:8080/tfs/My%20Project%20Collection"
$teamProject = "ESI"
$buildDefinition = "iPrl_BuildMaster"
$definition = $buildserver.GetBuildDefinition($teamProject, $buildDefinition)
$request = $definition.CreateBuildRequest()
$buildserver.QueueBuild($request, "None")
Now, after googling, I have found the following C# code to update the verbosity and, assuming it's the same for my custom process parameters, I need to convert it to work with the above PowerShell script. Any ideas?
IDictionary<String, Object> paramValues = WorkflowHelpers.DeserializeProcessParameters(processParameters);
paramValues[ProcessParameterMetadata.StandardParameterNames.Verbosity] = buildVerbosity;
return WorkflowHelpers.SerializeProcessParameters(paramValues);