I needed to change a work item field from PlainText -> String.
As I could not change the type of the existing field, my approach is to create a new field and update its value from the other field.
I have tried the "Bulk Edit Selected Work Items..." option from TFS Web Access, but I'm not sure whether you can reference another field's value in that template.
How can I set [Work Item].[FieldNew].Value = [Work Item].[FieldOriginal].Value?
Is this even possible without having to use the TFS API?
The reason I need to change the field type from PlainText to String is that I want a query with an operator on that column to test whether the field has a value or not.
For a PlainText field the only allowed operators are Contains/Does Not Contain. Can I override this to allow a ">"?
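For reference, once the value lives in a String-typed field, a WIQL clause along these lines should be able to test for emptiness (the project name and the field reference name here are placeholders):
SELECT [System.Id], [System.Title]
FROM WorkItems
WHERE [System.TeamProject] = 'Test Project'
  AND [Custom.FieldNew] <> ''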
KMoraz's solution does not work for me either, because the HTML field becomes read-only when exported to Excel. Therefore, I used a PowerShell script to copy the value of one field into another (just replace the "$wiFieldNewValue" variable with the source field you are copying; see the sketch after the script).
Code reference: Bulk update TFS work items using PowerShell. Embedded code:
#This script sets a specific field to a specified value for all work items in a specific project
Function LoadTfsAssemblies() {
    Add-Type -AssemblyName "Microsoft.TeamFoundation.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
    Add-Type -AssemblyName "Microsoft.TeamFoundation.WorkItemTracking.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
}

##### SETTINGS
#The TFS Team Project Collection to connect to
$tfsUri = "http://tfsserver:8080/tfs/DefaultCollection"
#The TFS Team Project from which to select the work items
$tfsProject = "Test Project"
#The work item type of the work items to update
$wiType = "Test Case"
#The reference name of the field to update
$wiFieldRefName = "Microsoft.VSTS.Common.Priority"
#The value to set the field to
$wiFieldNewValue = "1"
##### END SETTINGS

LoadTfsAssemblies

$tfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($tfsUri)
$tfs.EnsureAuthenticated()
if($tfs.HasAuthenticated)
{
    Write-Output "Successfully authenticated to TFS server [$tfsUri]"
    $workItemStore = $tfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
    $query = "SELECT [System.Id], [System.Title] FROM WorkItems WHERE [System.TeamProject] = '{0}' AND [System.WorkItemType] = '{1}'" -f $tfsProject, $wiType
    Write-Output("Using query [$query]")
    $workItems = $workItemStore.Query($query)
    Write-Output("Going to update [{0}] work items" -f $workItems.Count)
    $successCount = 0
    $failureCount = 0
    ForEach($wi in $workItems) {
        Write-Output("Updating work item [{0}]" -f $wi.Title)
        try {
            $wi.Open()
            $wi.Fields[$wiFieldRefName].Value = $wiFieldNewValue
            Write-Output("Set field [{0}] to [{1}]" -f $wiFieldRefName, $wiFieldNewValue)
            $validationMessages = $wi.Validate()
            if($wi.IsValid() -eq $true)
            {
                $wi.Save()
                Write-Output("Successfully updated work item [{0}]" -f $wi.Title)
                $successCount++
            } else {
                Write-Error("Work item is not valid!")
                ForEach($validationMessage in $validationMessages)
                {
                    Write-Error("Error: {0}" -f $validationMessage)
                }
                $failureCount++
            }
        } catch {
            Write-Error("Couldn't set field [{0}] to [{1}] for work item [{2}]" -f $wiFieldRefName,$wiFieldNewValue,$wi.Title)
            Write-Error $_
            $failureCount++
        }
    }
    Write-Output("Finished!")
    Write-Output("Successfully updated: {0}" -f $successCount)
    Write-Output("Failed to update: {0}" -f $failureCount)
} else {
    Write-Error("Couldn't authenticate to TFS server [$tfsUri]")
}
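To copy one field's value into another, as mentioned above, the settings and the assignment inside the ForEach loop would change roughly as follows. A minimal sketch, assuming hypothetical reference names for the source and target fields:
#Reference names of the source (PlainText) and target (String) fields - placeholder names, adjust to your process
$wiFieldSourceRefName = "Custom.FieldOriginal"
$wiFieldTargetRefName = "Custom.FieldNew"

#Inside the ForEach loop, copy the value across instead of assigning a constant
$wi.Open()
$wi.Fields[$wiFieldTargetRefName].Value = $wi.Fields[$wiFieldSourceRefName].Value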
It's possible via Excel.
Create a query with both old and new field columns visible.
Export the query to Excel.
Copy and paste the data from the old field column into the new field column.
In Excel, from the Team menu click Publish to update the changes in TFS.
I think this can be achieved with behaviors but I am struggling with the code.
I am trying to make "Cascading list 2" mandatory when an option is picked from "Cascading list 1"
Eg:
On "Cascading list 1" if a user picks option "A" then they have to also fill out "Cascading list 2"
If they pick option "B" on "Cascading list 1" then "Cascading list 2" is not required.
This is some of the code I was playing around with:
def fieldA = getFieldByName('BI Reporting & Analytics Request Categories') //this is cascading list 1
def fieldC = getFieldByName('Reporting') //this is the cascading list 2
def fieldAValuesThatTriggerFieldCRequired = ['Reporting'] //this is the option chosen in cascading list 1
def valueA = fieldA.value
def fieldCIsRequired = valueA in fieldAValuesThatTriggerFieldCRequired
fieldC.setRequired(fieldCIsRequired)
Any assistance is appreciated.
Thanks.
If my understanding is correct, these are not cascading fields.
The request here is that when field A has a particular value, say 'aaa', then field B becomes required (mandatory).
This is a typical use case for the Adaptavist ScriptRunner for Jira plugin's Behaviours feature.
But Behaviours is only available for the Jira Server or Data Center versions. It is not available for Jira Cloud.
If your Jira is the Server version, you can refer to the steps and script below:
Go to the Behaviours settings. If you don't have a behaviour for your workflow, please create one; if it has already been created, click into action/edit.
Choose field A and add it.
Click Add server-side script; the inline script edit section will appear.
Add the code below.
import com.onresolve.jira.groovy.user.FieldBehaviours
import com.onresolve.jira.groovy.user.FormField
import groovy.transform.BaseScript

@BaseScript FieldBehaviours fieldBehaviours

FormField field1 = getFieldById(getFieldChanged()) // we need to capture the change in field 1
FormField field2 = getFieldByName("field2")        // the field that should become required

if (field1.getValue()) { // check whether field1 has a value, as the change could be a value being deleted
    if (field1.getValue() == "A") {
        field2.setRequired(true)
    } else {
        field2.setRequired(false)
    }
} else { // if the value was deleted, remove the requirement
    field2.setRequired(false)
}
We need to merge two fields into one. In the config there is a "doneMatch" special string, and this seems to get appended to the merged field. Why is this needed, and is there a way to keep it from also being appended to the target field?
For example, I have:
src.fieldA = "City"
src.fieldB = "State"
I want to merge these two fields into target.fieldA as "City: State". However, what I end up with is "City: State##DONE##". I can change the config file so that it uses a different doneMatch, but it can't be null or empty; if I changed it to ";", the resulting field would be "City: State;". I have to have some end-of-done character/string for some reason. What is it used for? If I am syncing the fields with newer updates, is it going to detect the previous ##DONE## in target.fieldA, think it has already done the merge, and therefore not make any new changes?
Can someone send me more info on this feature?
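For reference, a minimal sketch of the merge map being described, based on the stock FieldMergeMapConfig; the source and target field reference names here are hypothetical:
"FieldMaps": [
  {
    "ObjectType": "VstsSyncMigrator.Engine.Configuration.FieldMap.FieldMergeMapConfig",
    "WorkItemTypeName": "*",
    "sourceField1": "Custom.City",
    "sourceField2": "Custom.State",
    "targetField": "Custom.CityState",
    "formatExpression": "{0}: {1}",
    "doneMatch": "##DONE##"
  }
]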
I have updated the code for v9.0.1, which changes the way the FieldMerge works. It no longer uses doneMatch; instead it requires that all three fields are different and skips the merge if it has already been done.
if (source.Fields.Contains(config.sourceField1) && source.Fields.Contains(config.sourceField2))
{
    var val1 = source.Fields[config.sourceField1].Value != null ? source.Fields[config.sourceField1].Value.ToString() : string.Empty;
    var val2 = source.Fields[config.sourceField2].Value != null ? source.Fields[config.sourceField2].Value.ToString() : string.Empty;
    var valT = target.Fields[config.targetField].Value != null ? target.Fields[config.targetField].Value.ToString() : string.Empty;
    var newValT = string.Format(config.formatExpression, val1, val2);
    if (valT.Equals(newValT))
    {
        Trace.WriteLine(string.Format(" [SKIP] field already merged {0}:{1}+{2} to {3}:{4}", source.Id, config.sourceField1, config.sourceField2, target.Id, config.targetField));
    }
    else
    {
        target.Fields[config.targetField].Value = newValT + config.doneMatch;
        Trace.WriteLine(string.Format(" [UPDATE] field merged {0}:{1}+{2} to {3}:{4}", source.Id, config.sourceField1, config.sourceField2, target.Id, config.targetField));
    }
}
https://github.com/nkdAgility/azure-devops-migration-tools/pull/529
Testing the field merge with the Azure DevOps Migration Tools, I could also reproduce this situation. The doneMatch field is required in the configuration.json file (##Done## by default).
There seems to be no way to avoid appending doneMatch to the target field. Since I am not the developer of this tool, I am not sure about the purpose of this field.
I would like to share a workaround to solve this issue.
Workaround:
You could try setting the doneMatch field to a single space ("doneMatch": " ").
For example:
"FieldMaps": [
{
"ObjectType": "VstsSyncMigrator.Engine.Configuration.FieldMap.FieldMergeMapConfig",
"WorkItemTypeName": "*",
"sourceField1": "System.Description",
"sourceField2": "Microsoft.VSTS.Common.AcceptanceCriteria",
"targetField": "System.Description",
"formatExpression": "{0} {1}",
"doneMatch": " "
}
Since the configuration file is a JSON file, you can use " " to represent a space.
is it going to detect the previous ##DONE## in the target.fieldA and think it's already done the merge, so would not make any new changes
Based on my test, the ##Done## in the target field will not affect other operations. You can still operate on this field.
Update:
The above method only works for the field type Text (multiple lines). If the field is of another type, this method doesn't work.
You could create a new field of type Text (multiple lines) and then set that new field as the target field.
e.g.
"FieldMaps": [
{
"ObjectType": "VstsSyncMigrator.Engine.Configuration.FieldMap.FieldMergeMapConfig",
"WorkItemTypeName": "*",
"sourceField1": "Custom.test1",
"sourceField2": "Custom.test2",
"targetField": "Custom.test3",
"formatExpression": "{0} {1}",
"doneMatch": " "
}
We are using Azure DevOps 2019 on-prem in our firm, and I would like to create an option-list field in our Bug work item. I want it to be a combo box whose values are the builds from all the build definitions under the project.
From checking the documentation of the system, I did not find any option for how to do this, nor whether it would be better to query the system through the API or to query the DB.
I don't think there is a built-in feature like this.
What you can do is create a string field that takes its values from a global list. First, create a global list named after the project, for example:
<GLOBALLIST name="MyProject-builds">
</GLOBALLIST>
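To surface these values on the Bug form, the new string field's definition in the work item type XML would then reference that global list, roughly like this (the field name and reference name are placeholders):
<FIELD name="Build" refname="Custom.Build" type="String">
  <!-- hypothetical field; the GLOBALLIST name must match the list created above -->
  <SUGGESTEDVALUES expanditems="true">
    <GLOBALLIST name="MyProject-builds" />
  </SUGGESTEDVALUES>
</FIELD>
SUGGESTEDVALUES keeps the field editable as a combo box; ALLOWEDVALUES could be used instead to restrict input to the list.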
Now you can use PowerShell to get the build definitions for this project, and update this globallist with the values:
Param(
    [string]$collection = "http://tfs-server:8080/tfs/collection",
    [string]$project = "MyProject",
    [string]$filePath = "C:\Globallist.xml"
)

# Get the names of all build definitions in the project via the REST API
$url = "$collection/$project/_apis/build/definitions?api-version=4.0"
$builds = (Invoke-RestMethod -Uri $url -Method Get -UseDefaultCredentials -ContentType application/json).value.name

# Export the global lists, replace the project's list items with the build names, and import the file back
witadmin exportgloballist /collection:$collection /f:$filePath
[xml]$globallist = Get-Content $filePath
$globallist.GLOBALLISTS.GLOBALLIST.Where({ $_.name -eq "$project-builds" }).LISTITEM | %{ $_.ParentNode.RemoveChild($_) | Out-Null }
$node = $globallist.GLOBALLISTS.GLOBALLIST.Where({ $_.name -eq "$project-builds" })
$builds.ForEach({
    $child = $globallist.CreateElement("LISTITEM")
    $att = $globallist.CreateAttribute("value")
    $child.Attributes.Append($att)
    $child.value = "$_"
    $node.AppendChild($child)
})
$globallist.Save($filePath)
witadmin importgloballist /collection:$collection /f:$filePath
You can set up a scheduled build that runs this script each day so the list stays up to date.
You can also improve the script to get all the projects, iterate over them, get the build definition names, and update the global list file.
I wanted to access TFS data via the TFS API and WIQL, using a custom field in the WIQL where clause:
string wiqlQueryDoorsProxy =
    "Select * from WorkItems where ([Work Item Type] = 'DoorsProxy' AND [Object Id] = '" + requirementId + "')";
where [Object Id] is the custom field. But the TFS API gave this exception message:
"TF51005: The query references a field that does not exist. The error is caused by «[Object Id]»."
The field definition has name = "Object Id" and reference name = "DoorsTool.DoorsArtifactType.ObjectId". I tried both Object Id and DoorsTool.DoorsArtifactType.ObjectId in the WIQL. Same result.
I changed the code as follows and it worked perfectly:
string wiqlQueryDoorsProxy ="Select * from WorkItems where ([Work Item Type] = 'DoorsProxy' )";
WorkItemCollection witCollectionDoorsProxy = wiStore.Query(wiqlQueryDoorsProxy);
foreach (WorkItem workItemDoorsProxy in witCollectionDoorsProxy)
{
    workItemDoorsProxy.Open();
    if (workItemDoorsProxy.Fields["Object Id"].OriginalValue.ToString() == requirementId)
    {
        ...
    }
}
But now the performance is bad.
What can I do? The problem looks similar to this, but I still cannot solve the problem based on that discussion.
The best way is to create the query that you want in Visual Studio first. Once you have it working, you can "Save As" to local disk and open it in Notepad. You can just copy the query from there.
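Once copied out of the saved .wiq file, the query filtered on the custom field would be expected to look something like this (the ID value is a placeholder, and the exact field name is whatever Visual Studio saved):
SELECT [System.Id], [System.Title]
FROM WorkItems
WHERE [System.WorkItemType] = 'DoorsProxy'
  AND [DoorsTool.DoorsArtifactType.ObjectId] = '12345'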
I use the SqlExecute activity of the Build Extensions.
I want to pass parameters in the Arguments of my activity.
But my problem is that the parameters are not injected into the SQL script. Can you help me please? Have you used the SqlExecute activity?
Parameters section in the editor, in my TFS/WF build template designer:
New String(5) {
"BackupFilePath = F:\MSSQL10.IS_CLTLIVE_DE\DUMP\ClearProdDump\CP.dmp",
"LogicalDataFile = CP_Data01",
"DataFilePath = F:\MSSQL10.IS_CLTLIVE_DE\data\",
"LogicalLogFile = CP_TLog01",
"LogFilePath = F:\MSSQL10.IS_CLTLIVE_DE\log\",
"DatabaseName = AghilasCP_Tmp"
}
My Script.sql
IF EXISTS (SELECT name FROM sys.databases WHERE name = $(DatabaseName ))
Reading the code of that activity it seems that you need to specify the parameters as
New String(5) {
"@BackupFilePath='F:\MSSQL10.IS_CLTLIVE_DE\DUMP\ClearProdDump\CP.dmp'",
"@LogicalDataFile='CP_Data01'",
"@DataFilePath='F:\MSSQL10.IS_CLTLIVE_DE\data\'",
"@LogicalLogFile='CP_TLog01'",
"@LogFilePath='F:\MSSQL10.IS_CLTLIVE_DE\log\'",
"@DatabaseName='AghilasCP_Tmp'"
}
And the statement as
IF EXISTS (SELECT name FROM sys.databases WHERE name = @DatabaseName)
Note there are no spaces around the '=' and each parameter needs a unique name. This scares me a bit; I'd personally have built this activity differently, and with quotes around the parameter value either in the parameter array or in the SQL code.