I am trying to automate creating TFS scenarios without the UI dialogs that appear during merge, rollback, resolve, etc. I have a case where I want neither AcceptYours nor AcceptTheirs. I want to accept a manual merge as if the UI had appeared and given the user the chance to edit the file and accept the merge; but it has to be automated, with no UI.
In the code below I create a 'copy' file that contains the final content I want as the manual merge for a rollback of a particular edit. As expected I get a conflict. I've tried all the tf resolve /auto: options and I can't resolve the conflict so that the content from my 'copy' file is taken as though the user edited the conflict and accepted the manual merge.
KeepYours - undoes the merge
TakeTheirs - takes the server version
AutoMerge(Forced) - doesn't resolve the conflict
OverwriteLocal - doesn't resolve the conflict
AcceptYours - takes the version on disk but changes the change type from rollback,edit to edit
How can I reproduce a manual merge using the command line tf tool only without using the UI?
Here's an example to repro:
SET WSPATH=C:\MyMappedWorkspacePath
SET F=%WSPATH%\%RANDOM%%RANDOM%
SET TF=%ProgramFiles(x86)%\Microsoft Visual Studio 11.0\Common7\IDE\TF.exe
SET TMPFILE=%TEMP%\tf-resolve-test.txt
ECHO Hello > "%F%"
"%TF%" add "%F%"
"%TF%" checkin /comment:"add file" /noprompt "%F%" > %TMPFILE%
"%TF%" checkout "%F%"
ECHO // Change 1 >> "%F%"
COPY "%F%" "%F%-copy" > nul
"%TF%" checkin /comment:"edit 1" /noprompt "%F%" > %TMPFILE%
"%TF%" checkout "%F%"
ECHO // Change 2 >> "%F%"
"%TF%" checkin /comment:"edit 2" /noprompt "%F%" > %TMPFILE%
FOR /f %i IN ('PowerShell.exe -Command "select-string -Path %TMPFILE% -Pattern 'Changeset #(?<changeset>[0-9]*) .*' | %{$_.Matches} | %{$_.Groups['changeset']} | %{$_.Value}"') DO SET ROLLBACKCHANGESET=%i
"%TF%" checkout "%F%"
ECHO // Change 3 >> "%F%"
ECHO // Change 3 >> "%F%-copy"
"%TF%" checkin /comment:"edit 3" /noprompt "%F%" > %TMPFILE%
"%TF%" rollback /changeset:%ROLLBACKCHANGESET%~%ROLLBACKCHANGESET% "%F%" /keepmergehistory /noautoresolve /noprompt
"%TF%" resolve "%F%" /auto:DeleteConflict
DEL /F "%F%"
MOVE "%F%-copy" "%F%"
ATTRIB +R "%F%"
"%TF%" rollback /changeset:%ROLLBACKCHANGESET%~%ROLLBACKCHANGESET% "%F%" /keepmergehistory /noautoresolve /noprompt
"%TF%" resolve "%F%" /auto:AutoMergeForced /noprompt
You place the edited contents on disk, in the appropriate location, then you resolve with AcceptYours.
AcceptYours means to take your contents, as they appear on disk, not the contents of the source of the conflict.
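A minimal sketch of that sequence in PowerShell, assuming $tf points at TF.exe and $file is the conflicted file from the repro above, with the manually merged content already saved as "$file-copy" (both variable names are placeholders):
# Put the manually merged result on disk, then accept it as the resolution
Copy-Item "$file-copy" $file -Force
& $tf resolve $file /auto:AcceptYours /noprompt
The trade-off, as noted in the question, is that AcceptYours records the pending change as a plain edit rather than rollback,edit.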
I would also investigate the command-line-only /discard option.
http://teamfoundation.blogspot.com/2007/03/discarding-changes-in-merge.html
The best way to do that is to use the discard option of the merge command. The option is available only through the command-line client (tf.exe) and basically performs the merge without taking any changes from the source to the target; its only purpose is to update the merge history between source and target and thus prevent the discarded changeset from appearing again in the future.
"tf.exe" merge /recursive /noprompt /discard /version:C1001~C1001 "$Source/" "$Destination/"
or
"tf.exe" merge /recursive /noprompt /discard "$Source/" "$Destination/"
Do you know how to inject the commit ID into the file version, so that every assembly would have a version like 2.0.6565.0, where 6565 relates to commit ID C6565 in TFS?
It looks like some PowerShell script is needed.
If your question is similar to your other post, TFS 2015. the $(var.SourceLocation) variable is not available at gated-check in, that is, if you want to get the ID of a changeset that hasn't been checked in yet during a gated check-in, then it's impossible in a single build.
If you don't use gated check-in, then you can use $Env:BUILD_SOURCEVERSION in a PowerShell script to set the AssemblyVersion. There is already a script at the link below that you can refer to:
https://github.com/wulfland/ScriptRepository/blob/master/TFSBuild/TFSBuild/AssemblyVersion/Set-AssemblyVersion/Set-AssemblyVersion.ps1
Finally I created my own PS script based on this post.
The idea is to update the version in all files containing assembly info:
# Strip non-digits from the changeset number (e.g. "C6565" becomes "6565")
$CommitId = ([string]$env:BUILD_SOURCEVERSION) -replace "[^0-9]+", ""
# $SourceDir is the root folder of the sources to scan
$AllVersionFiles = Get-ChildItem $SourceDir AssemblyInfo.cs -recurse
$regexToFindVersion = "Version\(""([0-9]+)\.([0-9]+).+"""
foreach ($file in $AllVersionFiles)
{
    Write-Host "Processing " $file.FullName
    # Keep major.minor, reset the build number to 0 and use the changeset number as the revision
    (Get-Content $file.FullName) |
        %{$_ -replace $regexToFindVersion, ('Version("$1.$2.0.' + $CommitId + '"') } |
        Set-Content $file.FullName -Force
}
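For example, with $CommitId equal to 6565, a line such as [assembly: AssemblyVersion("2.0.1.0")] would be rewritten to [assembly: AssemblyVersion("2.0.0.6565")].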
The full script can be found here.
The script step must be placed before the project build step.
We are working on several ASP.NET MVC C# projects within Visual Studio 2015 and Team Foundation Server 2013. Sometimes the NuGet upgrade process is a mess and some of the replaced files (mostly *.png, *.gif, *.ttf) are not checked in properly.
What we have figured out so far: the check-in process gets into trouble if directories have to be removed and created in one step; you have to check in twice. The problem is that if you don't know about this and one of our developers retrieves the latest source, files are missing. Visual Studio indicates this with a warning icon in Solution Explorer.
My question is: during a gated check-in or nightly build on TFS, is it possible to validate that every file linked in the csproj file is present? At the very least there should be a warning during the build.
Hint: this is only a problem with files that are not compiled (*.cs files) or do not have the "copy during build into output directory" setting. It happens, for example, with JS files that are bundled.
Final solution:
Write-Host "Check availability for all referenced files in all projects ..."
function Check-Files($directory, $files){
if (!$directory.EndsWith("/")) { $directory = "$($directory)/" }
ForEach($file in $files){
if($file){
Write-Host " Referenced file $($directory)$file"
if(-not (Test-Path "$($directory)$($file)")){
throw [System.IO.FileNotFoundException] "$($directory)$($file) not found."
}
}
}
}
function CheckProjectFile($csprojFile){
[xml]$projectContent = Get-Content $csprojFile
Write-Host "Checking project: $($csprojFile) ..."
$directory = Split-Path $csprojFile
ForEach($itemGroup in $projectContent.Project.ItemGroup){
Check-Files -files $itemGroup.Reference.HintPath -dir $directory
Check-Files -files $itemGroup.Compile.Include -dir $directory
Check-Files -files $itemGroup.None.Include -dir $directory
Check-Files -files $itemGroup.Content.Include -dir $directory
Check-Files -files $itemGroup.TypeScriptCompile.Include -dir $directory
Check-Files -files $itemGroup.ProjectReference.Include -dir $directory
}
}
$csprojFiles = Get-ChildItem -Path ./ -Recurse *.csproj | Select-Object -Property FullName
ForEach($file in $csprojFiles){
CheckProjectFile($file.FullName)
}
I added the script file to my Team Project on TFS, changed my build definition, added the script directory to my "Source Settings" and set the script in "Pre-build script path". Done!
I created a small piece of PowerShell that you can execute as a step before building the solution.
function Check-Files($files){
ForEach($file in $files){
if($file){
Write-Host "looking for $file"
if(-not (Test-Path $file)){
throw [System.IO.FileNotFoundException] "$file not found."
}
}
}
}
[xml]$projectContent = Get-Content ./your.csproj
ForEach($itemGroup in $projectContent.Project.ItemGroup){
Check-Files -files $itemGroup.Reference.HintPath
Check-Files -files $itemGroup.Compile.Include
Check-Files -files $itemGroup.None.Include
Check-Files -files $itemGroup.Content.Include
Check-Files -files $itemGroup.TypeScriptCompile.Include
Check-Files -files $itemGroup.ProjectReference.Include
}
Hope this helps you.
Kind regards
Jan
First of all, make sure you check in the whole solution/project every time. In this way, all the edited files in the solution/project will be listed under Included pending changes. If you only check in a single file, the other edits will be listed under Excluded changes and won't be checked in.
You can also use a check-in policy to prevent check-ins without a review. Here's an existing check-in policy that requires a code review before check-in:
https://visualstudiogallery.msdn.microsoft.com/c476b708-77a8-4065-b9d0-919ab688f078
I use the following command to merge a single changeset from the source to the target branch:
result = BatchCommand(@"tf merge /version:" + chgnumber + "~" + chgnumber + @" """ + Source + @""" """ + Target + @""" /recursive /login:" + UID + "," + PWD + "", SourceTar[2]);
BatchCommand is another method which executes the command in cmd in my workspace SourceTar[2].
In some cases I get an error saying that I need to overwrite files. How can I do this automatically (overwrite the files)?
Should I use /force for that? It will definitely resolve that overwrite conflict, but will it also resolve other conflicts (I don't want that)?
I only want to overwrite files if that error occurs; other conflicts are resolved programmatically. Any suggestion would be helpful.
You need to work with the tf resolve command to resolve conflicts. Your commands can be similar to:
tf merge $/TeamProjectRoot/Branches/Source $/TeamProjectRoot/Branches/Target
tf resolve $/TeamProjectRoot/Branches/Target /r /i /auto:TakeTheirs
The /auto:TakeTheirs option accepts the changes from the source of the merge and overwrites the changes in the target.
The /auto:KeepYours option discards the changes from the source of the merge and leaves the target unchanged.
Whilst I'm aware that there is a command-line tool to permanently delete a TFS work item (e.g. How to delete Work Item from Team Foundation Server), has anyone been able to achieve the same action programmatically using the TFS 2010 API DLLs?
Shai Raiten has blogged about this here, where he makes use of DestroyWorkItems(ids).
It is advisable that you proceed with extra caution in your implementation, since this can severely mess up your installation. One could argue that constructing such a tool deviates from best practices.
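As a rough illustration (not the blog's exact code; the collection URL and work item IDs below are placeholders), the same DestroyWorkItems call can be driven from PowerShell against the TFS 2010 client object model, assuming the Team Explorer client assemblies are installed:
# Load the TFS 2010 client object model from the GAC (installed with Team Explorer)
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.WorkItemTracking.Client") | Out-Null
# Placeholder collection URL and work item IDs
$collection = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection([Uri]"http://tfs:8080/tfs/CollectionName")
$store = $collection.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
# DestroyWorkItems permanently deletes the work items and returns any per-item errors
$errors = $store.DestroyWorkItems([int[]]@(101, 102))
if ($errors) { $errors | Out-String | Write-Warning }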
You can also use PowerShell to bulk delete work items:
Copy and paste the script below into a PowerShell file (with a .ps1 extension), update the variable values mentioned in note #4 below, and run it from a machine where the witadmin tool is installed (generally available after a Visual Studio installation). Open a PowerShell command window and execute the script.
Note: the account running the script below should have Team Foundation administrator or collection administrator access.
########TFS Work Items Bulk Destroy Automation Script##########
#Notes:
#1) This script requires file/folder paths to be set up; validate the file/folder paths before running the script
#2) Start the PowerShell window as Administrator and run the script
#3) This script requires share and admin access on the destination server; make sure your account, or the account under which the script is
#   executing, is a member of the admin group on the destination server
#4) Update following:
# 4.1: $CollectionURL
# 4.2: $WitAdmin tool location
# For VS 2015, Default location is C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE
# For VS 2013, Default location is C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE
# 4.3: $WI_List
# 4.4: $logfile
####################
$CollectionURL = "http://tfs:8080/tfs/CollectionName"
$WitAdminLocation = "C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE"
$WI_List = Get-Content "C:\WI_List.txt"
$logfile="C:\log.txt"
$ExecutionStartTime = Get-Date
$WICount = 0
"## Starting WI Destroy # $ExecutionStartTime ##"| Out-File $logfile -Append
"Collection URL: $CollectionURL" | Out-File $logfile -Append
foreach ($WIID in $WI_List)
{
CD $WitAdminLocation
.\witadmin destroywi /collection:$CollectionURL /id:$WIID /noprompt
"WI ID: $WIID Destroyed" | Out-File $logfile -Append
$WICount = $WICount + 1
Write-Host "$WICount Work Items Deleted"
}
$ExecutionEndTime = Get-Date
"## WI Destroy Command Completed # $ExecutionEndTime ##"| Out-File $logfile -Append
$TotalExecutionTime = $ExecutionEndTime - $ExecutionStartTime
"Total Work Items Deleted: $WICount" | Out-File $logfile -Append
" Total Execution Time: $TotalExecutionTime" | Out-File $logfile -Append
##End of script##
I'm maintaining quite a large legacy application. The source tree is a real mess.
I'm trying to set up a build server.
In the source tree, I have third-party components with sources (also in the project's include path). These components are also installed within the IDE.
My question is:
How should I manage those components?
I thought of managing them this way:
Install the IDE on the build server
Install all the third-party components
Remove the component sources from the project source tree (and keep them in a dedicated folder under the project root, each zipped)
Each time we need to customize (or debug) a third-party component, we re-build the package and re-install it in the IDE of the build server (and on each developer's workstation)
What's the difference between having the components installed in the IDE and having the sources in the include path? How does the linker handle that case?
We have set up our daily builds using simple command files.
Each project (.dpr) has an associated Build.cmd file.
All Build.cmd files are called from our main BuildServerRun.cmd file.
The BuildServerRun.cmd file takes care of:
Deleting the entire source tree on the build server.
Getting the latest version from our source control repository.
Calling each Build.cmd and piping the output to a file.
Mailing the results to all developers.
All paths to external components are configured in the dcc32.cfg file
..
-u"c:\Program files\Developer Express Inc\ExpressInplaceEditors\Delphi 5\Lib"
-u"c:\Program files\Developer Express Inc\ExpressQuantumGrid\Delphi 5\Lib"
..
-r"c:\Program Files\Borland\Delphi5\Lib"
-r"C:\Program Files\jvcl\jvcl\resources"
..
-i"C:\Program Files\jvcl\jvcl\run"
-i"C:\Program Files\jvcl\jcl\source"
Example of a Build.cmd.
Note: we have a policy to compile to bin\dcu, exe to bin, hence the -N, -E directives.
#echo on
dcc32speed -B -Q -W -H -Nbin\dcu -Ebin BpABA.dpr
#echo off
Example of a snipped BuildServerRun.cmd:
SET Drive=E:
:BuildServer
REM *************************************************
REM Clear files
REM *************************************************
ECHO. > "%Temp%\BuildLieven.txt"
ECHO. > "%Temp%\TestRunLieven.txt"
REM *************************************************
REM Set start time
REM *************************************************
echo | TIME | FIND "Huidige tijd" > "%Temp%\ResultLieven.txt"
REM *************************************************
REM Get latest versions
REM *************************************************
IF %LatestVersion%==1 CALL %Drive%\buildserver\latestversion.cmd
ECHO "Latest versions opgehaald" >> "%Temp%\ResultLieven.txt"
REM *************************************************
REM Build projects
REM *************************************************
CD %Drive%\Projects\
ECHO ***************************************************************** >> "%Temp%\BuildLieven.txt"
ECHO BpABA >> "%Temp%\BuildLieven.txt"
ECHO ***************************************************************** >> "%Temp%\BuildLieven.txt"
CD %Drive%\Projects\BPABA\production
ECHO Building BPABA\production
CALL Build.cmd >> "%Temp%\BuildLieven.txt"
CD %Drive%\Projects\BPABA\test
ECHO Building BPABA\test
CALL Build.cmd >> "%Temp%\BuildLieven.txt"
CD %Drive%\Projects\BPABA\test\dunit
ECHO Building BPABA\test\dunit
CALL Build.cmd >> "%Temp%\BuildLieven.txt"
ECHO BPABATests >> "%Temp%\TestRunLieven.txt"
ECHO Running BPABATests
CALL bin\BPABATests >> "%Temp%\TestRunLieven.txt"
CD %Drive%\Projects
ECHO. >> "%Temp%\BuildLieven.txt"
ECHO. >> "%Temp%\BuildLieven.txt"
ECHO. >> "%Temp%\BuildLieven.txt"
REM *****************************************************************
REM Gather (Fatal)Errors/Hints/Warnings & Failures
REM *****************************************************************
ECHO ***************************************************************** >> "%Temp%\ResultLieven.txt"
ECHO (Fatal)Errors/Hints/Warnings en Failures >> "%Temp%\ResultLieven.txt"
ECHO ***************************************************************** >> "%Temp%\ResultLieven.txt"
ECHO Fatal errors during build >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\BuildLieven.txt" | FIND /c "Fatal:" >> "%Temp%\ResultLieven.txt"
ECHO Errors during build >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\BuildLieven.txt" | FIND /c "Error:" >> "%Temp%\ResultLieven.txt"
ECHO Warnings during build >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\BuildLieven.txt" | FIND /c "Warning:" >> "%Temp%\ResultLieven.txt"
ECHO Hints during build >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\BuildLieven.txt" | FIND /c "Hint:" >> "%Temp%\ResultLieven.txt"
ECHO Failures during test >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\TestRunLieven.txt" | FIND /c "Failures:" >> "%Temp%\ResultLieven.txt"
ECHO. >> "%Temp%\ResultLieven.txt"
ECHO ***************************************************************** >> "%Temp%\ResultLieven.txt"
ECHO Controle #Projecten = #Compiles >> "%Temp%\ResultLieven.txt"
ECHO ***************************************************************** >> "%Temp%\ResultLieven.txt"
ECHO #Projecten >> "%Temp%\ResultLieven.txt"
TYPE "%Drive%\buildserver\buildserverrun.cmd" | FIND /i "cmd >> " | FIND /i "Lieven" | FIND /i /v /c "FIND /i /v /c" >> "%Temp%\ResultLieven.txt"
ECHO #Compiles >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\buildLieven.txt" | FIND /i /c "dcc32" >> "%Temp%\ResultLieven.txt"
ECHO #Tests expected to run >> "%Temp%\ResultLieven.txt"
TYPE "%Drive%\buildserver\buildserverrun.cmd" | FIND /i "TestRunLieven" | FIND /i "CALL" | FIND /i /v /c "FIND /i /v /c" >> "%Temp%\ResultLieven.txt"
ECHO #Tests actually run >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\TestRunLieven.txt" | FIND /i /c "DUnit / Testing" >> "%Temp%\ResultLieven.txt"
ECHO. >> "%Temp%\ResultLieven.txt"
ECHO. >> "%Temp%\ResultLieven.txt"
ECHO ***************************************************************** >> "%Temp%\ResultLieven.txt"
ECHO Detail (Fatal)Errors/Hints/Warnings en Failures >> "%Temp%\ResultLieven.txt"
ECHO ***************************************************************** >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\BuildLieven.txt" | FIND "Fatal:" >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\BuildLieven.txt" | FIND "Error:" >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\BuildLieven.txt" | FIND "Warning:" >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\BuildLieven.txt" | FIND "Hint:" >> "%Temp%\ResultLieven.txt"
TYPE "%Temp%\TestRunLieven.txt" | FIND "Failures:" >> "%Temp%\ResultLieven.txt"
REM *************************************************
REM Set stop time
REM *************************************************
ECHO | TIME | FIND "Huidige tijd" >> "%Temp%\ResultLieven.txt"
REM *************************************************
REM Send results
REM *************************************************
CALL %drive%\buildserver\Blat.cmd
My answer is more general than Lieven's answer, which is Delphi-specific. I wrote this one shortly after the question, but went to a co-worker before submitting ;)
I refuse to install any IDE on our main Windows build agent. Sounds like a nightmare to me. The MSBuild engine handles all build scenarios well, and other than .NET, you just need the Windows SDK installed. Or you could use NAnt and even CMake, whatever. Just don't install IDEs. It's not fun on build servers.
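As a rough sketch of what that looks like in practice (the solution path and configuration below are placeholders, and the MSBuild path assumes the .NET 4 framework build engine is present on the agent):
# Build a solution on an agent with no IDE, only the framework/SDK build tools
$msbuild = "$env:WINDIR\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
& $msbuild "C:\Build\Sources\MyApp.sln" /t:Rebuild /p:Configuration=Release /m
if ($LASTEXITCODE -ne 0) { throw "Build failed with exit code $LASTEXITCODE" }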
Now, you have tagged this as Delphi. I don't know how well it works there, but as Lieven wrote, Delphi comes with a command-line compiler. I just don't have any experience with how it handles third-party components, but I think Delphi supports MSBuild in the latest version.
I'm also unsure whether including third-party components in version control is a good thing, because of the space it takes - though you can also put them somewhere else and include them as externals, which makes it much smaller, but also imposes the problem that upgrading the components for one app will upgrade them for all - so you'd better have good integration tests. But that is the point of a build server anyway.
Apart from that, it is always a good thing to check out and have all components required to build the application available. You don't need to install components into an IDE if they were made well. Depending on what components they are, in many cases you don't even need to install them on developer machines. Many .NET components, for example, are available in the designer when you add a reference to them. And licensing is typically no more than "put the license file into the same directory". Well, that's how it should be, at least. If that's not how it works in Delphi today, that's likely one of the reasons Delphi is on its way out. Other than the Borland/Inprise/DevCo/Codegear/Embarcadero hassle.
Similar situation here; fortunately it did not start with a big mess. It is true that the real problem lies in the IDE configuration, which needs to point to the correct versions of the third-party components if you check out an older version. The only solution I have heard of is the use of different registry branches for the different product release configurations.
The components are kept in a separate directory structure, and use version numbers in the directory names when possible. This makes it easier to check out old versions and have the build scripts point to the correct version.
We have been using Apache Ant as the main build tool for years now, and it really does everything we need, including unit test invocation and InnoSetup script generation.
Compilation of packages is only necessary if you ship the executable with BPLs; otherwise it is not necessary on a build server.
Installing the components in the IDE is also not necessary on the build server.
You can use the Owly CI tool.
It allows building projects in an easy way by defining a manifest file.
It also allows handling dependencies: you can wrap third-party components in owlyci packages and mark them as dependencies of the main project.
There is an example of how to use it with the Jenkins CI system.