We have a project that creates and edits PDFs on the file system. When we fire a TFS build, it deletes the generated PDFs in the folder (and we need them to stay). Is there a way to tell the build to exclude the PDFs in that folder from being deleted every time we do a deployment?
I was a Google click away - sorry. The answer was in the MSBuild arguments:
/p:SkipExtraFilesOnServer=True
/p:ReplaceMatchingFiles=True
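For context, these go into the MSBuild Arguments field of the build definition (Process tab, Advanced section); invoked by hand, the equivalent command would look roughly like the following, where the project and publish profile names are placeholders:
msbuild MyWebApp.csproj /p:DeployOnBuild=true /p:PublishProfile=MyProfile /p:DeleteExistingFiles=False /p:SkipExtraFilesOnServer=True /p:ReplaceMatchingFiles=True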
We currently use TFS 2010 for controlling some of our .NET projects.
I'd like to know whether it's possible to provide some kind of versioning within TFS for a specific folder on a network share for example.
The idea is to control release packages (zips) for every release we do. As this bit is done by another person (non-technical, and he doesn't use Visual Studio or any other technical tool), it'd be great to streamline the process of versioning the zip files for every release.
Technically speaking:
We've set up this folder \\servername\releasezips, and every time I drop a file named release.zip, it would automatically commit (push) this file to the TFS server (no comments needed). Furthermore, once the file contents change (meaning that someone dropped another version of the file in there), the system would again push the file to TFS, but as the next version.
Is this possible somehow? I've seen somewhere that I could programmatically have some extra control over TFS, using the REST API.
Thanks in advance!
Thanks for your tips, Daniel and PatrickLu-MSFT.
As I said, I did want something straightforward, some kind of control similar to what box.com provides, where you associate a local folder on your machine with the cloud. So, once you drop files in the local folder, the small box client synchronizes it with the cloud. If I access box.com, I can see the different versions of that specific file.
So what I did was create a small .NET app to monitor the folder; any new files dropped in there get checked in to our TFS server using the tf command line (of course ignoring work items or comments).
So, programmatically within the .NET app, it builds up the required tf commands for processing and versioning the recently dropped files and then executes them.
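For illustration, here is a minimal PowerShell sketch of the same idea - watch the share and check in anything dropped there via tf.exe. The share path, tf.exe location, and comment are assumptions, and the watched folder must already be mapped to a TFS workspace:
# Watch the release share and check in any new zip that gets dropped there
$watcher = New-Object System.IO.FileSystemWatcher "\\servername\releasezips", "*.zip"
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent $watcher Created -Action {
    $tf   = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\tf.exe"   # adjust to your VS version
    $file = $Event.SourceEventArgs.FullPath
    & $tf add $file                                   # pend an add; use 'checkout' instead when replacing an existing file
    & $tf checkin $file /noprompt /comment:"Automated release drop"
}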
By the way, I could have used the Team Foundation API in order to do the same job, but it would demand way more effort.
Cheers
According to your description, you want to commit/check in files during the build/release pipeline.
It's not a recommended way to check in generated build files or modify source code during a build pipeline. If you really want to do this, you could edit the build workspace files and use tf commands in a custom activity, calling PowerShell, for example:
cd $env:BUILD_SOURCESDIRECTORY
# Adjust 1x.0 to your installed Visual Studio version (e.g. 12.0 or 14.0)
$TFFile = Get-Item "C:\Program Files (x86)\Microsoft Visual Studio 1x.0\Common7\IDE\TF.exe"
# ***NO_CI*** stops the check-in from triggering another CI build; $newVersion is assumed to be set earlier in the script
$tfOutput = [string]( & $TFFile.FullName checkin /noprompt /override:"***NO_CI*** New version is $newVersion." /comment:"***NO_CI*** New version is $newVersion." 2>&1)
Another way is to install the TFS Power Tools and use the Windows PowerShell cmdlets to check in the files. Refer to this link for more details: PowerShell and TFS: The Basics and Beyond
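As a rough, hypothetical sketch of that route (the snap-in name, cmdlet names, and parameters are quoted from memory and may differ between Power Tools versions, so verify them before relying on this):
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
# Pend an add for the dropped file, then check the pending change in
Add-TfsPendingChange -Add "\\servername\releasezips\release.zip"
New-TfsChangeset -Item "\\servername\releasezips\release.zip" -Comment "Automated drop check-in"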
For version the dropped files, you could take a look this similar question: TFS Build Copy to Versioned Folder
Basically, you have to customize the build definition with a custom activity and use the build.buildnumber variable to generate/create the .zip file.
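As a rough illustration of that last step, assuming the build number is exposed as the TF_BUILD_BUILDNUMBER environment variable (TFS 2013-style build scripts) and PowerShell 5+ for Compress-Archive:
# Zip the build output into a file named after the build number (paths and variable names are assumptions)
$buildNumber = $env:TF_BUILD_BUILDNUMBER
$source      = $env:TF_BUILD_BINARIESDIRECTORY
Compress-Archive -Path "$source\*" -DestinationPath "\\servername\releasezips\release_$buildNumber.zip"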
I'm currently working on a custom build process and have a problem understanding the drop folder. The process only creates some reports which we need for further development. We are using TFS/VS 2012.
What I actually need is either a folder on the build server where the reports are saved and accessible to everyone, or a drop folder on each team member's local machine. In my opinion the first solution would be better, but is it possible, since I'm the only one on my team who has access to the build server?
If I select the UNC path to the drop folder on the build server, can other people without access to this server use the "Open Drop Folder" button? And how do I even copy files to the drop folder? Is it enough to just do CopyDirectory? At the moment it always says: "This build did not produce any outputs. The drop location field is empty..."
I would be really glad if someone could help me.
The purpose of the drop folder is to copy the result of the build from the working folder of the agent to somewhere it will not be changed by a later build. The share can be on the build server or on another server. Either way, you need to control the permissions on that folder so that whoever needs access has it. Otherwise the Open Drop Folder button will not work.
You can enable copying of the build results to the drop location upon build completion. This works a bit differently depending on whether you use Build vNext or XAML builds. Since you are using TFS 2012, I guess that you are using XAML builds. The drop location is specified under the Build Defaults tab of the build definition.
The output folder from your build will be copied, so assemblies, test results, etc. will end up in the drop location when this is enabled. If there is something extra that you want included in your drop, copy it to the output folder or create it there directly.
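For example, a short script along these lines could stage the generated reports into the output folder so they end up in the drop (the Reports path and the TF_BUILD_* variables are assumptions from TFS 2013-style scripts; in a TFS 2012 XAML workflow the equivalent would be a CopyDirectory activity):
# Copy the generated reports into the binaries folder so they are included in the drop
$reports = "$env:TF_BUILD_SOURCESDIRECTORY\Reports"
Copy-Item $reports -Destination "$env:TF_BUILD_BINARIESDIRECTORY\Reports" -Recurse -Force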
I have a folder structure setup for my code like so:
MyCodeFolder
-SolutionFileOne.sln
-ProjectFolder1
-ProjectFolder2
-ProjectFolder3
-SolutionFileTwo.sln
-ProjectFolderA
-ProjectFolderB
-ProjectFolderC
-ProjectFolderCommon
Solution one contains projects 1, 2, 3, and Common; solution two contains projects A, B, C, and Common.
When I come to create my TFS builds I run into a problem. If I just map MyCodeFolder in the workspace setup, then both builds succeed, but check-ins against project 2 will kick off a build of solution two, and vice versa.
If I map only the folders the solution needs, the build fails, which I am guessing is because I haven't included a mapping to the folder where the solution file lives (MyCodeFolder).
Is there a way I can solve this issue without altering my file structure?
The continuous integration trigger in TFS builds will queue a new build any time an item within that build's workspace is altered. Workspace mappings can only contain folders - you cannot include/exclude (aka "cloak") individual files within folders.
What you can do is set up your build workspace to use the entire /MyCodeFolder folder. Then, in the build for SolutionFileOne.sln you can cloak ProjectFolderA, ProjectFolderB, and ProjectFolderC. In the build for SolutionFileTwo.sln you can cloak ProjectFolder1, ProjectFolder2, and ProjectFolder3.
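For example, the Source Settings of the SolutionFileOne.sln build might look like this (the team project name is a placeholder):
Active    $/TeamProject/MyCodeFolder                      $(SourceDir)
Cloaked   $/TeamProject/MyCodeFolder/ProjectFolderA
Cloaked   $/TeamProject/MyCodeFolder/ProjectFolderB
Cloaked   $/TeamProject/MyCodeFolder/ProjectFolderC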
This is only a partial solution. Both builds will still get kicked off when someone changes either solution file, or when anything in the ProjectFolderCommon folder is changed. Since you can't cloak the solution files themselves there's no way to avoid both builds getting kicked off on a solution file check-in without changing the structure of your files.
We have set up TFS Build for our project, but on every build the system copies the whole repository and then compiles our solution. How can we make sure TFS Build only downloads the files needed for the solution, without having to cloak each unneeded directory manually? Right now it downloads over 2 GB of data just to compile a project that is less than 100 MB in size (source files). The other data are test databases and files that are not needed for the automatic build.
EDIT:
Some further investigation led me to some keywords for searching. These posts are helping out:
Team Build - Get Workspace - get latest from specific paths, NOT everything
TFS Build and workspace
Still investigating, though. Any comments are welcome.
EDIT:
An option is to replace CreateWorkspace in the build process definition with my own extended activity. I'm hoping to find out that somebody already did. Basically, you would use the VersionControlServer object to download the necessary files instead of the whole workspace.
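A very rough sketch of that idea, driven from PowerShell against the TFS client object model (assembly paths, server URL, and folders are placeholders; a real replacement for CreateWorkspace would be a .NET custom activity, this just shows the VersionControlServer calls):
# Download only the items under the solution folder instead of getting the whole workspace
Add-Type -Path "C:\Path\To\Microsoft.TeamFoundation.Client.dll"
Add-Type -Path "C:\Path\To\Microsoft.TeamFoundation.VersionControl.Client.dll"
$tpc = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection([Uri]"http://tfsserver:8080/tfs/DefaultCollection")
$vcs = $tpc.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])
$serverRoot = '$/TeamProject/MyCodeFolder/SolutionFolder'
$localRoot  = 'C:\Builds\SolutionOnly'
$items = $vcs.GetItems($serverRoot, [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full)
foreach ($item in $items.Items) {
    if ($item.ItemType -eq 'File') {
        $local = $item.ServerItem.Replace($serverRoot, $localRoot).Replace('/', '\')
        $dir = Split-Path $local
        if (-not (Test-Path $dir)) { New-Item -ItemType Directory -Path $dir | Out-Null }
        $item.DownloadFile($local)   # fetch just this file, no workspace get
    }
}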
EDIT
There is currently no really good answer/solution to this. I gave some options and the people that responded gave some alternatives, but you can't easily change the TFS Build process to download only the data that is part of the solution instead of the whole repository. So be aware of this when you are setting up your repository.
You want to set the build definition mapping to only include the source you wish to compile. This means that you don't have to cloak anything.
Edit Build Configuration
Click on Source Settings (VS 2012), Workspace (VS 2010)
An example specific mapping would look like this:
Status    Source Control Folder                      Build Agent Folder
Active    $/Path/To/The/SolutionOrProjectFolder      $(SourceDir)\
This will make the workspace for this build be limited to the solution that you wish to build. Therefore only the files under that folder will be visible to your build activity.
If you cannot do this due to how your source control is setup, then I would suggest restructuring your folders within your Source Control.
If you have more than one build agent, you should limit the number of agents that the build definition can run against. That will stop multiple copies of the same source being downloaded onto the build machine(s).
The next part you have already answered in your question: by changing the "Clean Workspace" option in your build definition to None, the build agent will only download the changesets between the current and last build.
I have a TFS server and I often work from two places. I'd like to have a folder in which I just keep my random PSDs, mockups, etc. Maybe even text-file notes, or whatever.
The problem is, when I "Check in" a project, it only includes files that are included in the solution. Is there a simple way to have a folder always included?
For instance right now I just have a "mockups" folder in the root of my Team Project (above any individual project folders), but it's not part of any project or solution (I don't really want to publish a few megs of PSD files every time I publish my project).
You can create a solution folder in your solution and add the files to it as existing items.
(see also: what is a solution folder)
Do the PSDs and mockups have anything to do with your code?
If not, I wouldn't recommend adding the files to the solution.
I would use the Windows Explorer AddIn from TFS Power Tools.
With it you have the ability to check in/add/check out files without Visual Studio; you only need to map a folder to your source control.
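If you prefer the command line over the Explorer add-in, the same workflow can be scripted with tf.exe (server URL, workspace name, and paths are placeholders):
tf workspace /new MockupsWS /collection:http://tfsserver:8080/tfs/DefaultCollection
tf workfold /map $/TeamProject/mockups C:\work\mockups /workspace:MockupsWS
tf add C:\work\mockups\new-design.psd
tf checkin C:\work\mockups\new-design.psd /comment:"Updated mockup" /noprompt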
In the Pending Changes window you can choose whether to show all changed files from your workspace or only those from your solution.
See How to show pending changes only for the currently opened solution in Visual Studio 2010 (TFS) and not the complete list of all changes? for a screenshot.