TFS MSBuild Copy Files from Network Location Into Build Directory

We are using TFS to build our solutions. We have some help files that we don't include in our projects as we don't want to grant our document writer access to the source. These files are placed in a folder on our network.
When the build kicks off we want the process to grab the files from the network location and place them into a help folder that is part of source.
I have found an activity in the xaml for the build process called CopyDirectory. I think this may work but I'm not sure what values to place into the Destination and Source properties. After each successful build the build is copied out to a network location. We want to copy the files from one network location into the new build directory.
I may be approaching this the wrong way, but any help would be much appreciated.
Thanks.

First, you might want to consider your documentation author placing his documents in TFS. You can give him access to a separate folder or project without granting access to your source code. The advantages of this are:
Everything is in source control. Files dropped in a network folder are easily misplaced or corrupted, and you have no history of changes to them. The ideal for any project is that everything related to the project is captured in source control so you can lift out a complete historical version whenever one is needed.
You can map the documentation to a different local folder on your build server such that simply executing the "get" of the source code automatically copies the documentation exactly where it's needed.
The disadvantage is that you may need an extra CAL for him to be able to do this.
Another (more laborious) approach is to let him save to the network location, and have a developer check the new files into TFS periodically. If the docs aren't updated often this may be an acceptable compromise.
However, if you wish to copy the docs from the network location during your build, you can use one of the MSBuild Copy tasks (as you are already aware), or you can use Exec. The Copy tasks are more complicated to use because they are often populated with filename lists generated from the outputs of other build targets, and are usually used with solution-relative pathnames. But if you're happy with DOS commands (xcopy/robocopy), you may find it much easier just to use Exec to run an xcopy/robocopy command. You can then "develop" and test the xcopy command outside the MSBuild environment and paste it into the MSBuild script with confidence that it will work - much easier than trialling copy settings as part of your full build process.
Exec is documented here. The example shows pretty well how to do what you want, but in your case you can probably just replace the Command attribute with the entire xcopy/robocopy command (or even the name of a batch file) you want to use, so you won't need to set up the ItemGroup etc.
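For illustration, a minimal sketch of such a target (the target name, share path, and Help folder below are placeholder assumptions, and BeforeTargets requires MSBuild 4.0 or later):

<Target Name="CopyHelpFiles" BeforeTargets="Build">
  <!-- /E copies subdirectories (including empty ones), /Y suppresses overwrite prompts, /I treats the destination as a folder -->
  <Exec Command="xcopy /E /Y /I &quot;\\fileserver\docs\Help&quot; &quot;$(MSBuildProjectDirectory)\Help&quot;" />
</Target>

You can run the same xcopy line in a command prompt first and only paste it into the Command attribute once it behaves the way you want.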

Related

How to list all files from the build context that affect the contents of the image?

Is it possible to list all files that get copied into the image from the build context, or affect the final contents of the image in any other way?
I need this for dependency tracking; I am sculpting a build system for a project that involves building multiple images and running containers from them in the local dev environment. I need this to be optimized for a rapid code-build-debug cycle, and therefore I need to avoid invoking docker build as much as possible. Knowing the exact set of files in the build context that end up affecting the image will allow me to specify those as tracked dependencies for the build step that invokes docker build, and avoid unnecessary rebuilds.
I don't need to have this file list generated in advance, though that is preferable. If no tool exists to generate it in advance, but there is a way to obtain it from a built image, then that's OK too; the build tool I use is capable of recording dynamic dependencies discovered by a post-build step.
Things I am acutely aware of, and despite which I have made an informed decision that pursuing this avenue is worthwhile:
I know that the number of dependencies thus tracked can be huge-ish. I believe the build tool can handle it.
I know that there are other kinds of dependencies for a docker image besides files in the build context. This is solved by also tracking those dependencies outside of docker build. Unlike files from the build context, those dependencies are either much fewer in number (e.g. files that the Dockerfile's RUN commands explicitly fetch from the internet), or the problem of obtaining an exhaustive list of such dependencies is already solved (e.g. dependencies obtained using a package manager like apt-get are modeled separately, and the installing RUNs are generated into the Dockerfile from the model).
Nothing is copied to the image unless you specifically say so. So, check your Dockerfile for COPY statements and you will know what files from the build context are added to the image.
Notice that, in the event you have a COPY . ., you might have a .dockerignore file in the build context with files you don't want to copy.
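As a minimal illustration (the base image, file, and folder names are hypothetical), only the paths named in COPY/ADD instructions enter the image, and .dockerignore trims what a broad COPY can see:

# Dockerfile
FROM node:18
WORKDIR /app
# Only these two paths from the build context end up in the image
COPY package.json ./
COPY src/ ./src/

# .dockerignore -- excluded from the context even when the Dockerfile says COPY . .
node_modules/
*.log
.git/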
I don't think what you're looking for would be useful even if it were possible. A list of all files in the previously built image wouldn't factor in new files, and it would be difficult to differentiate new files that affect the build from new files that would be ignored.
It's possible that you could parse the Dockerfile, extract every COPY and ADD command, run the current files through a hashing process to identify if they changed from the hash in the image history (you would need to match docker's hashing algorithm which includes details like file ownership and permissions), and then when that hash doesn't match you would know the build needs to run again. You could look at creating a custom buildkit syntax parser, or reuse the low level buildkit code to build your own context processor.
But before you spend too much time trying to implement the above code, realize that it already exists, as docker build. Rather than trying to avoid running a build, I'd focus on getting the build to utilize the build cache so new builds skip all unchanged steps, possibly generating the exact same image id.
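For example, a cache-friendly layer ordering (a sketch assuming a Node.js project; adapt the manifest files and install step to your stack) keeps the expensive dependency layer cached when only source files change:

FROM node:18
WORKDIR /app
# Copy the dependency manifests first: this layer, and the install below,
# are rebuilt only when the manifests themselves change
COPY package.json package-lock.json ./
RUN npm ci
# Source changes invalidate only this final layer
COPY . .

With this ordering, an unchanged step is skipped entirely on rebuild, which gets you most of the rebuild avoidance you were after without tracking the context yourself.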

TFS 2010 Build, constant drop location, random access issue

We are using TFS 2010 Build to deliver libraries on a fixed location. ( \\server\product-R0\latest )
Other team projects reference the library from this location.
In my build process I check whether the build and unit tests passed; if they did, I:
Transform web/app.config
Delete the latest folder using a "DeleteDirectory" activity
Create the latest folder using a "CreateDirectory" activity
Copy the binaries in the folder using "CopyDirectory" activity
I delete the folder first because if we rename an assembly the old one won't be deleted.
The issue is random and happens 40% of the time:
TF270002 : An error occurred copying files from
'D:\Builds\1\FooTeam\BarService\Binaries' to
'\\nas\Builds\BarService-R0\Latest'.
Details : Access to the path
'\\nas\Builds\BarService-R0\Latest\SomeFile.dll'
is denied.
If you launch the build several times it works.
I've tried the usual dumb idea of "putting sleeps between steps to see what happens", but it doesn't solve the problem; it just seems to reduce the probability of it happening.
It's as if TFS tries to copy while the directory is still being deleted, and sometimes it hangs on the directory creation step.
Anyone? Thank you!
The most elegant solution is to create a link instead of copying, something like
mklink /J D:\Drops\MyBuild_LatestGood D:\Drops\MyBuild_2014-06-13
Plus: No copy involved, same ACLs.
Caveats: this command works only locally, when the drop share is located on the build server. There are also options in the case of a NAS, as long as you are allowed to execute remote commands (e.g. via SSH).
Another option is to create a network share on the desired folder, even if the disk is remote, as long as it resides on a Windows server.
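A sketch of what the per-build repointing could look like (folder names are illustrative; note that rmdir on a junction removes only the link itself, never the target's contents):

REM Run on the drop server after a good build; no files are copied or deleted
rmdir D:\Drops\MyBuild_LatestGood
mklink /J D:\Drops\MyBuild_LatestGood D:\Drops\MyBuild_2014-06-14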

Copy files to another folder during check in (TFS Preview)

I have the following scenario: the company edits aspx/xml/xslt files and copies them manually to the servers in order to publish them. So, no build is done. For the sake of control we've decided to adopt TFS Preview, since it tracks the version, who edited, and so on. Needless to say, it works like a charm. :)
The problem is that, since we don't build the apps, we can't set up a build definition to automate the copying of files to another place, which, as I've stated before, is done manually.
My question is: Is it possible to copy the files to another place (a folder in a server or local) during the check in? If so, how? (remember, we don't build. so we can't customize the build process...)
You have two options.
1) Create a custom check-in policy. I'm not familiar enough with this process to give you any pointers, but I believe it can be done.
2) Create a custom build template, and use that for your builds. You should be able to wipe the build template down to nothing, and then only add the copy operation to it. This is probably the route I would take. Get started here.
You mention you are using TFSPreview, which is hosted in the cloud, so it won't be able to access any machines on your network unless you're prepared to open up your firewalls :).
You can copy source-controlled files around the TFS instance (say, into a source-controlled Drop folder) and then check this out after the build completes.
Start by familiarising yourself with customising the TFS Build Process.
When you're up to speed, you need to look at adding a "Copy" Activity in the Workflow to move the files to the drop folder.
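As a rough sketch, the activity in a stripped-down template could look like this (the mtbwa: prefix is the namespace alias that DefaultTemplate.xaml already declares for the build activities; the Web subfolder is a placeholder):

<!-- Copies the checked-in files to the drop location; no compile step required -->
<mtbwa:CopyDirectory DisplayName="Copy files to drop"
                     Source="[SourcesDirectory + &quot;\Web&quot;]"
                     Destination="[BuildDetail.DropLocation]" />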

Is there any simple automated way of finding out all the source files associated with a Delphi project?

I like to back up the source code set for a project when I release a version. I use GExperts project backups, which seems to gather up all the files in the project manager into the ZIP file. You can also add arbitrary files to this file set, but I'm always conscious of the fact that I haven't necessarily got all the files. Unless I specifically go through the uses clauses and add all the units I have sources for to the project, I'll never be sure of storing all the files necessary to recreate the installable/executable.
I've thought about rolling an app to traverse a project, following all the units used and looking down all the search paths and seeing if there is a source file available for that unit, and building a list of files to back up that way, but hey - maybe someone has already done the work?
You should look into version control; I highly recommend it.
e.g. SVN (subversion), CVS
This will allow you to control revisions of all of your source. It will let you add or remove source files, roll back, merge, and do all the other nice things related to managing project sources.
This WILL save your a$%# one day.
You can interpret your question in two ways:
How can I make sure that I back up at least enough files so I can build the project?
How can I make sure that I don't back up too many files, while still being able to build the project?
The first is to make sure you can build the system at all, the second to allow you to clean up unused files.
For both, a version control system including a separate build system is the way to go.
You then - for each new set of changes - can use these steps to ensure that both conditions hold:
1. On your daily development system, check the new revision of your source code into your version control system.
2. On your separate build system, get the latest version from your version control system.
3. Build the project on the build system; if this fails, go back to step 1 and add the missing files to your version control system from your development system.
4. Start removing files (one by one) from the project that you suspect are not needed, then rebuild until it fails.
5. When the build fails, restore that particular file from the version control system, then continue step 4 with the next candidate.
6. When the build succeeds, you have the minimum set of files.
7. Now make a difference overview of the files in your version control system and on the build machine.
8. Mark the files that are in your version control system but not on your build machine as deprecated or deleted.
Most version control systems have good ways of generating a difference between the files on your development or build system against the files in the version control system (usually fine grained for each historic point in time you added/removed/updated files in your version control system).
The reason you want a separate build system (or two separate development systems) is that you want them to be independent: you use one for developing, and the other for checking if the build is still OK.
This is also the first step towards extending this in the future into a continuous integration system (one that runs unit tests, automatically creates product setups, and much more).
--jeroen
I'm not sure if you're asking about version control or how to be sure you've got all the files.
One useful utility I run occasionally is a program that makes a DirList of all of the files in my dcu output folder. Changing the extensions from .dcu to .pas gives me a list of all of the source code files.
Of course it misses .inc files and other non-.pas files, but perhaps this line of thinking would be helpful to you in some way?
The value of this utility to me is that a second housekeeping utility program then makes a list of all .pas files in my source tree that do not have corresponding .dcu files. This (after a full compile of all programs) generally reveals some "junk" .pas files that are no longer in use.
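A rough batch sketch of both housekeeping checks (all paths are hypothetical; save as a .bat and run after a full compile of all programs):

@echo off
REM .pas files in the source tree with no corresponding .dcu -- likely junk
for %%F in (C:\Proj\Src\*.pas) do (
  if not exist "C:\Proj\DCU\%%~nF.dcu" echo Unused: %%~nF.pas
)
REM .dcu files with no corresponding .pas -- source you may be missing
for %%F in (C:\Proj\DCU\*.dcu) do (
  if not exist "C:\Proj\Src\%%~nF.pas" echo No source for: %%~nF
)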
For getting a list of all units compiled into an executable, you could let the compiler generate a MAP file. This file will contain entries for all the units used.
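For example, from the command line (the project name is a placeholder; -GD asks the command-line compiler for a detailed map file):

dcc32 -GD MyProject.dpr

The resulting MyProject.map then lists the units that were actually linked into the executable.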

Team Build: Publish locally using MSDeploy

I'm just getting started with the team build functionality, and I'm finding the sheer number of things required to do something pretty simple a bit overwhelming. My setup at the moment is a solution with a web app, an assembly project, and a test app. The web app has a PublishProfile set up which publishes via the file system.
I have a TFS build definition set up which currently builds the entire solution nightly and drops it onto a network share as a backup of old builds. All I want to do now is have the PublishProfile I've already setup publish the web app for me. I'm sure this is really simple but I've been playing with MSBuild commands for a full day now with no luck. Help!
Unfortunately sharing of the Publish Profile is not supported or implemented in MSBuild. The logic to publish from the profile is contained in VS itself. Fortunately the profile doesn't contain much information so there are ways to achieve what you are looking for. Our targets do not specifically support the exact same steps as followed by the publish dialog, but to achieve the same result from team build you have two choices, I will outline both here.
When you set up your Team Build definition, in order to deploy you need to pass in some values for the MSBuild Arguments of the build process.
Option 1:
Pass in the following arguments:
/p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false
Let me explain these parameters a bit, show you the result then explain the next option.
DeployOnBuild=true: This tells the project to execute the target(s) defined in the DeployTarget property.
DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder: This specifies the DeployTarget target.
PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish": This specifies the location where the package files will be written. This is the location where the files are written before they are packaged.
AutoParameterizationWebConfigConnectionStrings=false: This tells the Web Publishing Pipeline (WPP) to not parameterize the connection strings in the web.config file. If you do not specify this then your connection string values will be replaced with placeholders like $(ReplacableToken_dummyConStr-Web.config Connection String_0)
After you do this you can kick off a build then inside of the PackageTempRootDir location you will find a PackageTmp folder and this contains the content that you are looking for.
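If you want to sanity-check these arguments before wiring them into the build definition, you can pass the same properties to the web project from a developer command prompt (the project name and share below are placeholders):

msbuild MyWebApp.csproj /p:DeployOnBuild=true /p:DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder /p:PackageTempRootDir="\\server\BuildDrops\Publish" /p:AutoParameterizationWebConfigConnectionStrings=false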
Option 2:
With the previous option you probably noticed that it creates a folder named PackageTmp; if you do not want that, you can use the following arguments instead.
/p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;_PackageTempDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false
The difference here is that instead of PackageTempRootDir you pass in _PackageTempDir. The reason I don't suggest that to begin with is that MSBuild properties starting with _ signify that the property is essentially "internal", in the sense that in a future version it may mean something else or not exist at all. So use it at your own risk.
Option 3:
With all that said, you could just use the build to package your web. If you want to do this then use the following arguments.
/p:DeployOnBuild=true;DeployTarget=Package
When you do this, in the drop folder for your build you will find the _PublishedWebsites folder as you normally would; inside of that there will be a folder {ProjectName}_Package, where {ProjectName} is the name of the project. This folder will contain the package, the .cmd file, the parameters file, and a couple of others. You can use these files to deploy your web.
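For instance, the generated script can be pointed at a target server like this (the server URL and project name are placeholders; /T performs a trial run, /Y performs the actual deployment):

MyWebApp.deploy.cmd /T /M:http://targetserver/MSDeployAgentService
MyWebApp.deploy.cmd /Y /M:http://targetserver/MSDeployAgentService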
I hope that wasn't information over load.
Publishing web sites, configuring IIS, and pushing schema changes through the DEV -> QA -> RELEASE cycle has traditionally required either custom configuration to imitate publishing, or custom code where IIS settings are involved.
As of Visual Studio 2013 Update 2, Microsoft has added a third-party product that manages deployment of web sites, configuration changes, and database deployment with Windows Workflow; it would be the recommended solution for automating deployment from TFS build.
More information can be found here:
http://www.visualstudio.com/en-us/explore/release-management-vs.aspx
You can use the Publish/Deploy feature in Visual Studio 2010.
See http://www.ewaldhofman.nl/post/2010/04/12/Auto-deployment-of-my-web-application-with-Team-Build-2010-to-add-Interactive-Testing.aspx for more information
