Is there any simple automated way of finding out all the source files associated with a Delphi project?

I like to back up the source code set for a project when I release a version. I use GExperts project backups, which seems to gather up all the files in the project manager into the ZIP file. You can also add arbitrary files to this file set, but I'm always conscious of the fact that I haven't necessarily got all the files. Unless I specifically go through the uses clauses and add every unit I have source for to the project, I'll never be sure of storing all the files necessary to recreate the installable/executable.
I've thought about rolling an app to traverse a project, following all the units used, looking down all the search paths to see whether a source file is available for each unit, and building a list of files to back up that way, but hey, maybe someone has already done the work?

You should look into version control; I highly recommend it.
For example: SVN (Subversion) or CVS.
This will allow you to control revisions of all of your source. It will allow you to add or remove source files, roll back, merge, and do all the other nice things related to managing project sources.
This WILL save your a$%# one day.

You can interpret your question in two ways:
How can I make sure that I back up at least enough files to be able to build the project?
How can I make sure that I don't back up more files than are needed to build the project?
The first makes sure you can build the system at all; the second allows you to clean up unused files.
For both, a version control system combined with a separate build system is the way to go.
Then, for each new set of changes, you can use these steps to assure that both conditions hold:
1. On your daily development system, check the new revision of your source code into your version control system.
2. On your separate build system, get the latest version from your version control system.
3. Build the project on the build system; if this fails, go back to step 1 and add the missing files to your version control system from your development system.
4. Start removing files (one by one) from the project that you suspect are not needed, rebuilding after each removal.
5. When a build fails, restore that particular file from the version control system, then continue with step 4 using the next candidate.
6. When the build succeeds, you have the minimum set of files.
7. Make a difference overview of the files in your version control system versus the files on the build machine, and mark the files that are in your version control system but not on your build machine as deprecated or deleted.
Most version control systems have good ways of generating a difference between the files on your development or build system and the files in the version control system (usually fine-grained for each historic point in time at which you added/removed/updated files).
The reason you want a separate build system (or two separate development systems) is that you want them to be independent: you use one for developing, and the other for checking that the build is still OK.
This is also the first step towards extending the setup, in the future, into a continuous integration system (one that runs unit tests, automatically creates product setups, and much more).
--jeroen

I'm not sure if you're asking about version control or how to be sure you've got all the files.
One useful utility I run occasionally is a program that makes a directory listing of all the files in my DCU output folder. Changing the extensions from .dcu to .pas gives me a list of all of the source code files.
Of course it misses .inc files and other non-.pas files, but perhaps this line of thinking would be helpful to you in some way?
The value of this to me is that a second housekeeping utility then makes a list of all the .pas files in my source tree that do not have corresponding .dcu files. This (after a full compile of all programs) generally reveals the "junk" .pas files that are no longer in use.
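If you want to script that housekeeping check, here is a minimal MSBuild sketch of the same idea (the dcu\ and src\ folder names are assumptions, and it assumes sources sit directly under src\; since Delphi 2007 projects are MSBuild-based anyway, this fits into a scripted build):

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="FindJunkSources">
  <Target Name="FindJunkSources">
    <ItemGroup>
      <!-- every compiled unit, and every source file (folder names are assumptions) -->
      <Dcus Include="dcu\**\*.dcu" />
      <Sources Include="src\**\*.pas" />
      <!-- .pas files for which the last full compile produced no .dcu -->
      <Junk Include="@(Sources)" Exclude="@(Dcus->'src\%(Filename).pas')" />
    </ItemGroup>
    <Message Importance="high" Text="Possibly unused: @(Junk, '%0a')" />
  </Target>
</Project>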

For getting a list of all units compiled into an executable, you could let the compiler generate a MAP file. This file will contain entries for all the units used.
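For example, a sketch of automating that (the project name is a placeholder; -GD asks dcc32 for a detailed map, and the same setting is available in the IDE under Project Options > Linker > Map file):

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Map">
  <Target Name="Map">
    <!-- -GD = detailed MAP file; produces MyProject.map alongside the binary.
         Assumes dcc32.exe is on the PATH. -->
    <Exec Command="dcc32 -GD MyProject.dpr" />
  </Target>
</Project>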

Related

What do all the options on GetOptions mean?

The MSDN documentation lists four options, with limited explanation:
Overwrite "Overwrite existing writable files if they conflict with the downloaded files." Does this apply to all files, or just ones we've told TFS we've edited?
GetAll "Gets all files." What files does TFS not normally get?
Preview "Executes a get without modifying the disk." This one seems pretty clear.
Remap "Remaps existing items on the disk to the server items where the content and disk location are not changing." I have no idea what this means.
Overwrite: will blindly overwrite writable files that you have not pended for edit. If you have marked a file as 'writable', then you have violated the contract with TFS, and it assumes that you have done this for a good reason (e.g., modifying the file without taking a checkout because you were working offline). This will generally produce a writable conflict on the file, but if you specify this flag, the writable file will be overwritten.
This only applies to server workspaces (local workspaces are always writable). This has no effect on files that you have pended for edit. Get will always produce conflicts for files that are edited locally and updated on the server; if you want to update files that are checked out, you must undo the checkout (or resolve the conflict with TakeTheirs).
Get All: will download every file and update it, even if TFS believes that the local version is the same as the remote version and that downloading a new version would be a noop. TFS tracks every version that you have locally, as well as remotely, so this is only useful if you edit files locally without checking them out.
If you have kept them writable then, as mentioned above, this will produce a writable conflict. If you have marked them read-only again, TFS assumes that you have not made any changes and will not bother updating them when you do a get (because it believes the file contents haven't changed). If you have manually changed the file contents, specifying GetAll will update those files to the server version.
Preview: will just fire events and provide results that indicate what would be downloaded with the given parameters.
Remap: is a clever option that allows you to perform in-place branch switching (which is very common with version control systems that branch at the repository level - like Git - but somewhat complicated in TFVC).
Consider that you have mapped $/Foo/main to C:\Foo and done a get latest. If you update your working folder mappings so that $/Foo/branches/feature now points to C:\Foo, and then issue a get with Remap, the server will download only the files that differ between main and branches/feature, so it's an inexpensive way to switch your local workspace to a feature branch.
(If you're looking for an example, this functionality exists in the command-line interface and in Team Explorer Everywhere but not in Visual Studio.)

TFS MSBuild Copy Files from Network Location Into Build Directory

We are using TFS to build our solutions. We have some help files that we don't include in our projects as we don't want to grant our document writer access to the source. These files are placed in a folder on our network.
When the build kicks off we want the process to grab the files from the network location and place them into a help folder that is part of source.
I have found an activity in the XAML for the build process called CopyDirectory. I think this may work, but I'm not sure what values to place in the Destination and Source properties. After each successful build, the build output is copied out to a network location. We want to copy the files from one network location into the new build directory.
I may be approaching this the wrong way, but any help would be much appreciated.
Thanks.
First, you might want to consider having your documentation author place his documents in TFS. You can give him access to a separate folder or project without granting access to your source code. The advantages of this are:
Everything is in source control. Files dropped in a network folder are easily misplaced or corrupted, and you have no history of changes to them. The ideal for any project is that everything related to the project is captured in source control so you can lift out a complete historical version whenever one is needed.
You can map the documentation to a different local folder on your build server such that simply executing the "get" of the source code automatically copies the documentation exactly where it's needed.
The disadvantage is that you may need an extra CAL for him to be able to do this.
Another (more laborious) approach is to let him save to the network location, and have a developer check the new files into TFS periodically. If the docs aren't updated often this may be an acceptable compromise.
However, if you wish to copy the docs from the network during your build, you can use one of the MSBuild Copy tasks (as you are already aware), or you can use Exec. The Copy tasks are more complicated to use because they are often populated with filename lists generated from the outputs of other build targets, and are usually used with solution-relative pathnames. But if you're happy with DOS commands (xcopy/robocopy), then you may find it much easier just to use Exec to run an xcopy/robocopy command. You can then "develop" and test the xcopy command outside the MSBuild environment and just paste it into the MSBuild script with confidence that it will work - much easier than trialling copy settings as part of your full build process.
Exec is documented here. The example shows pretty well how to do what you want, but in your case you can probably just replace the Command attribute with the entire xcopy/robocopy command (or even the name of a batch file), so you won't need to set up the ItemGroup etc.
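For instance, a minimal sketch (the share \\server\docs\help and the Help folder are placeholder assumptions, not your actual paths); note that robocopy reports success with exit codes 0-7, which would otherwise make Exec fail the build:

<Target Name="CopyHelpFiles" BeforeTargets="Build">
  <!-- Mirror the network help folder into the solution's Help folder -->
  <Exec Command='robocopy \\server\docs\help "$(SolutionDir)Help" /MIR' IgnoreExitCode="true">
    <Output TaskParameter="ExitCode" PropertyName="RoboExit" />
  </Exec>
  <!-- robocopy exit codes of 8 or above indicate a real failure -->
  <Error Condition="'$(RoboExit)' &gt;= '8'" Text="robocopy failed with exit code $(RoboExit)" />
</Target>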

TFS Build Queries for Non .NET supported files

We are using TFS to maintain file versions for our database.
We do not have any .NET applications in our source control; only HTML and the supporting CSS files.
Is it possible, through the TFS build automation process, to create a zip package and deploy it to the drop location?
Note: we do not have any .NET projects or solutions; we only need to deploy a folder (with HTML and supporting files) in zip format.
I have read somewhere that a TFS build definition requires a .sln file in order to build.
We don't want to build or test anything; we just want to create the zip and deploy it to the drop location.
I have tried some tweaking of the build definition, but in a new build definition's Process tab it asks me for Items to Build, where my selection is restricted to .NET-supported file types only.
You will need to create a custom MSBuild project file (.proj) to perform the work that you need. You can test this file locally from the command line, and when it is ready, point the build process at it.
This is a good starting point: http://www.developerfusion.com/article/84411/customising-your-build-process-with-msbuild/
The MSBuild Community Tasks (https://github.com/loresoft/msbuildtasks) contains a Zip task which should make the job a lot easier.
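Putting those together, a minimal sketch of such a .proj (the html folder, the zip name, and the community tasks install path are assumptions):

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Package">
  <Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />
  <PropertyGroup>
    <!-- Team Build supplies OutDir; fall back to the current folder for local testing -->
    <OutDir Condition="'$(OutDir)' == ''">.\</OutDir>
  </PropertyGroup>
  <ItemGroup>
    <SiteFiles Include="html\**\*.*" />
  </ItemGroup>
  <Target Name="Package">
    <!-- Zip everything under html\ into the build output, from where it goes to the drop -->
    <Zip Files="@(SiteFiles)" WorkingDirectory="html" ZipFileName="$(OutDir)site.zip" />
  </Target>
</Project>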
Judging by your description, you are using 2010 or later. What I would do is create a custom build template that does all of what you are looking for. If you start with the default template, you would obviously want to remove all of the compile and test activities and replace them with activities that create the zip and copy it to the binaries directory (from there it will be moved to the drop). For the solution-file requirement you could do one of two things: create a fake solution file in the workspace and use that, knowing it won't be compiled; or, in the template, remove the BuildSettings argument, which is the object that contains the solution file and configurations.

Using wixlibs from another solution with TFS builds

We have installers referencing a wixlib file to get some common functionality. The wixlib is built in another solution then moved to a folder within that solution. When we try to build the installers with a TFS build, we get an error from light.exe:
light.exe: The system cannot find the file '..\..\..\Core\Common\assemblies\v1.0\Common.Wix.wixlib' with type 'Source'.
Our regular projects can reference \assemblies\v1.0, since we have some other common assemblies stored there. How do we get WiX to recognize this location during the build?
You are referencing the wixlib directly, so as far as I understand the TFS build process, it should be added to the TFS project of your solution. A TFS project shouldn't depend on the output of another, independent solution; that is bad practice at the least. And in any case, you can't guarantee that output would be generated before your project builds on the server.
As far as I remember, TFS build creates a separate folder for each build and gets the sources there, so your solutions are no longer on the same folder-hierarchy level.
One more point in favor of explicitly copying wixlibs into your installer project: versioning. This way, any bug introduced in the common library will not immediately break every project that references it, and you can gradually upgrade and test each project. Can you imagine automatically downloading a new version of some 3rd-party dll on every build? Any change in that dll would immediately break your application, even if the changes were not critical to you.
Since the shared component and the active solution are in two separate projects in TFS, the workspace must be set up so that the relative paths between references remain intact. The easiest way to do this is to set the build agent folder structure in the Workspace of your TFS build so that $(SourceDir) represents your root. However, don't change your source control folders - those stay the same.
For example, say you have the following structure:
-TFS
|-SharedComponents
||-MyComponents
|-ProjectArea
||-MyProject
You would want to have the following two items in the build Workspace:
Source Control Folder                Build Agent Folder
---------------------------------------------------------------------------
$/ProjectArea/MyProject              $(SourceDir)\ProjectArea\MyProject
$/SharedComponents/MyComponents      $(SourceDir)\SharedComponents\MyComponents
This mimics the structure in TFS in your build folder, thus allowing all relative paths to remain intact.
One more note about this configuration: since the shared components are in another location, you may want to create a solution folder in MyProject and add the components that you are using to it. This ensures they get pulled down automatically when anyone loads your project from TFS; they won't have to go back and pull down the shared components folder separately after discovering a build error.

Using WiX to generate an installer for an ASP.Net MVC website

Has anyone used WiX to generate an installer for an ASP.Net MVC website? Do you harvest files from the web project? I can't find any good examples of this being done. There doesn't seem to be a documented way to include all the right files (and only the right files) and put them in the right place.
If you add the website project as a reference in the installer project, and set harvest=True in the properties, then all the website files are captured, but there are issues:
Some files that should not be copied are included, e.g. packages.config and Web.Debug.config, and there doesn't seem to be any clear or simple way to exclude them (as per this discussion).
The website's .dll file ends up in the wrong place: in the root rather than the bin folder (as per this discussion).
However, if you do not use harvesting, you have a lot of files to reference manually (e.g. under \Content\ alone I have 58 files in 5 folders; most of that is jQuery UI), and they change from time to time, so errors and omissions could easily be missed in a hand-written WiX file list. It really should be kept in sync automatically.
I disagree with the idea that the list of files should be specified explicitly in WiX and not generated dynamically (which is what seems to be suggested at the first link; the wording isn't very clear). If I need to remove a file, I will remove it from the source control system; there is no need to do the extra work of maintaining two parallel but different catalogues (one set of files in source control, and the same files listed in WiX). There should be one version of the truth. All files in the website's source tree (with certain known exceptions that are not used at runtime, e.g. packages.config) should be included in the deployment.
For corporate reasons I don't have much choice about using WiX for this project
In our MVC 3 project we use Paraffin to harvest files for the installer. For example, you can use "-ext" to ignore files with a given extension, "regExExclude" to ignore file names matching a regular expression, etc.
Paraffin also keeps the proper structure, all your files would be in the correct folder as they appear in your project.
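If you would rather not take a dependency on Paraffin, WiX's own heat.exe can do similar dynamic harvesting. A rough sketch of hooking it into a .wixproj (the PublishDir property, component group, and output file names are assumptions; files such as packages.config can be excluded with an XSLT transform passed via -t):

<Target Name="Harvest" BeforeTargets="Build">
  <!-- dir: harvest a whole folder; -cg: name of the generated component group;
       -dr: directory the files install under; -srd: don't create a component for the root folder;
       -ag: autogenerate component GUIDs; -var: reference files via $(var.PublishDir) -->
  <Exec Command='"$(WixToolPath)\heat.exe" dir "$(PublishDir)" -cg WebSiteFiles -dr INSTALLFOLDER -srd -ag -var var.PublishDir -out WebSiteFiles.wxs' />
</Target>

The generated WebSiteFiles.wxs then needs to be added to the project's compile items so candle/light pick it up.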
I use a program that I wrote called ISWIX, which makes authoring wxs merge modules a simple drag-and-drop operation, like InstallShield. I then consume that merge module in an installer that handles the UI and IIS configuration.
I also have post-build automation that extracts the content of the MSI and compares it against what the project published. If there is a delta, I fail the build, and you have to either a) add the file to the wxs or b) remove it from the publish.
I find that the file-count churn from build to build is minimal, and that this system is not difficult to maintain. The upside is that everything remains 100% intentionally authored, and files never magically appear in or disappear from the installer unless you intended them to. Dynamic installer generation isn't worth the risk, and most people who argue that it is don't even know what those risks are.
