How to force a prebuild script to run at each compile - Delphi

We currently use Delphi 2009 and Git to develop an application. We have set up a prebuild script to generate a version number and build ID using information from Git, and compile this as a resource that is included in the project. The problem is that this script doesn't run on a regular compile. This means that the other developers can end up with a discrepancy between the actual version number and the number in the resource (especially when switching branches in Git).
Since we use our software to make some critical calculations, we would like to use this build id to reproduce calculations and track down problems.
Other than trying to force my developers to frequently press Shift+F9, how can I ensure that the prebuild script gets run when necessary (ideally at each compile)?
Jason
UPDATE: It's true that the pre-build script gets run at each compile. The problem was that I expected to get a different result pressing F9 after making a tag in git, even though no code had changed.

We solved a similar issue by writing a custom IDE plugin which uses the IOTAIDENotifier50 interface, specifically its BeforeCompile method, to test some required project settings and also to generate dynamic version information (a VERSIONINFO resource). It gets called for every type of build (compile and build). We also generate a unique exe serial number and log everything, which helps us track down issues and is similar to your script. For completeness: we have only done this in Delphi 2007 and Delphi XE.
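For reference, here is a minimal sketch of such a notifier (the unit name and the commented-out RunVersionScript call are hypothetical placeholders; the ToolsAPI interfaces and registration calls are the real ones):

unit BuildStampNotifier;

interface

procedure Register;

implementation

uses
  ToolsAPI;

type
  // IDE notifier; only the IOTAIDENotifier50 BeforeCompile overload
  // is interesting here, the rest are empty stubs.
  TBuildStampNotifier = class(TNotifierObject, IOTANotifier,
    IOTAIDENotifier, IOTAIDENotifier50)
  public
    { IOTAIDENotifier }
    procedure FileNotification(NotifyCode: TOTAFileNotification;
      const FileName: string; var Cancel: Boolean);
    procedure BeforeCompile(const Project: IOTAProject;
      var Cancel: Boolean); overload;
    procedure AfterCompile(Succeeded: Boolean); overload;
    { IOTAIDENotifier50 }
    procedure BeforeCompile(const Project: IOTAProject;
      IsCodeInsight: Boolean; var Cancel: Boolean); overload;
    procedure AfterCompile(Succeeded: Boolean;
      IsCodeInsight: Boolean); overload;
  end;

var
  NotifierIndex: Integer = -1;

procedure TBuildStampNotifier.FileNotification(
  NotifyCode: TOTAFileNotification; const FileName: string;
  var Cancel: Boolean);
begin
end;

procedure TBuildStampNotifier.BeforeCompile(const Project: IOTAProject;
  var Cancel: Boolean);
begin
end;

procedure TBuildStampNotifier.AfterCompile(Succeeded: Boolean);
begin
end;

procedure TBuildStampNotifier.BeforeCompile(const Project: IOTAProject;
  IsCodeInsight: Boolean; var Cancel: Boolean);
begin
  if IsCodeInsight then
    Exit; // ignore background Code Insight compilations
  // Regenerate the VERSIONINFO resource here, e.g. by shelling out
  // to the existing git-based version script (hypothetical helper):
  // RunVersionScript(Project.FileName);
end;

procedure TBuildStampNotifier.AfterCompile(Succeeded: Boolean;
  IsCodeInsight: Boolean);
begin
end;

procedure Register;
begin
  NotifierIndex := (BorlandIDEServices as IOTAServices)
    .AddNotifier(TBuildStampNotifier.Create);
end;

initialization

finalization
  if NotifierIndex >= 0 then
    (BorlandIDEServices as IOTAServices).RemoveNotifier(NotifierIndex);

end.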

Pre-build actions do run before every compile.
You state in a comment that the actions sometimes don't run when you press F9. That makes sense because F9, or Run, only invokes a compile if source is deemed to have changed.
A BeforeCompile notifier plug-in will behave in exactly the same way. Your solution is to make sure that you compile before running, using Ctrl+F9.

Related

TFS Build custom activity requiring more assemblies than needed

I've just written the first version of a workflow activity that will run Resharper's Code Issues on the projects and parse the output to display the issues as build warnings and errors.
At first, I was going to just call Resharper's command line and parse the resulting XML manually. After fiddling with the DLLs in Resharper's SDK (through disassembly mostly), I found a way to parse the results using its own public classes, which I figured was a much more elegant and safe way to do this.
The first problem I have is that the NuGet package is absolutely huge. There are 140 MB of files in there, which to me is absurd for a single, unpartitioned package. There seems to be such heavy coupling between them that by using just a few model classes and the parser class, I have to drag a dozen or so of those DLLs along, some of which seemingly have nothing to do with the main DLLs I need. This is not a show stopper though; I'm struggling with something else now:
In the end, I managed to track down the dependencies I needed to 41 assemblies (which is, again, insane, but alas). Initially, I tried removing everything and adding the missing references one by one, but this turned out to be unreliable, still missing some indirect references, even after compiling successfully. Then, I decided to code a small console application to find all referenced assemblies in the main Resharper assemblies I used, which gave me the 41 references I mentioned. This is the code I used to find every dependency.
Since these are custom activities we are talking about, I decided to create a unit test project to validate them. Using these 41 references only, everything works correctly.
When I added the activity to the build workflow though, and pointed the build controller to the source control folder containing the required assemblies, every time I schedule a build, the process fails stating that I need one extra dll from Resharper's SDK. For example, this is the first one it asks:
Could not load file or assembly 'AsyncBridge.Net35, PublicKeyToken=b3b1c0202c0d6a87' or one of its dependencies. The system cannot find the file specified. (type FileNotFoundException)
When I add this specific assembly to the TFS folder, I get another similar error for another DLL, and this keeps going on and on.
What I wanted to know is: how can I know exactly which assemblies a workflow XAML will need in order to run correctly? My custom activity DLL has two specific CodeActivities and a XAML-only activity that uses these two. This XAML activity is what I'm directly using in the modified workflow template.
I see that besides the references in my project, the XAML activity also contains a TextExpression.ReferencesForImplementation section, with some assembly names. I've run my dependency finder program on those dependencies too, and the results are the same 41 assemblies already at the TFS folder.
Meanwhile I'll go with having the whole SDK in the custom assemblies folder, but I would really like to avoid this in the future, since it has such an enormous amount of unneeded and big DLLs in there.
First, we had a request for our command line tool to support a workflow activity, and we decided to implement just a plain MSBuild task instead, which is universal and works in TFS too. Task and targets files are included in ReSharper CLT 8.2.
Second, if you still want to implement a workflow activity, it's pretty easy to do with the new API in CLT, designed specially for custom processing of found issues - http://confluence.jetbrains.com/display/NETCOM/Custom+InspectCode+Issue+Logger.
And last, but not least, you do not need to put the binaries of the ReSharper SDK package in VCS.
Use NuGet's package restore functionality.
If you have any other questions, I'll be glad to answer them.
A custom activity is loaded and run by the .NET CLR like any other .NET program. If the stack trace reports a missing file, then it's required by the CLR, and you can't change this fact without refactoring your code.
Having the references of an entire SDK in the custom assembly folder doesn't make sense. I would prefer GAC deployment over a huge binaries folder in source control. Or maybe consider having these activities run as pre/post build scripts in MSBuild or PowerShell.

How to distinguish between compile and build with D2007?

I have successfully registered an IDE notifier (IOTAIDENotifier80) so I get AfterCompile notifications.
Is it possible to find out whether the project was built versus just being compiled/made?
I've found this answer about implementing an IOTAProjectCompileNotifier, but this is not available in D2007.
Any other way? I'd be fine with an undocumented way, as this is for an in-house expert only.
Update: I need this to replicate the "AutoInc build number" feature with an external .rc file containing the version info resource. Maybe this can be done via BuildEvents? Although I like the ability to log a message in the IDE showing the updated version number...
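For illustration, a minimal sketch of the .rc side of this (BumpBuildNumber is a hypothetical helper; it assumes the file contains a line of the form "FILEVERSION 1, 2, 3, 45" and needs SysUtils and Classes):

procedure BumpBuildNumber(const RcFile: string);
var
  Lines: TStringList;
  I, P, Build: Integer;
  S: string;
begin
  Lines := TStringList.Create;
  try
    Lines.LoadFromFile(RcFile);
    for I := 0 to Lines.Count - 1 do
    begin
      S := Trim(Lines[I]);
      // Find the FILEVERSION statement and increment its last field.
      if Pos('FILEVERSION', S) = 1 then
      begin
        P := LastDelimiter(',', S);
        Build := StrToInt(Trim(Copy(S, P + 1, MaxInt))) + 1;
        Lines[I] := Copy(S, 1, P) + ' ' + IntToStr(Build);
        Break;
      end;
    end;
    Lines.SaveToFile(RcFile);
  finally
    Lines.Free;
  end;
end;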
I have no idea about writing experts or hooking into the IDE, and I'm scared of trampolines. Having said that, I've noticed that you can tell the difference between a compile and build by monitoring the timestamps of the files in your project output folder, but this difference is only evident if no code (*.pas or *.dfm) has changed since last compile or build. In other words, when there is a code change, you can NOT tell the difference by monitoring the timestamps. However, when there is no code change since last compile or build, a compile will only change the timestamp of the exe (the dcu timestamps are not changed).
Therefore, in the absence of other more elegant solutions, and only if you are really desperate for this information (i.e., was it compiled or was it built?), I can suggest a 2-part solution for you.
Part 1. Write a process to monitor changes in timestamps in your output folder (a minimal sketch follows after Part 2), and
Part 2. Tell your fellow developers there is a bug in D2007 which complicates your build process, but that this bug is easily overcome by simply compiling twice or building twice (or if you know how to automate this, then go for your life). If you can get your developers to compile twice or build twice, then upon the second compile or build you will be able to deduce whether it was a compile or a build by testing whether the timestamps of the .dcu files changed.
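A minimal sketch of the timestamp check itself (NewestFileAge is a hypothetical helper; it uses SysUtils and assumes the .dcu files land in your output folder):

function NewestFileAge(const Folder, Mask: string): TDateTime;
var
  SR: TSearchRec;
  Stamp: TDateTime;
begin
  Result := 0;
  if FindFirst(IncludeTrailingPathDelimiter(Folder) + Mask,
    faAnyFile, SR) = 0 then
  try
    repeat
      // SR.Time is the DOS timestamp; convert and keep the newest.
      Stamp := FileDateToDateTime(SR.Time);
      if Stamp > Result then
        Result := Stamp;
    until FindNext(SR) <> 0;
  finally
    FindClose(SR);
  end;
end;

Record NewestFileAge(OutputDir, '*.dcu') before the second compile/build; if it advances afterwards, it was a build, otherwise (only the exe timestamp moved) it was a compile.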
Now, I will go and stand in the naughty corner and ask myself "why me?". Cheers.

Best practices for using SVN with Delphi Visual Component packages?

With the desire to be able to reproduce a given revision of a project that is utilizing 3rd party visual component packages, what goes in SVN and what's the best way to implement/structure the SVN repos?
For non-visual components, the rule to ensure no reliance on outside repos seems simple: "no svn:externals reference to any outside repo allowed". I have a shared repo that I control, which is the only 'svn:externals' reference allowed. This makes it easy to implement and share these types of runtime items with source code in different SVN projects. Any reference to this internal shared repo is by 'svn:externals' using a specific revision number.
Visual packages seem to run counter to easy version control, as they may have to be reinstalled at each revision. How best to create an SVN project that can be recreated later at a specific revision number... is there a recommended solution?
Previously we didn't worry about 3rd party components, as they don't change often and we never had a really good solution. I was wondering if others have figured out the best way to handle this problem, as I'm doing a spring cleaning/internal reorganization and wanted to do it 'better' than before.
Technically, the RTL/VCL source should also be in the SVN repo (in case a Delphi hotfix/service pack is released).
My solution will likely be to create a virtual machine with a particular release of the Delphi environment with all visual controls installed. As we add/update visual controls, or update Delphi with hotfixes/service packs then we create a new version of the virtual machine. We then store an image of this VM revision on a shelf somewhere. Is this what you do? Does the Delphi activation/licensing work well (or at all) in this scenario?
Thanks,
Darian
You can prepare "start IDE" (and possibly "build") scripts for your projects and maintain them in the repository as the project evolves.
Regardless of your decision about keeping components in separate repositories and using externals, or including them in a single repository with possible branching, you should also include compiled .bpl files for every component build and for every branch prepared for a specific Delphi version.
You should definitely try to keep most (if not all) paths relative; in the worst case, use environment variables to point to your root project dir.
The "start IDE" script allows you to keep each project and Delphi version environment separately configured on a single Windows installation.
It should include necessary registry keys for your project and Delphi:
Windows Registry Editor Version 5.00
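; Clear any stale package and library settings first, then register
; exactly the packages and search path this project needs.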
[-${DelphiRegKey}\Disabled Packages]
[-${DelphiRegKey}\Known Packages]
[-${DelphiRegKey}\Library]
[${DelphiRegKey}\Known Packages]
"$(BDS)\\Bin\\dclstd${CompilerVersion}.bpl"="Borland Standard Components"
"$(BDS)\\Bin\\dclie${CompilerVersion}.bpl"="Internet Explorer Components"
"$(BDS)\\Bin\\dcldb${CompilerVersion}.bpl"="Borland Database Components"
(...)
"${CustomComponentPack}"="Custom Components"
[${DelphiRegKey}\Library]
"Search Path"="${YourLibrarySourceFolder1};${YourLibrarySourceFolder2}"
(...)
You can then prepare a batch file:
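rem Import the per-project IDE settings, then start the IDE against
rem that registry key (the -r switch selects an alternate registry root).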
regedit /s project.reg
%DelphiPath%\bin\bds -rProjectRegKey Project.dpr
Where ${DelphiRegKey} is HKEY_CURRENT_USER\Software\Borland (or CodeGear in newer versions)\ProjectRegKey.
Basically, it is easiest to dump your current working configuration from the registry, strip it of unnecessary keys, change the paths to relative ones, and then adapt it to work with your project.
In such a configuration, switching between projects and their branches which have different sets of components (and/or possibly use different Delphi versions) is just a matter of checking out the repository and running the script.
Fortunately for us, we don't have to worry about a hotfix/service pack; we're still on Delphi 5. :D
Sigh, there was a time when an entire application (settings and all) would exist within a single directory - making this a non-issue. But, the world has moved on, and we have various parts of an application scattered all over the place:
registry
Windows\System
Program Files
Sometimes even User folders in "Application Data" or "Local Settings"
You are quite right to consider the impact of hotfixes/service packs. It's not only the RTL/VCL that could be affected; the compiler itself could have been slightly changed. Note also that, along the same line of thought, even when you upgrade Delphi versions, you need to build using the correct version. Admittedly this is a little easier because you can run different Delphi versions alongside each other.
However, I'm going to advise that it's probably not worth going to too much effort. Remember, working on old versions is always more expensive than working on the current version.
Ideally you want all your dev to be on main branch code; you want to minimise patch-work on older versions.
So strive to keep the majority of your users on the latest version as much as possible.
Admittedly this isn't always possible.
You wouldn't want to jump over to the 'new version' without some testing first in any case.
Certain agile processes do tend to make this easier.
By using a separate build machine or VM, you already have a measure of control.
TIP: I would also suggest that the build process automatically copy build output to a different machine, or at least a different hard-drive.
Once you're satisfied with the service pack, you can plan when you want to roll it to your build machine.
It is extremely important to keep a record of the label at which the build configuration changed. (Just in case.)
If your build scripts are also kept in source control, this happens implicitly.
When you've rolled out the hotfix/service pack, fixes to older versions should be actively discouraged.
Of course, they probably can't be eliminated, but if it's rare enough, then even manual reconfiguration could be feasible.
Instead of a VM option to keep your old configuration, you can also consider drive-imaging.
To save on the $$$ of VMWare LabManager, look for a command-line driven VM Player.
You might have to keep 2 "live" machines/VMs, but should never need more than that.
It's okay for an automatic build script to fail because the desired configuration isn't available. This will remind you to set it up manually.
Remember, working on old versions is always more expensive than working on the current version.
Third Party Packages
We went to a little bit more effort here. One of our main motivations, though, was the fact that we use about 8 third party packages, so doing something to standardise this in itself made sense. We also decided running 8 installation programs was a PITA, so we devised an easy way to manually install all required packages from source control.
Key Considerations
The build environment doesn't need any packages installed, provided the object and/or source files are accessible.
It would help if developers could fairly easily ensure they're building with the same version of third party libraries when necessary.
However, dev environments usually must install packages into the IDE.
This can sometimes cause problems with source compatibility.
For example, new properties that get written to IDE-maintained files.
Which of course brings us back to the second point.
Since third party packages are infrequently updated, they are placed within a slightly different area of source control.
But NB: they must still be referenced via relative paths.
We created the following folder structure:
...\ThirdParty\_DesignTimePackages // The actual package files only are copied here
...\ThirdParty\_RunTimePackages // As above, for any packages "required" by those above
...\ThirdParty\Suite1
...\ThirdParty\Suite2
...\ThirdParty\Suite3
As a result of this it's quite easy to configure a new environment:
Get latest version of all ThirdParty files.
Add _DesignTimePackages and _RunTimePackages to Windows Path
Open Delphi
Select Install Components
Select all packages from _DesignTimePackages.
Done!
Edit: Darian was concerned about the possibility of errors when switching versions of design packages. However, this approach avoids those kinds of problems.
By adding _DesignTimePackages and _RunTimePackages to the Windows Path, Delphi will always find required packages in the same place.
As a result, you're less likely to encounter the 'package nightmare' of incompatible versions.
Of course, if you do something silly like rebuild some of your packages and check-in the new version, you can expect problems - no matter what approach you follow.
I usually structure my repository in SVN like this:
/trunk/app1
/trunk/comp/thirdparty1
/trunk/comp/thirdparty2
/trunk/comp/thirdparty3...
I have, right in the root folder (trunk), a project group (.groupproj, or .bpg on old Delphi versions) that contains all my components (allcomponents.groupproj).
Installing on a new machine means opening that group and installing the design-time components. That's a drag on all versions of Delphi older than 2010, but 2010 and XE have a lovely feature that lets you see at a glance which components are design-time components.
I also, sometimes, will save myself the trouble of installing those components by hand by making a build.bat file and a regcomponents.bat file. The regcomponents one just runs regedit and imports the keys needed to register all those components, after build.bat has built them and everything else.
When you move up from one Delphi version to another, it's sure good to have both a batch and reg file, and a group project, to help you. Especially if you have to go through and do a lot of opening of projects/packages and saving them as MyComponent3.dpk instead of MyComponent2.dpk, or updating the package extension from 150 to 160, or whatever your packages do.

How do I run all projects in a project group?

I have a project group containing two projects that share one source folder, but do different things. What I find strange is the use of 'project group', but I don't want to turn this into a subjective question, so I'll ask you directly:
How do I run all the projects in a project group - is there a short way?
Thank you in advance!
Actually, all of you are wrong. You can debug multiple programs at the same time.
I don't know when this was first implemented, most probably when project groups were added to Delphi, but I've been using this since "forever" and I'm sure that at least Delphi 2005 was capable of doing it.
In short:
Create a project group with two programs.
Build them all! You won't be able to use the compiler after you start the debugger.
Activate the first program (double-click on its name in the Project Manager) and press F9 (run).
Activate the second program in the Project Manager (you cannot use the drop-down next to the "Run" toolbar button for that, as it will become disabled in the previous step) and press F9.
Voila! You have two programs running under the debugger. You can set breakpoints in any of them and they will work just fine.
This approach works with any number of programs. (There may be some hardcoded limitations but I've never run into them.)
The debugger can debug only one application (actually: process) at a time, and if you run from the IDE, it runs in the debugger.
So I think the answer is : you can't.
Well, I guess, unless you count DLLs that are launched in the same process but are individual projects (see the other post), but I never tried that.
To run all the projects at once, add a new batch file to your project group. Make the batch file run each program, and when you want to run them all later, simply choose the batch file in the project group and run it. This isn't the same as debugging all the projects, just running them. It's simply a way to automate the procedure given in Bruce's answer.
You can only debug one project at a time, but you can run as many as you like from the IDE without debugging.
Shift+Ctrl+F9
Update: I stand corrected. You can debug multiple projects at the same time. Excellent for debugging a client and a server at the same time.
The project group is a tool for building multiple projects.
You can select Build All from here to build them all.
You can run several from within the debugger if one is a program and the others are dlls that are used by the program.
Multiple programs can't be run/debugged at the same time.
You could use a program as a DLL, although I'm not sure whether the IDE can handle that use. In that case you would have a main procedure which is the only thing called from the main program, and you should export this main proc. Then create an additional exe project that calls all of them from different threads. I have not tested this, and it is a hack, but it could probably work.
But why do you want to run/debug several apps at the same time?
As a side note: check all dependent projects (right click on project -> Dependencies) to have them built automatically before debugging.

Ant automation of InstallAnywhere installers - console mode

My problem is to create an Ant target for automating our installer running in console mode.
The installer is created using InstallAnywhere 2008, which UniversalExtractor recognizes as a 7-zip archive. Once I have the archive unpacked, it appears that the task can use an input file to drive the console (at the very least, it appears that emitting a quit shuts everything down correctly, and output is captured).
So it looks to me as though I have all of the pieces I need for proving out this idea except a clean way to perform-self-extraction-then-stop. Searching for a command-line argument to stop the auto-execution has not produced a likely candidate, and the only suitable Ant task I've found (http://www.pharmasoft.be/7z/) isn't so clearly documented that I have a lot of confidence in it.
The completed automation is expected to work on Windows, Linux, and a small handful of other Unix environments.
What's the best practice to use here?
Since you control the installer creation, can you run the self-extraction step on your machine, package the results in a ZIP file before the installer is launched, etc., and use that instead of the single-file executable? Not very elegant, but it may work.
Also, I am a bit hesitant to blatantly promote my project :) but since it has been a while since you asked the question and nobody has answered: have you considered an alternative? Our project, InstallBuilder, allows you to install in unattended mode directly, without having to autoextract the contents. Just invoke the executable with --mode unattended, pass any additional options you may need from the command line or an external file, and you are good to go. We have a lot of ex-InstallAnywhere customers :)
