Could Free Pascal benefit from something like Apache Maven?

Apache Maven is a very popular build and dependency management tool in the Java open source ecosystem. I did some tests to find out if it can handle compiled Free Pascal / Delphi units, and found that easy to implement. So it would be possible to:
release open source libraries precompiled for Free Pascal (or Delphi) in a public Maven repository
include metadata in this repository containing dependency information
use Maven on the command line to download an open source library from the public repository and automatically resolve all its dependencies
use local repositories, working as proxies, to cache frequently used binaries
rely on automatic checksum generation and verification (provided by Maven) to reduce the risk of downloading corrupted binaries
provide source code and even documentation files along with the binaries
provide binaries with or without debug information
use continuous integration servers like Hudson, TeamCity or CruiseControl to build projects whenever changes are committed to source control, and notify developers about build errors
This way of dependency management could be very beneficial for open source projects which use many third party libraries with complex dependencies. It would avoid the typical conflicts caused by using the wrong versions.
For the developer, the workflow for editing and building a project would be reduced to a minimum:
check out the project source from the internal version control system
edit source file(s)
run mvn package to automatically download all required third party libraries (precompiled units) if they are not yet in the workstation's local repository
compile and run
The only additional file required by Apache Maven in the project folder is the pom.xml file containing the project information.
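As a rough illustration, a pom.xml describing a precompiled Free Pascal library might look like the sketch below. This is hypothetical: the coordinates are invented, and the zip packaging would need a custom Maven packaging plugin, since no official FPC packaging type exists.

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <!-- hypothetical coordinates for a precompiled FPC unit library -->
  <groupId>org.example.fpc</groupId>
  <artifactId>my-package</artifactId>
  <version>0.7.6-1</version>
  <!-- an archive of compiled .ppu/.o files; would need a custom packaging plugin -->
  <packaging>zip</packaging>
  <dependencies>
    <dependency>
      <groupId>org.example.fpc</groupId>
      <artifactId>rtl</artifactId>
      <version>2.4.0</version>
    </dependency>
  </dependencies>
</project>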
Edit: while Maven is usable for some of the required tasks, implementing a Maven-like solution in native Free Pascal would have some advantages: no Java SDK required, support for all development platforms where Free Pascal is available, and maintenance and plugin development in Pascal.
A Maven-like tool would not be helpful for open source projects only: commercial projects could access and use the artifacts in public Maven repositories in the same way.
Maven features are listed at http://maven.apache.org/maven-features.html
Update:
one use case could be the build of Lazarus, where Maven would download all required libraries and invoke the compiler with the necessary build path arguments. Changes in the dependencies on lower levels would be propagated automatically up to the parent build.
Possible benefits:
less time needed to set up a new workstation, no manual installation of third party libraries required
fewer errors caused by wrong library versions, and detection of version conflicts (for example if two libraries depend on different versions of a third library)
artifacts which are created in-house can be added to the local Maven repository and shared between developers and projects, giving central storage of all artifacts with metadata
builds are reproducible, just by using the same source and project metadata file (pom.xml)
can reduce development time and increase project stability
Update #2: FPMake
the FPMake build system for Free Pascal seems to be a tool with much potential; in many details it is quite similar to Maven:
FPMake is a Pascal-based build system developed for and distributed with FPC
FPMake standardizes the build by defining conventions such as standard directories
the command fppkg <packagename> will look in a database for the package, extract it, and then compile fpmake.pp and run it
it has standard build targets (clean, build, install, ...)
it can create a 'manifest' file suitable for import into a repository (like mvn deploy or mvn install); the manifest is an XML file which looks very similar to a pom.xml in Maven:
FPMake manifest file:
<packages>
  <package name="my-package">
    <version major="0" minor="7" micro="6" build="1"/>
    <filename>my-package-0.7.6-1.zip</filename>
    <author>my name</author>
    <license>GPL</license>
    <homepageurl>http://www.freepascal.org/</homepageurl>
    <email>myname@freepascal.org</email>
    <description>this is the package description</description>
    <dependencies>
      <dependency>
        <package packagename="rtl"/>
      </dependency>
    </dependencies>
  </package>
</packages>
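For comparison, a minimal fpmake.pp that could produce such a package might look like this (a sketch using the fpmkunit API; the unit name is illustrative):

program fpmake;

{$mode objfpc}{$H+}

uses
  fpmkunit;

var
  P: TPackage;

begin
  with Installer do
  begin
    P := AddPackage('my-package');
    P.Version := '0.7.6-1';
    P.Author := 'my name';
    P.License := 'GPL';
    P.Email := 'myname@freepascal.org';
    P.Description := 'this is the package description';
    P.Dependencies.Add('rtl');
    P.Targets.AddUnit('mypackage.pp'); { illustrative unit name }
    Run;
  end;
end.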

Free Pascal has been working on a package system of its own, called fppkg, in a style somewhere between apt-get and FreeBSD ports (download source, build and install automatically).
However, work has stalled. People investing time are the bottleneck, not people wanting to choose tools.
As far as Maven goes, I don't like auxiliary tools that need the installation of huge external runtimes. That might be fine for a big major app (like Open Office), but not for a utility.
I also prefer a tool that is designed to the FPC reality and workflow.
Documentation tools, build tools, download systems and test suite systems are already all there; it just needs a person who dedicates a lot of time to make it happen.
Some typical problems when introducing a new technology in a project like FPC, and why it has a tendency to make its own tools:
you need to train 20+ part-time committers.
The only COMMON programming language you can assume is Free Pascal. Even knowledge of Delphi's inner workings can't be taken for granted (many committers came directly to FPC, or even still via TP or a Mac Pascal).
Obviously that makes anything with plugins in a different language annoying.
Bash scripting is a close second; (g)make is third, but already an order of magnitude less common.
All servers are *nix-like (FreeBSD, OS X, Linux), but not all run Apache. (e.g. my FreeBSD mirror runs XSHTTPD)
somebody knowledgeable must be the dedicated maintainer for a long time: fix problems, do updates and migrations, etc. Preferably more than one person, for obvious reasons.
a major pain point is Linux distributions (and FreeBSD to a lesser degree): most maintainers of *nix packages are not capable of more than "./configure; make; make install" and must be spoon-fed a near-buildable repository and auxiliary files.
In-distribution packaging of FPC/Lazarus has always been important, and its importance is still increasing.
All distributions have their own special rules about metadata, dependencies, and how sources must be published. Debian/Ubuntu in particular is very bureaucratic and slow.
Most don't like third-party auto-installers on top of their systems (since that bypasses their dependency control).
This all leads to the effective practice that in-house tools in Pascal with minimal scripting work best. Some tools used:
GNU make is mainly used to parameterise the build process on a per-directory level. A successor, fpcmake (not really a make derivative despite the name), has been started, but the migration hasn't been completed.
LaTeX and a LaTeX-to-HTML converter (tex4ht, but Debian uses hevea) are used to build the documentation (the non-library documentation).
The community site (a Netscape community server which uses TCL scripting, a heavy and complex application server) has been trouble ever since it started, but especially lately, since the maintainer became less active.
Mantis has been a problem (especially the email module, which would crash or cripple the server due to the volume), but it has been whipped into shape through successive updates and the hard work of several Lazarus devels. Currently it is a decent workhorse.
The phpBB forum on lazarus.freepascal.org, on the other hand, is relatively painless, since a lot of younger people know how to deal with it.
The same goes for Subversion (though the more advanced usage needs some adjusting; not everybody is deep into the ins and outs of merge tracking).
If somebody were really serious about Maven, I would usually ask them:
to critically investigate its use for the project, in a very concrete way, with a schedule and time estimates. Bird's-eye-level "everything's possible" overviews are essentially worthless.
to give some thought to future changes of the technologies used. Every technology is eventually replaced, even the in-house ones, in 18+ year projects. A new technology must not make migrations of other infrastructural components hard or involved. The new technology to end all new technologies doesn't exist.
to make a migration plan. Migration is often underrated and underestimated.
And in the end, there is always the 1,000,000 euro question: who will do the daily maintenance?
Keep in mind that in a company you can just kick the person responsible for the application server. But in an informal environment this is way harder, especially long term, since people's lives, occupations and time spent on the project vary.

Sounds like an interesting plan, but the Delphi community (and FPC even more so, I'd imagine!) values libraries as source far more than precompiled libraries. The general consensus is that anyone who uses a binary-only library is a fool, for two reasons: you can't fix any bugs you find in it, and compiler changes will break compatibility.

Related

Built-in code analysers vs NuGet packages

Having just switched to VS2019 I’m exploring whether to use code analysis. In the project properties, “code analysis” tab, there are numerous built-in Microsoft rule sets, and I can see the editor squiggles when my code violates one of these rules. I can customise these rule sets and “save as” to create my own.
I have also seen code analyser NuGet packages such as “Roslynator” and “StyleCop.Analyzers”. What’s the difference between these and the built-in MS rules? Is it really just down to more comprehensive sets of rules/more choice?
If I wanted to stick with the built-in MS rules, are there any limitations? E.g. will they still get run and be reported on during a TFS/Azure DevOps build?
What's the difference between legacy FxCop and FxCop analyzers?
Legacy FxCop runs post-build analysis on a compiled assembly. It runs as a separate executable called FxCopCmd.exe. FxCopCmd.exe loads the compiled assembly, runs code analysis, and then reports the results (or diagnostics).
FxCop analyzers are based on the .NET Compiler Platform ("Roslyn"). You install them as a NuGet package that's referenced by the project or solution. FxCop analyzers run source-code based analysis during compiler execution. FxCop analyzers are hosted within the compiler process, either csc.exe or vbc.exe, and run analysis when the project is built. Analyzer results are reported along with compiler results.
Note
You can also install FxCop analyzers as a Visual Studio extension. In this case, the analyzers execute as you type in the code editor, but they don't execute at build time. If you want to run FxCop analyzers as part of continuous integration (CI), install them as a NuGet package instead.
https://learn.microsoft.com/en-us/visualstudio/code-quality/fxcop-analyzers-faq?view=vs-2019
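As an aside, installing the analyzers as a NuGet package is just a project file edit along these lines (a sketch; the version number is illustrative):

<!-- in the .csproj: pulls in the FxCop analyzers at build time -->
<ItemGroup>
  <PackageReference Include="Microsoft.CodeAnalysis.FxCopAnalyzers" Version="3.3.2" PrivateAssets="all" />
</ItemGroup>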
So, the built-in legacy FxCop and the NuGet analyzers only run at build time, while the extension analyzers run as you type in the code editor but not at build time. Also, you have to specifically tell legacy code analysis to run on build, whereas the NuGet analyzers run on build simply because they are installed. And analyzers installed as NuGet packages or extensions won't run when you go to the menu option "Run Code Analysis".
At least, that's what I get out of that page.
There's a link near the bottom of that page that takes you to what code analysis rules have moved over to the new analyzers, including rules that are now deprecated.
https://learn.microsoft.com/en-us/visualstudio/code-quality/fxcop-rule-port-status?view=vs-2019
The different analyzers attempt to cover different coding styles and things Microsoft didn't cover when they built FxCop. With the little research I just did on this, there's a whole rabbit hole to follow, Alice, that would take more time than I have right now to devote to it. And it seems to be filled with lots of arcane knowledge and OCD style code nitpicks that make Wonderland seem normal. But that's just my opinion.
There's lots of personal and professional opinion about the various rules in these and in the basic Microsoft rules, so there's plenty of room to use what you want and disable what you don't. For a beginner, I'd suggest turning on only a few rules at a time. That way you aren't inundated with more warnings and errors than lines of code you might have. OK, so that might be a bit of an exaggeration, but there are so many rules that really are nitpicks, especially on legacy code, that they aren't worth enabling, since you likely won't have time to fix them all. You will also want to do basic research and use "common sense" when you decide what to enable. ("Do I really need to worry about variable capitalization coding style consistency on an app that's been ported into 4 different languages over 15+ years and has 10k files?") This is both personal and professional opinion here, so follow it or not.
And don't forget the rules that contradict each other. Those are fun to deal with.......

Handling complex and large dependencies

Problem
I've been developing a game in C++ in my spare time and I've opted to use Bazel as my build tool since I have never had a ton of luck (or fun) working with make or cmake. I also have dependencies in other languages (Python for some of the high level scripting). I'm using glfw for basic window handling and high level graphics support and that works well enough, but now comes the problem. I'm uncertain how I should handle dependencies like glfw in a Bazel world.
For some of my dependencies (like gtest and fruit) I can just reference them in my WORKSPACE file and Bazel handles them automagically, but glfw hasn't adopted Bazel. So all of this leads me to ask: what should I do about dependencies that don't use Bazel inside a Bazel project?
Current approach
For many of the simpler dependencies I have, I simply created a new_git_repository entry in my WORKSPACE file and created a BUILD file for the library (see the sketch below). This works great until you get to really complicated libraries like glfw that have a number of dependencies of their own.
When building glfw for a Linux machine running X11 you now have a dependency on X11, which means adding X11 to my Bazel setup. X11 comes with its own set of dependencies (the X11 libraries like X11Cursor) and so on.
glfw also tries to provide basic joystick support, which is provided by default on Linux, which is great! Except that this is provided by the kernel, which means that the kernel is also a dependency of my project. While I shouldn't need anything more than the kernel headers, this still seems like a lot to bring in.
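For context, the new_git_repository approach described above looks roughly like this in the WORKSPACE file (a sketch; the tag and the label of the hand-written BUILD file are illustrative):

# WORKSPACE: fetch glfw and overlay our own BUILD file onto it
load("@bazel_tools//tools/build_defs/repo:git.bzl", "new_git_repository")

new_git_repository(
    name = "glfw",
    remote = "https://github.com/glfw/glfw.git",
    tag = "3.2.1",                            # illustrative version
    build_file = "//third_party:glfw.BUILD",  # hand-written BUILD file
)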
Alternative Options
The reason I took the approach I've taken so far is to make the dependencies required to spin up a machine that can successfully build my game very minimal. In theory they just need a C/C++ compiler, Java 8, and Bazel and they're off to the races. This is great since it also means I can create a Docker container that has Bazel installed and do CI/CD really easily.
I could sacrifice this ease and just say that you need to have libraries like glfw installed before attempting to compile the game, but that brings back the whole "which version is installed and how is it all configured" problem that Bazel is supposed to help solve.
Surely there is a simpler solution and I'm overthinking this?
If the glfw project has no BUILD files, then you have the following options:
Build glfw inside a genrule.
If glfw supports some other build system like make, you could create a genrule that runs the tool. This approach has obvious drawbacks, like the not-to-be-underestimated impracticality of having to declare all inputs of that genrule, but it'd be the simplest way of Bazel'izing glfw.
Pre-build glfw.o and check it into your source tree.
You can create a cc_library rule for it, and put the .o file in the srcs. Even though this solution is the least flexible of all because you not only restrict the target platform to whatever the .o was built for, but also make it harder to reproduce the whole build, the benefits are sometimes worth the costs.
I view this approach as a last resort. Even in Bazel's own source code there's one cc_library.srcs that includes a raw object file, because it was worth it, as the commit message of 92caf38 explains.
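A sketch of what that might look like, assuming a libglfw3.a checked into third_party/glfw (file names and linkopts are illustrative):

# third_party/glfw/BUILD: wrap a checked-in prebuilt archive
cc_library(
    name = "glfw",
    srcs = ["libglfw3.a"],                      # prebuilt, checked into the tree
    hdrs = glob(["include/GLFW/*.h"]),
    includes = ["include"],
    linkopts = ["-lX11", "-lpthread", "-ldl"],  # platform libs, illustrative
    visibility = ["//visibility:public"],
)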
Require that glfw be installed.
You already considered this option. Some people may prefer this to the other approaches.

Setting up a large software system in Delphi

We have a software package which is about 16 years old. It has gone through just about every version of Delphi (besides the .NET ones). Over the years, things have become very confusing when it comes to cross-referencing and keeping a proper setup for additional packages (like third-party libraries). I was wondering if there is some standard practice for keeping large projects (and groups of projects) like this organized.
So to explain the current setup...
This is a multi-application system. Meaning, there are 12 executable projects (and a few DLL and service projects) involved. We also keep everything in SourceSafe, and multiple developers work on the same code on different computers. All of these projects are more or less dumped into a central folder. The "Root" folder contains THE major EXE project (along with about 20 folders, all containing units and forms), and it almost seems like an endless hierarchy of folders and files. This one project alone has half a million lines of code.
Then all the additional applications aren't necessarily separated properly from this major project. Each of these projects has its own folder inside the main project's root.
The two major concerns of mine are:
How to properly set up the DCU files so that they aren't mixed in with the projects? DCUs should NOT be placed in SourceSafe (nor any similar file, for that matter), nor should any other file compiled from the project. Visual SourceSafe makes files read-only when they're not checked out, and DCU files (and EXE files and more) cannot be written to in that case. So how do I properly redirect all such files to a separate location to avoid any mixture with the source code?
How to properly set up packages and libraries? We have the following:
QuickReports 5.05
NativeJpg library V302
Another anonymous reporting library
Our own component package, which requires QuickReports, NativeJpg, and the other anonymous library
All 4 of those libraries are stored in completely different places on each computer, and need some centralization. The biggest pain when setting up each new developer's computer is locating these on the lead developer's computer and copying them to the same place on every other computer (and making sure the library path is correct, etc.).
We also need to keep completely separate environments for different versions of Delphi on the same computer. This means a copy of the projects on each computer, a copy of packages and libraries on each computer, a copy of the projects and packages and libraries in the SourceSafe, etc. Each computer needs to have an identical setup. We already utilize environment variables to direct our projects where to look for certain project files (and libraries).
Another new concern: XE2 introduces 64bit capabilities. We don't plan on 64bit compiling yet, but we certainly will in the future. How do I properly differentiate 32bit from 64bit in all these projects?
What I'm really asking for is a reference to a good tutorial on how to optimize such an environment and keep it organized. I don't expect anyone to take the time to answer all of this in the question. The projects are over 15 years old, have had the hands of 200+ developers from around the world in them, and have a LOT of cross-referencing between projects. For example, one project may use a unit from another project, and vice versa. I personally don't like this concept, but I also didn't design it to begin with. I've been given the task of getting this system organized and thoroughly documenting how to set up Delphi on a new computer for new developers to work on our projects. As I'm looking at our projects (as I'm not necessarily a developer of the system, but am being pulled into development), I'm seeing a lot of confusion in how the code is organized.
I am assuming that possibly Embarcadero has some guidelines and standards on setting up such an environment?
Location of DCU files
Regarding the DCUs that are the output of the compilation process, you should specify a DCU output directory in each project file. The default value for this in the latest version of Delphi is fine: .\$(Platform)\$(Config). This results in sub-folders of the project directory like Win32\DEBUG or Win64\RELEASE.
If you set up your project files using option sets then you will be able to control this setting (and all the others) from a small number of option files.
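In a .dproj (or in an .optset option file) this corresponds to MSBuild properties along these lines (a fragment sketch):

<PropertyGroup>
  <!-- keep compiled units and binaries out of the source tree -->
  <DCC_DcuOutput>.\$(Platform)\$(Config)</DCC_DcuOutput>
  <DCC_ExeOutput>.\$(Platform)\$(Config)</DCC_ExeOutput>
</PropertyGroup>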
Location of 3rd party code
You should always use 3rd party libraries as code. If the vendor charges more to supply the library as code, pay up. Once you have done so, you simply include the source code in your version control system (VCS) and treat it largely the same way as you treat your own code. I say largely because you should avoid modifying it.
Once you have all your code in the VCS then you can put the entire source code onto a new machine with a single checkout operation.
Organisation of your projects
I personally have a strong aversion to using compiler search paths. I don't use them, and instead include every unit that a project requires in its .dpr file.
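A sketch of what that looks like (unit names and paths invented for illustration): every unit the project needs is named in the .dpr with an explicit relative path, so the compiler needs no search path at all.

program MyApp;

uses
  Vcl.Forms,
  uMain in 'Source\uMain.pas' {frmMain},
  uOrders in 'Source\Orders\uOrders.pas',
  NativeJpg in '..\ThirdParty\NativeJpg\NativeJpg.pas'; // third-party code included directly

begin
  Application.Initialize;
  Application.CreateForm(TfrmMain, frmMain);
  Application.Run;
end.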
If you do use search paths then you make it impossible to work on variant projects. So, for example, suppose a client has discovered a bug in the version of the software you released 2 years ago. You would like to address that bug by releasing an upgrade to the 2-year-old version of the software. It is perfectly plausible that asking them to upgrade to the latest version is not viable. Perhaps they have not paid for the upgrades. Perhaps the full upgrade has breaking changes that they do not want to tackle right now. A perfect example would be all the Delphi developers still using Delphi 7.
Now, having motivated the scenario, how would you create a build environment for the 2 year old project? If you are using search paths then they will refer to today's libraries. You would be forced to change your search path, or copy the old libraries over the top of today's libraries.
That entire headache is trivially side-stepped by not using search paths and by including all your source in the VCS.
What you should be aiming for is to be able to checkout any historic version of your program and have it build immediately. You should be able to do this with full confidence that you are building identical software to what was built at the time that version was released. This also requires you to have build automation but I can't imagine you are lacking that for a project of this size.
I'll address folder organisation. This comes from a software suite which has 50+ EXEs and DLLs and plenty of third-party libraries, so I guess I know where you are coming from...
We use Perforce as a source control system, so my default workspace's root folder is called Perforce, but I also have a couple of other workspaces set up and they are in Perforce2, Perforce3, etc.
General folder setup (starting from the workspace root folder)
General
  Components
    Delphi
      Indy
        Indy9
        Indy10
      MadCollection
        v2.5.8.0
        v2.6.0.0
    Plugins
Releases
  Released
    ... a folder for each release we publish ... (equal to a branch in Perforce)
Work
  Acceptance
    Sub1
    Sub2
My Environment library path in the IDE is empty (not even the BDE standard paths are in there). This ensures that a project's paths declare all the paths needed, and that projects are not reliant upon a particular machine's IDE setup.
We have an environment variable (e.g. MRJ) set up in our IDEs that points to "General\Components\Delphi", so in a project's options we declare the paths to our components as $(MRJ)\MadCollection\2.6.0.0.
General holds IDE plugins and components used by our projects. We keep all the versions we use in source control. That way, when I have to switch back to an old release to track down a problem, I can simply pull it and build it, as its library paths will still point to the versions of the components that that specific release needs.
The organisation of folders in a particular work branch (Acceptance or one of its subbranches) follows this pattern:
General
  Includes
  MainComponent1
    Project1
    Project2
    Shared
  MainComponent2
    Project3
    Project4
    Shared
  Shared
Windows
  SoftwareSuite
    Scripts
    Tools
  MainComponent1
    Project1
      Dcus
    Project2
      Dcus
  MainComponent2
  Tools
    Tool1
      Dcus
    Tool2
      Dcus
The General folder holds all platform-independent sources/files, and the Windows folder holds all Windows-specific files. Each component can hold multiple projects and will have a Shared folder for sources shared between those projects. The Shared folder directly under General holds sources shared by all projects. The Windows folder is set up in a similar manner.
Note that each project has its own Dcus folder. This is configured in the project options. As the path can be entered as .\dcus relative to the project, we (at least I) have this set up as the default for any new project. Each project sending its DCUs to a unique folder ensures two things:
it is easy to keep DCUs out of version control by simply setting up a filter in your version control software.
more importantly, it ensures that the compilation/build of a project never interferes with the compilation/build of another project. I can safely change settings and build, knowing that I won't be bothered by DCUs lying around from a previous build of another project.
I recommend the following practices:
Keep your library path simple, and make sure everything in the library path is either a folder that ships with Delphi, or a DCU binary (library) folder in your d:\Components\ folder.
Use a MODERN type of version control. I recommend Mercurial over others. SourceSafe is crap, stop using it.
Back up your environment (export registry keys etc.) and restore it to the other developer PCs in a standardized way. You can keep a few .reg and .cmd (batch) files around to automate the setup of a new system. You can put these scripts in your component repository in your version control system.
Beyond what was largely discussed before, I would recommend:
Unit testing - with DUnit, for example
Continuous integration, just to be sure that all these projects can compile on another machine and that the tests pass.
So this is heavily related to project organization and VCS strategy.
For a similar setup, a company I worked for found this configuration useful:
all third party libraries (components etc.) go to a fixed location (C:\Delphi\name-version)
Delphi projects can be checked out from version control anywhere (drive C: or D: and folder name does not matter), as all projects and scripts use relative paths
all projects are subfolders of one main project folder, so checking out that one folder brings the Delphi projects and other relevant resources to the workstation, and a version control update is easy to do
we use a build script (written in Apache Ant, sketched after this list) which sits in the main folder and iterates over all folders to build the Delphi apps and run unit and integration tests against a development database server, to verify all changes work before checking in to source control
the build script can also be run automatically on a build server (Hudson CI) on every commit, to see if something broke
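A minimal sketch of such an Ant target (the compiler invocation is illustrative; dcc32's -B forces a full build and -U adds a unit search path):

<project name="suite" default="build" basedir=".">
  <target name="build">
    <!-- repeat (or loop) for each project folder -->
    <exec executable="dcc32" dir="Project1" failonerror="true">
      <arg value="-B"/>                        <!-- build all units -->
      <arg value="-U..\ThirdParty\NativeJpg"/> <!-- unit search path -->
      <arg value="Project1.dpr"/>
    </exec>
  </target>
</project>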
And a note about component libraries: avoid package installation where possible; prefer creating components at run time. If you quickly need to apply a fix to a five-year-old version of a project, uninstalling/installing a dozen packages can become frustrating. At least for non-visual components, run-time creation is a huge time saver.
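Run-time creation looks like this (the component class here is hypothetical): the form never references the component at design time, so no design-time package has to be installed just to open the form in the IDE.

procedure TMainForm.FormCreate(Sender: TObject);
var
  Grid: TThirdPartyGrid; // hypothetical third-party component class
begin
  Grid := TThirdPartyGrid.Create(Self); // owned by the form, freed with it
  Grid.Parent := Self;
  Grid.Align := alClient;
end;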
Checking in third party code in source control can be very helpful, for example to share fixes which are not yet available as new official releases. Best practices are covered in the Subversion documentation chapter Vendor Branches.
Plus, with Subversion you can use svn:externals to place a specific version (tag) right into a project's directory structure. This can be used both with third-party libraries and with your own source code, and makes dependency management and workstation setup easier.
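For example, pinning a third-party library to a specific tag could look like this (URL and paths illustrative; this is the classic dir-then-URL externals format):

svn propset svn:externals "ThirdParty/NativeJpg http://svn.example.com/repos/nativejpg/tags/v302" .
svn update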
p.s. the Ant build script defines the search paths for everything, so it is 'the reference' for all developers on how to configure the IDE, where to put the third party libs and which compiler flags to use
p.p.s. your project sounds like a lot of fun - I am open for contract work :)
My team uses virtualization, and looking back it was a really good move.
We use MacBook Pro laptops and VmWare Fusion, but I'm sure other packages work fine as well like VirtualBox or VirtualPC.
It is always a good feeling to know that when a new developer starts, or an old installation gets into trouble, you just copy a new VM image from the master image and the setup is exactly as the original. The master image is stored on a fast USB2 disk. Now that Thunderbolt and USB3 are coming, copying an image will be even faster. And there is no real concern about performance on a modern computer as long as there is enough memory; 8 GB should be enough to run 2 images in parallel. Another advantage of virtualization is that it is easy to try what-if scenarios: experiment with different configurations and versions without any risk of disturbing the real working environment.
Btw I also think that SourceSafe is crap... :-)
Some tips:
Make one group project file for all the apps belonging to the project, with each app in its own directory under the .groupproj file
You should be able to specify which file types to include in your version control system. Make sure you set Delphi to write DFM files in text format.
You could tell Delphi to output DCUs into subdirectories named 'dcu' under each app (less visual clutter).
Third-party stuff often insists on installing in distinct locations; there's not much you can do about it. Make a document describing how to set up a complete working environment, and keep it up to date.
Develop in virtual machines. A new developer gets a copy of the VM.
Maintaining for different Delphi versions? Rethink that; try to move to one version. If you absolutely must, have two group projects and directory structures, one for each version. [I'm assuming you're not compiling the same app with two Delphi versions; that's developer hell]
Delphi XE2 outputs to different 32-bit/64-bit subdirectories by default, so that should cause no problems.

Best practices for using SVN with Delphi Visual Component packages?

With the desire to be able to reproduce a given revision of a project that is utilizing 3rd party visual component packages, what goes in SVN and what's the best way to implement/structure the SVN repos?
For non-visual components, the rule seems simple: to ensure no reliance on outside repos, "no svn:externals reference to any outside repo allowed". I have a shared repo that I control, which is the only 'svn:externals' reference allowed. This makes it easy to implement and share these types of runtime items with source code in different SVN projects. Any reference to this internal shared repo is by 'svn:externals' using a specific revision number.
Visual packages seem to run counter to easy version control, as they may have to be reinstalled at each revision. How best to create an SVN project which can be recreated later at a specific revision number... is there a recommended solution?
Previously we didn't worry about 3rd party components, as they don't change often, and we never had a really good solution. I was wondering if others have figured out the best way to handle this problem, as I'm doing a spring cleaning/internal reorganization and wanted to do it 'better' than before.
Technically, the RTL/VCL source should also be in the SVN repo (in case a Delphi hotfix/service pack is released).
My solution will likely be to create a virtual machine with a particular release of the Delphi environment with all visual controls installed. As we add/update visual controls, or update Delphi with hotfixes/service packs, we create a new version of the virtual machine. We then store an image of this VM revision on a shelf somewhere. Is this what you do? Does the Delphi activation/licensing work well (or at all) in this scenario?
Thanks,
Darian
You can prepare "start IDE" (and possibly "build") scripts for your projects and maintain them in the repository as the project evolves.
Regardless of your decision about keeping components in separate repositories and using externals, or including them in a single repository with possible branching, you should also include compiled .bpl files for every component build and for every branch prepared for a specific Delphi version.
You should definitely try to keep most (if not all) paths relative; in the worst case, use environment variables to point to your root project dir.
The start-IDE script allows you to keep each project and Delphi version environment separately configured on a single Windows installation.
It should include the necessary registry keys for your project and Delphi:
Windows Registry Editor Version 5.00
[-${DelphiRegKey}\Disabled Packages]
[-${DelphiRegKey}\Known Packages]
[-${DelphiRegKey}\Library]
[${DelphiRegKey}\Known Packages]
"$(BDS)\\Bin\\dclstd${CompilerVersion}.bpl"="Borland Standard Components"
"$(BDS)\\Bin\\dclie${CompilerVersion}.bpl"="Internet Explorer Components"
"$(BDS)\\Bin\\dcldb${CompilerVersion}.bpl"="Borland Database Components"
(...)
"${CustomComponentPack}"="Custom Components"
[${DelphiRegKey}\Library]
"Search Path"="${YourLibrarySourceFolder1};${YourLibrarySourceFolder2}"
(...)
You can then prepare a batch file:
regedit /s project.reg
%DelphiPath%\bin\bds -rProjectRegKey Project.dpr
Where ${DelphiRegKey} is HKEY_CURRENT_USER\Software\Borland(or CodeGear in newer versions)\ProjectRegKey.
Basically, it is easiest to dump your current working configuration from the registry, strip it of unnecessary keys, change the paths to relative ones, and then adapt it to work with your project.
In such a configuration, switching between projects and their branches which have different sets of components (and/or possibly use different Delphi versions) is just a matter of checking out the repository and running the script.
Fortunately for us, we don't have to worry about a hotfix/service pack; we're still on Delphi 5. :D
Sigh, there was a time when an entire application (settings and all) would exist within a single directory - making this a non-issue. But, the world has moved on, and we have various parts of an application scattered all over the place:
registry
Windows\System
Program Files
Sometimes even User folders in "Application Data" or "Local Settings"
You are quite right to consider the impact of hotfixes/service packs. It's not only the RTL/VCL that could be affected; the compiler itself could have been slightly changed. Note also that, along the same line of thought, even when you upgrade Delphi versions you need to build using the correct version. Admittedly this is a little easier, because you can run different Delphi versions alongside each other.
However, I'm going to advise that it's probably not worth going to too much effort. Remember, working on old versions is always more expensive than working on the current version.
Ideally you want all your dev work to be on main branch code, and you want to minimise patch-work on older versions.
So strive to keep the majority of your users on the latest version as much as possible.
Admittedly this isn't always possible.
You wouldn't want to jump over to the 'new version' without some testing first in any case.
Certain agile processes do tend to make this easier.
By using a separate build machine or VM, you already have a measure of control.
TIP: I would also suggest that the build process automatically copy the build output to a different machine, or at least a different hard drive.
Once you're satisfied with the service pack, you can plan when you want to roll it to your build machine.
It is extremely important to keep a record of the label at which the build configuration changed. (Just in case.)
If your build scripts are also kept in source control, this happens implicitly.
When you've rolled out the hotfix/service pack, fixes to older versions should be actively discouraged.
Of course, they probably can't be eliminated, but if it's rare enough, then even manual reconfiguration could be feasible.
Instead of a VM option to keep your old configuration, you can also consider drive-imaging.
To save on the $$$ of VMware LabManager, look for a command-line-driven VM player.
You might have to keep 2 "live" machines/VMs, but should never need more than that.
It's okay for an automatic build script to fail because the desired configuration isn't available. This will remind you to set it up manually.
Remember, working on old versions is always more expensive than working on the current version.
Third Party Packages
We went to a little bit more effort here. One of our main motivations was the fact that we use about 8 third-party packages, so doing something to standardise this in itself made sense. We also decided running 8 installation programs was a PITA, so we devised an easy way to manually install all required packages from source control.
Key Considerations
The build environment doesn't need any packages installed, provided the object and/or source files are accessible.
It would help if developers could fairly easily ensure they're building with the same version of third party libraries when necessary.
However, dev environments usually must install packages into the IDE.
This can sometimes cause problems with source compatibility.
For example new properties that get written to IDE maintained files.
Which of course brings us back to the second point.
Since Third Party packages are infrequently updated, they are placed within a slightly different area of source-control.
But NB: they must still be referenced via relative paths.
We created the following folder structure:
...\ThirdParty\_DesignTimePackages //The actual package files only are copied here
...\ThirdParty\_RunTimePackages //As above, for any packages "required" by those above
...\ThirdParty\Suite1
...\ThirdParty\Suite2
...\ThirdParty\Suite3
As a result of this it's quite easy to configure a new environment:
Get latest version of all ThirdParty files.
Add _DesignTimePackages and _RunTimePackages to Windows Path
Open Delphi
Select Install Components
Select all packages from _DesignTimePackages.
Done!
Edit: Darian was concerned about the possibility of errors when switching versions of design packages. However, this approach avoids those kinds of problems.
By adding _DesignTimePackages and _RunTimePackages to the Windows Path, Delphi will always find required packages in the same place.
As a result, you're less likely to encounter the 'package nightmare' of incompatible versions.
Of course, if you do something silly like rebuild some of your packages and check-in the new version, you can expect problems - no matter what approach you follow.
I usually structure my repository in SVN like this:
/trunk/app1
/trunk/comp/thirdparty1
/trunk/comp/thirdparty2
/trunk/comp/thirdparty3...
I have, right in the root folder (trunk), a project group (.groupproj, or .bpg in old Delphi versions) that contains all my components (allcomponents.groupproj).
Installing on a new machine means opening that project group and installing the design-time components. That's a drag in all versions of Delphi older than 2010, but 2010 and XE have a lovely feature that lets you see at a glance which components are design-time components.
I also sometimes save myself the trouble of installing those components by hand by making a build.bat file and a regcomponents.bat file. The regcomponents.bat just runs regedit and imports the keys needed to register all those components, after build.bat has built them and everything else.
When you move up from one Delphi version to another, it's really good to have both the batch and reg files and a group project to help you, especially if you have to go through a lot of opening projects/packages and saving them as MyComponent3.dpk instead of MyComponent2.dpk, or updating the package suffix from 150 to 160, or whatever your packages do.
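One thing that reduces that churn is the $LIBSUFFIX directive: the .dpk name stays stable across Delphi versions and only the produced .bpl name changes (a sketch; package and unit names illustrative):

package MyComponents;

{$LIBSUFFIX '160'} // produces MyComponents160.bpl; bump per compiler version

requires
  rtl,
  vcl;

contains
  MyComponent in 'MyComponent.pas';

end.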

Loading Delphi designtime packages on a project basis

Is there a way to select designtime packages on a per-project basis?
Packages are very useful in large projects to keep build times acceptable, but they are a real pita in those large projects too. When one developer adds a new package, it breaks the build for all the others until they install the new package on their machine. And then there is the versioning of the packages ...
So does anyone have a proper solution for this? (it has been bothering me for years now)
At my previous job I wrote a little tool to help us with versioning packages. I really should recreate that tool in my spare time and make it available. The tool was not hard to write though, so maybe you can implement something like it yourself.
Basically it worked like this:
A Subversion repo with all the packages in subfolders. Each package folder in the repo had the same subfolders: Lib (for DCUs), Source, Help (if needed)
In the root folder of the repo sits the tool, together with an XML file.
The XML file specified all the needed information for each package: which folder contained the DCUs, which folder contained the source, and which command needed to be run for the help (a sketch of such a file follows below).
The tool reads in the XML and displays a checklistbox of all the available packages. Installed packages (read from the BDS registry) are marked checked.
The user can make a selection of which packages to install/uninstall.
The tool adds/removes the necessary keys in the BDS registry: it adds the DCU/Lib folder to the search path of the IDE, adds the source folder to the IDE browsing path, and registers the help command with a custom IDE expert (this expert extends the default help menu so it can launch the help for all the installed packages).
The tool would even check for conflicts and dependencies between packages. For example, both version 3 and version 4 of Raize Components were available, and they could not both be active at the same time. Dependency checking was useful for in-house components that derived from TurboPower AsyncPro (a lot of in-house components relied on serial communication via AsyncPro).
A possible extension would have been to be able to save/load the selection of packages and store that selection with each project so you can have only the packages loaded that are needed for a particular project.
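A sketch of what such an XML package list might have looked like (reconstructed from the description above; the element names are invented):

<packages>
  <package name="RaizeComponents" version="4">
    <lib>RaizeComponents\Lib</lib>                <!-- DCUs, added to the IDE search path -->
    <source>RaizeComponents\Source</source>       <!-- added to the IDE browsing path -->
    <help>hh.exe RaizeComponents\Help\Raize.chm</help>
    <conflicts>RaizeComponents-v3</conflicts>
    <requires>AsyncPro</requires>
  </package>
</packages>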
I implemented all this when the company was moving from Delphi 5/7 to Delphi 2007. We had a lot of problems with package versioning before and wanted some way to version all the different packages.
This approach offered some nice advantages:
When bugfixes had to be made or new versions of third-party packages were released, one person committed the changes to Subversion. All the other devs could just do an update from Subversion and have the latest version without any problems.
When new component packages were added to the environment, one person committed all the files and changed the XML package list; the other devs could then do a Subversion update and run the tool to integrate the package easily.
All third-party and custom in-house components were now versioned easily.
By including the DCUs (and other binaries) in the Subversion repo, we ensured that all devs used the same compiled version. Before, it was possible that different compilations used different settings, which caused some components to behave differently.
When all the other devs finally installed Delphi 2007, their packages were set up in less than 10 minutes (most of the time was spent downloading everything from the Subversion repo; the tool itself could install 20 packages in less than 2 seconds). Before, with the manual installation of all packages for Delphi 5/7, it could take up to 2 days to get everything installed.
This wasn't used for in-house components alone; the repo also included some of the big component packages: Raize Components, JCL/JVCL (using their installer instead of the tool, though), DevExpress QuantumGrid 3 and 4, and TurboPower AsyncPro.
This is not easy to do. You can do it, though, with a custom registry hack and a specific BDS shortcut per configuration you are interested in:
To use, just create a new shortcut and modify the command line to pass e.g. -rMyAlternateBDSReg. Then after launching that once, the reg entry is created and they can configure that alternate registry all they want, deleting packages, etc., without worrying about messing up the default install.
From CodeGear
If you set up a configuration for each project, you can then start the appropriate shortcut for the given project. It's not automatic, but it is better than having everything there all the time.
A nice side effect is that the load times will be improved.
We put the source for our packages in source control along with a batch file that rebuilds them. If there is a change in the package tree, then we rebuild them. This doesn't address installing new packages, but there are registry hacks that can take care of that, so we could possibly include .reg snippets to handle it.
