Compile Delphi code with different compiler directives

We use Delphi 10 Seattle. Inside our code we use different compiler directives to produce different exe files, such as debug and release versions, or versions with different feature sets (some functions disabled for particular customers, and so on).
Our current way of getting the exe files:
change the compile options by hand in the Delphi IDE
compile and copy the *.exe to a new location by hand
Is there any way to get this with one button click (a faster approach)?

Create a build configuration for each of your different feature sets.
Documentation for build configurations is here: Build Configurations Overview. This is the mechanism that the IDE provides for switching between debug and release builds, and there is no reason at all for you not to use the same mechanism to switch between your own configuration sets.
The option set feature allows you to extract certain sets or groups of options into separate files which can then be applied to configurations. You can apply the same option set multiple times, for instance once on top of a base release configuration, and then again on top of a base debug configuration. The build configuration functionality supports inheritance which makes it possible, with a bit of up-front design, to develop a clean hierarchy of configurations.
This configuration functionality is built on top of msbuild, which means that you can use the same configurations in your command line builds. In fact, your mention of changing configurations manually in the IDE is a concern. Building your product is not something that should require manual intervention. It is critically important that you address this and arrange for your build process to be automated. You can use tools for this, although it is very easy to write your own simple tools that invoke msbuild for all the configurations that you need to build. Please don't use the IDE to build for release.
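As a rough sketch of such a simple tool (not the only way to do it), the following MSBuild driver project builds a Delphi project once per configuration and copies each exe to a drop folder. It assumes you run msbuild from a RAD Studio command prompt (so rsvars.bat has set the environment) and that the default .\Win32\<Config> output path is in use; the project name, configuration names and drop folder are placeholders for your own.

<!-- build_all.proj: minimal sketch; run "msbuild build_all.proj" from a RAD Studio command prompt -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="BuildAll">
  <ItemGroup>
    <!-- one item per build configuration to produce -->
    <Config Include="Debug" />
    <Config Include="Release" />
    <Config Include="CustomerA" />
  </ItemGroup>
  <Target Name="BuildAll">
    <!-- %(Config.Identity) batches each task once per configuration -->
    <MSBuild Projects="MyProject.dproj" Targets="Build"
             Properties="Config=%(Config.Identity);Platform=Win32" />
    <!-- assumes the default .\Win32\<Config> output folder of the .dproj -->
    <Copy SourceFiles="Win32\%(Config.Identity)\MyProject.exe"
          DestinationFolder="C:\Drops\%(Config.Identity)" />
  </Target>
</Project>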
As an aside, build configuration is one area of the product where the functionality of RAD Studio surpasses that offered by Visual Studio, in my view.

Related

How to modify TFS build automation workflow without .Net Compiler, need to use Team Developer's Language compiler

I am trying to set up build automation for a project developed in a legacy language called Team Developer 6, where each file needs to be compiled as an exe. I also need to do some filtering before building the exes. There are 300 exes.
I could do this with a simple .NET utility that does the filtering and invokes the Team Developer compiler for the required files.
Is it possible to put this into the TFS build workflow? What is the best approach?
Write an MSBuild project that invokes the necessary commands for the tooling you require and check it in. In the TFS build definition, make use of the default template (at first) and set the MSBuild project file you created as the 'project to build'.
This way you can test your build process locally with MSBuild on the command line and determine which command-line switches you might need. You can put the command-line switches into the build definition, or if you need further control you can modify the default template to inject them directly into the MSBuild activity.
I recommend this way, as then you won't have to create any customized workflow, and can avoid having to go down the road of using custom workflow activities in TFS (which is absolutely supported, but in my opinion a bit difficult to diagnose/debug/maintain/upgrade).
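Purely as an illustrative sketch of such a checked-in MSBuild project (the compiler executable name, its switches and the source file pattern below are placeholders, not the real Team Developer command line), it could look something like this:

<!-- build.proj: sketch only; replace tdcompile.exe, its /out switch and the *.app pattern with the real ones -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="CompileAll">
  <PropertyGroup>
    <!-- overridable from the build definition or the command line: /p:CompilerPath=... -->
    <CompilerPath Condition="'$(CompilerPath)' == ''">C:\TeamDeveloper6\tdcompile.exe</CompilerPath>
    <OutputDir Condition="'$(OutputDir)' == ''">$(MSBuildProjectDirectory)\bin</OutputDir>
  </PropertyGroup>
  <ItemGroup>
    <!-- the "filter activity": only pick up the sources that should become exes -->
    <AppSource Include="src\**\*.app" Exclude="src\**\excluded\**\*.app" />
  </ItemGroup>
  <Target Name="CompileAll">
    <MakeDir Directories="$(OutputDir)" />
    <!-- %(AppSource.Identity) runs the Exec task once per source file -->
    <Exec Command="&quot;$(CompilerPath)&quot; &quot;%(AppSource.Identity)&quot; /out:&quot;$(OutputDir)\%(AppSource.Filename).exe&quot;" />
  </Target>
</Project>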
You would ideally want to use an InvokeProcess activity to call an executable which does the filtering and invoking. An alternative but more complex approach would be to create a custom activity, but that requires installation of binaries on the build servers.

Deploying Custom Units and Components

Whenever it has come to redoing or reinstalling Delphi, I've run into a hassle. For the components and units I've produced for use in my projects, I have to go through the entire backup of my projects to find all the things I've used elsewhere, copy the units over, install the components through the Delphi interface, and make sure everything is present. Then I usually forget something, and when I pull out a project that uses one of these units or components, I have to stop whatever I'm doing, find the backup disk, find the data and do the install before I can continue...
Main question: has anyone come up with anything that automates all of this? Otherwise, how do most people here handle this kind of Delphi administration?
Some tips:
when possible, avoid installing components and create instances at run time instead. This reduces the time needed to set them up in the IDE; for example, non-visual components do not have to be installed at all if they are only created at run time.
use a build tool like Apache Ant to compile projects with a build script. The build script then also serves as documentation of environment and source path requirements. When I run the build on a new computer, I only need to check the Ant build script configuration file to see which dependencies exist.
Every time I produce my own components, I treat them as a product I would sell. In that spirit, I build a setup wizard that installs the components into the Delphi IDE in the very same way it would for a customer.
Any time I have to reinstall my computer or Delphi, I just run my setup wizards and the whole working environment is ready.
I use InnoSetup (http://www.jrsoftware.org/isinfo.php) to build my setup wizards.
I set up environment variables:
Delphi menu: Tools\Options\Environment Variables
New User Overrides. Example: Variable Name: OutsideComponents; Variable Value: C:\mycompdir\mycomp
There are lots of options for how to use the environment variables.
You can set them up for use in all your projects:
Delphi menu: Tools\Options\Delphi Options\Library path. Example: $(OutsideComponents)\
Or just use them in an individual project:
Delphi menu: Project\Options\Directories/Conditionals\Search Path. Example: $(OutsideComponents)\Comp1

Teambuild / MSBuild and stamping QA-approved builds

We have an automated build and QA process for our software, using TFS/Team Build and MSBuild, and we want to be able to know (for audit purposes) whether a component has gone through that process or not.
For example, if a library is installed on a user's machine, I'd like to be able to inspect it in some way to tell that it went through the build. In particular, I want to be able to distinguish it from components built directly on a developer's machine, and then manually installed.
What is the best way to do this? Code signing as part of the build process seems closest to these requirements, but presumably this would not cover any 3rd-party libraries that might be used? I also read about the ILMerge tool to merge all assemblies into one, but then I don't know enough to work out whether they can then be signed or not?
I'm sure we're not the first people to have this requirement, so I'm casting around for ideas or hints from others who might have done such a thing.
Thanks!
Our developer builds are set to keep the version at "0.0.0.0", but our build server stamps the build with a pre-configured version and an automatically generated build string, e.g. "1.0.3.xxx". Does your build server not allow for this?
Your build process should be updating each of your projects' AssemblyInfo.cs files (or a globally linked equivalent). You can do this with the TFS changeset number, so, as the previous poster indicated, you end up with a version on each dll of 1.0.changeset.buildno or something similar. You can do this easily in MSBuild.
You could set the values in each AssemblyInfo file in source control to something obvious like 0 or 999.
A lot of what you're asking about is process and training as well, though.
If you're using installers or zips to package your deliverables, you can also label them with the build number as part of your build process.
And if you have the changeset number, you have the link from dll to code, so it is traceable, together with the third-party dll references as defined in each csproj.
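As a minimal sketch of the MSBuild version stamping described above (the major/minor numbers are made up, and the ChangesetNumber and BuildNumber properties are assumed to be passed in by your build server, e.g. /p:ChangesetNumber=1234 /p:BuildNumber=56; wire them up however your server exposes them), a shared targets file imported by every project could regenerate one linked version file before each compile:

<!-- Version.targets: sketch only; import from each .csproj and link the generated GlobalAssemblyInfo.cs into every project -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <MajorMinor>1.0</MajorMinor>
    <!-- defaults keep local developer builds at zero; the build server overrides them -->
    <ChangesetNumber Condition="'$(ChangesetNumber)' == ''">0</ChangesetNumber>
    <BuildNumber Condition="'$(BuildNumber)' == ''">0</BuildNumber>
  </PropertyGroup>
  <Target Name="StampVersion" BeforeTargets="BeforeBuild">
    <ItemGroup>
      <VersionLine Include="using System.Reflection%3B" />
      <VersionLine Include="[assembly: AssemblyVersion(&quot;$(MajorMinor).$(ChangesetNumber).$(BuildNumber)&quot;)]" />
      <VersionLine Include="[assembly: AssemblyFileVersion(&quot;$(MajorMinor).$(ChangesetNumber).$(BuildNumber)&quot;)]" />
    </ItemGroup>
    <!-- note: each part of an AssemblyVersion is limited to 65534, so very large changeset numbers need trimming -->
    <WriteLinesToFile File="$(MSBuildThisFileDirectory)GlobalAssemblyInfo.cs"
                      Overwrite="true"
                      Lines="@(VersionLine)" />
  </Target>
</Project>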

Team Build: Publish locally using MSDeploy

I'm just getting started with the team build functionality, and I'm finding the sheer number of things required to do something pretty simple a bit overwhelming. My setup at the moment is a solution with a web app, an assembly and a test app. The web app has a PublishProfile set up which publishes via the filesystem.
I have a TFS build definition set up which currently builds the entire solution nightly and drops it onto a network share as a backup of old builds. All I want to do now is have the PublishProfile I've already set up publish the web app for me. I'm sure this is really simple, but I've been playing with MSBuild commands for a full day now with no luck. Help!
Unfortunately, sharing of the Publish Profile is not supported or implemented in MSBuild; the logic to publish from the profile is contained in VS itself. Fortunately the profile doesn't contain much information, so there are ways to achieve what you are looking for. Our targets do not specifically support the exact same steps as followed by the publish dialog, but to achieve the same result from team build you have a few choices; I will outline them here.
When you set up your Team Build definition, in order to deploy you need to pass in some values for the MSBuild Arguments of the build process.
Option 1:
Pass in the following arguments:
/p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false
Let me explain these parameters a bit, show you the result, and then explain the next option.
DeployOnBuild=true: This tells the project to execute the target(s) defined in the DeployTarget property.
DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder: This specifies the DeployTarget target.
PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish": This specifies the location where the package files will be written. This is the location where the files are written before they are packaged.
AutoParameterizationWebConfigConnectionStrings=false: This tells the Web Publishing Pipeline (WPP) to not parameterize the connection strings in the web.config file. If you do not specify this then your connection string values will be replaced with placeholders like $(ReplacableToken_dummyConStr-Web.config Connection String_0)
After you do this you can kick off a build; then inside the PackageTempRootDir location you will find a PackageTmp folder, and this contains the content that you are looking for.
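If you want to try the arguments out locally before putting them into the build definition, the same properties can be passed to msbuild from a Visual Studio command prompt; the project file name here is a placeholder for your own web project:

msbuild MyWebApp.csproj /p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false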
Option 2:
For the previous option you probably noticed that it creates a folder named PackageTmp; if you do not want that, you can use the following options instead.
/p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;_PackageTempDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false
The difference here is that instead of PackageTempRootDir you pass in _PackageTempDir. The reason why I don't suggest that to begin with is that MSBuild properties starting with _ signify that the property is essentially "internal", in the sense that in a future version it may mean something else or not exist at all. So use it at your own risk.
Option 3:
With all that said, you could just use the build to package your web. If you want to do this then use the following arguments.
/p:DeployOnBuild=true;DeployTarget=Package
When you do this, in the drop folder for your build you will find the _PublishedWebsites folder as you normally would; inside of that there will be a folder {ProjectName}_Package, where {ProjectName} is the name of the project. This folder will contain the package, the .cmd file, the parameters file and a couple of others. You can use these files to deploy your web app.
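As a usage sketch (the project name, server URL and credentials are placeholders, and the switches assume the standard Web Deploy-generated {ProjectName}.deploy.cmd), deploying from that folder typically looks something like this:

REM trial run first (/T calls msdeploy with -whatif), then the real deployment (/Y)
MyWebApp.deploy.cmd /T /M:https://deployserver:8172/msdeploy.axd /U:deployUser /P:deployPassword
MyWebApp.deploy.cmd /Y /M:https://deployserver:8172/msdeploy.axd /U:deployUser /P:deployPassword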
I hope that wasn't information overload.
The ability to publish web sites, configure IIS and push schema changes through the DEV->QA->RELEASE cycle has required either custom configuration to imitate publish, or custom code where IIS settings are involved.
As of Visual Studio 2013 Update 2, Microsoft has added a (formerly third-party) product that manages deployment of web sites, configuration changes and database deployment using Windows Workflow, and it would be the recommended solution for automating deployment from a TFS build.
More information can be found here:
http://www.visualstudio.com/en-us/explore/release-management-vs.aspx
You can use the Publish/Deploy feature in Visual Studio 2010.
See http://www.ewaldhofman.nl/post/2010/04/12/Auto-deployment-of-my-web-application-with-Team-Build-2010-to-add-Interactive-Testing.aspx for more information.

Using Project Config with Delphi 2009 and FinalBuilder 6

Does anyone have any experience of using the project config option with the Delphi Compile action in FinalBuilder 6?
Currently the build server is set up manually, with all the correct search paths, compiler options etc. set directly on the compile action. This avoids changes to the config file breaking the build, and the search paths etc. may also differ from the dev machines. However, it can be a pain to change, as it needs to be done in FinalBuilder rather than in the project.
With the new build configuration options in Delphi 2009 (and support for them in FinalBuilder 6), it should be possible to keep this config in source control and make it easier to maintain and update.
Has anyone tried this?
I use a virtual machine to provide a stable build environment, and keep all the options in FinalBuilder. That way I know I can make a one-line change and it will be a good build, with no way for changes on my dev PC to break things. Of course I do have to update the components etc. occasionally, but that can be snapshotted and tested more carefully.
So I guess I've not tried the build config, but I don't want to!
