What is the minimal agent install footprint for Delphi build automation?

When creating a build server that does clean version control check-outs and full system builds of everything in a given source repository or project, what is the minimum required Delphi install footprint for XE3 Win32/Win64 projects? (Core system - not 3rd party components)
I'd prefer to have a small subset of files that can be included in a repository rather than a full Delphi install.

Using the command-line compilers, you do not need to install the IDE on the remote agent computer. From a working installation, copy the \Bin and \Lib sub-folder contents onto your remote agent.
Then run the DCC32.exe command-line compiler, editing the DCC32.CFG file content to point to all the needed source code paths. Do not forget to set a temporary folder for the generated .dcu files, and to specify a destination folder for the .exe.
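A minimal sketch of such a setup (all paths here are hypothetical, and the exact switch names vary slightly between compiler versions, so check dcc32's own help output). DCC32.CFG, one option per line, with -U as the unit search path, -NU as the .dcu output folder (older compilers spell it -N), and -E as the .exe output folder:
-UC:\agent\src;C:\agent\delphi\lib\win32\release
-NUC:\agent\build\dcu
-EC:\agent\build\bin
Then force a full build from the command line:
C:\agent\delphi\bin\dcc32.exe -B MyProject.dpr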
See How do I compile my Delphi project on the command line? and the official documentation of the command-line compiler.
Update: Yes, I know, MSBuild is the "official way". But for a build agent, I found it much easier to work with the command-line compiler. And the question here is about the "minimal footprint" needed to build Delphi apps.
With DCC32 there is nothing to install, and no need to reproduce the same layout as the one in the IDE. In my experience, the build environment should not be tied to the IDE configuration, and should be "clean" of any developer-specific state. It should build your whole application from scratch and from source, run the unit tests, and prepare the release. I have seen stray .dcu or .bpl files pollute the build process, and it took hours to find out why a code modification was not being taken into account!
If you need a complex build process, I always prefer to write a few lines of Delphi (or Python) that read the configuration from text files, regardless of the computer the build agent runs on; see the sketch below. Installing Delphi on a computer just for builds is a pain (especially with the latest versions), even when the license allows you to do so, whereas the command-line compiler is safe and fast to set up: just copy the files and run them. If your build agent is a virtual server (as it should be today), your IT department will be pleased that you are not polluting the registry. And if this cleanness, compared with other frameworks, makes your company's IT favor Delphi, it is always worth it.
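As an illustration of that driver idea, here is a minimal batch sketch (the file names are hypothetical) that reads a projects.txt file, one .dpr path per line, and compiles each one, stopping at the first failure:
@echo off
rem build_all.cmd - compile every project listed in projects.txt
for /f "usebackq delims=" %%P in ("projects.txt") do (
  "C:\agent\delphi\bin\dcc32.exe" -B "%%P" || exit /b 1
)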

Found a source for D2006+D2007:
https://delphi.fandom.com/wiki/Setting_up_a_Delphi_Build_Machine

Related

Exposing non-empty docker container directories to host

Since one can have a nice Docker container to run an entire build in, it would be fantastic if the tools used by the container to build and run the code were also accessible to the host.
Imagine the following use-case:
Imagine that you're developing a Java application using OpenJDK 12 and Maven 3.6.1 in order to build, run all tests and package the entire application into an executable .jar file.
You create a Docker container that serves as a "build container". This container has OpenJDK 12 and Maven 3.6.1 installed and can be used to build and package your entire application (you could use it locally, during development and you could also use it on a build-server, triggering the build whenever code changes are pushed).
Now, you actually want to start writing some code... naturally, you'll go ahead and open your project in your favorite IDE (IntelliJ IDEA?), configure the project SDK and whatever else needs to be configured and start rocking!
Would it not be fantastic to be able to tell IntelliJ (Eclipse, NetBeans, VSCode, whatever...) to simply use the same tools with the same versions as the build container is using? Sure, you could tell your IDE to delegate building to the "build container" (Docker), but without setting the appropriate "Project SDK" (and other configs) you'd be "coding in the dark"... you'd be losing out on almost all the benefits of using a powerful IDE for development. No code hinting, no static code analysis, etc. etc. etc. Your cool and fancy IDE is in essence reduced to a simple text editor that can at least trigger a full build by calling your build container.
In order to keep benefiting from the many IDE features, you'll need to install OpenJDK 12, Maven 3.6.1 and whatever else you need (in essence, the same tools you have already spent time configuring your Docker image with) and then tell the IDE that "these" are the tools it should use for "this" project.
It's unfortunately too easy to accidentally install the wrong version of a tool on your host (locally), which could potentially lead to the "it works on my machine" syndrome. Sure, you'd still spot problems later down the road once the project is built using the appropriate tools and versions by the build container/server, but... not to mention how annoying things can become when having to maintain an entire zoo of tools and their versions on your machine (+ potentially having to deal with all kinds of funky incompatibilities or interactions between the tools) when you happen to work on multiple projects (one project needs JDK 8, the other JDK 11, the other uses Gradle, not Maven, then you also need Node 10, Angular 5, but also 6, etc. etc. etc.).
So far, I have only come across all kinds of funky workarounds, but no "nice" solution. The most tolerable I found so far is to manually expose (copy) the tools from the container to the host machine (e.g.: define a volume shared by both and then execute a manual script that copies the tools from the container into the shared volume directory so that the host can access them as well)... While this would work, it unfortunately involves a manual step, which means that whenever the container is updated (e.g.: new versions of certain tools are used, or even additional, completely new ones) the developer needs to remember to perform the manual copying step (execute whatever script explicitly) in order to have all the latest and greatest stuff available to the host once again (of course, this could potentially mean updating IDE configs as well - but this, version upgrades at least, can be mitigated to a large degree by having the tools reside at non-version-specific paths).
Does anyone have some idea how to achieve this? VMs are out of the question and would seem like overkill... I don't see why it should not be possible to access Docker container resources in a read-only fashion and reuse and reference the appropriate tooling during both development and build.
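For what it's worth, the copy-based workaround described above boils down to a single command like this (a hedged sketch; the image name and the tool paths inside the container are made-up examples):
docker run --rm -v %CD%\toolcache:/export my-build-image sh -c "cp -r /usr/lib/jvm/openjdk-12 /opt/maven /export"
The IDE's Project SDK can then be pointed at toolcache\openjdk-12 on the host, at the cost of re-running the copy whenever the image changes.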

Delphi XE3 build batch failed to run msbuild using Hudson or Jenkins

I am trying to configure Hudson (3.1.0) to build a Delphi XE3 project (MSBuild).
The batch file for the build:
call "C:\Program Files (x86)\Embarcadero\RAD Studio\10.0\bin\rsvars.bat"
msbuild "X:\Tests\DelphiTest\Project1Test.dproj" /t:Build /v:minimal /p:config="Debug"
Running this batch file from the command line (cmd.exe) builds correctly.
When I put it into a Hudson build step, it fails with errors from the Delphi compiler:
[..]
C:\Program Files (x86)\Embarcadero\RAD Studio\10.0\bin\CodeGear.Common.Targets : warning : Expected configuration file missing - C:\Windows\system32\config\systemprofile\AppData\Roaming\Embarcadero\BDS\10.0\EnvOptions.proj
_PasCoreCompile:
Embarcadero Delphi for Win32 compiler version 24.0
Copyright (c) 1983,2012 Embarcadero Technologies, Inc.
C:\Program Files (x86)\Embarcadero\RAD Studio\10.0\Bin\CodeGear.Delphi.Targets(172,5): error E1026: File not found: 'Controls.res'
[..]
In fact, the missing file is not in that location, but I did find it in
C:\Users\<username>\AppData\Roaming\Embarcadero\BDS\10.0
So I tried a naive fix, copying the existing directory to the missing location, but it doesn't work.
I'm using: Delphi XE3 Enterprise, Win 7 Ultimate (x64).
Thanks for the help.
This usually happens the first time you run MSBuild or DCC32 on a machine which has never had the IDE GUI (bds.exe) open, and which has therefore never been through the mandatory startup environment-configuration tasks that happen inside BDS.exe.
If Hudson or Jenkins logs in as a Windows service using an account that has never been used for interactive Windows logins and has never run BDS.exe, you will get this issue. The MSBuild utility from Microsoft must invoke DCC32, which is licensed commercial software; the Delphi command-line compilation licensing and environment configuration require that you have run the IDE as that account, and that your Jenkins build account and computer are licensed properly.
Solution:
Either change Jenkins or Hudson to log in as an account that has a license to run Delphi, and has run Delphi at least once.
If you can't change the server login, then log in interactively (via remote desktop) using the account that Jenkins/Hudson is using, and configure Delphi (Rad Studio) so it runs fine from that account.
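A quick pre-flight check in the build step can make this failure mode obvious (a hedged sketch; the path is the one from the warning above, and %APPDATA% resolves to the build account's own profile - for a service account that is the systemprofile path shown in the log):
if not exist "%APPDATA%\Embarcadero\BDS\10.0\EnvOptions.proj" (
  echo EnvOptions.proj missing - run bds.exe once under this account
  exit /b 1
)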
Update 2015: I am now running into a whole new class of problems related to Group Managed Service Accounts. They were working with Delphi XE8, but I am not able to get a GMSA account to work with Delphi 10 Seattle. I'll update this answer if I ever do figure out a way, but for now, I recommend conventional service accounts, not GMSAs, for Delphi continuous integration with Jenkins or Continua, or others. Also check Delphi.wikia.com for more tips: http://delphi.wikia.com/wiki/Setting_up_a_Delphi_Build_Machine
We recently ran into a similar kind of problem while trying to compile our projects on a Jenkins server. We simplified the "build" script to the following two lines:
call "[....]Embarcadero\Studio\18.0\bin\rsvars.bat"
MSBuild.exe /p:config=Release /target:Rebuild Project2.dproj
This builds the project successfully.
But when we switch to /p:config=Debug, the build fails with the known errors about missing *.res files. We looked in the Delphi lib/win32/ directory and realized that there are *.res files in the release folder but not in the debug folder. We then modified the MSBuild command to this:
MSBuild.exe /p:config=Debug /target:Rebuild /p:ResourcePath="%BDS%\lib\win32\release" Project2.dproj
And this compiles just fine. So apparently compiling in debug mode lacks the path to the Delphi sources (which live in several subfolders of %BDS%\source\).
So as a workaround we can compile debug builds with Jenkins by simply giving Delphi the *.res files from its release folder. But that can only be an intermediate solution.
Maybe this information helps someone find a final solution that is not "adding all subfolders of %BDS%\source to the search path" ;-)

How to use NuGet packages on build server/production server without internet?

Background
I have the following components:
My local solution (.NET 4.5) which makes use of NuGet packages.
A PowerShell build script in my solution that has targets to build, run unit tests, do Web.config transforms, etc.
A build server without an internet connection running CruiseControl.NET that calls my build script to build the files. It also serves as the (IIS7) environment for the dev build.
A production server with IIS7 that does not have internet access.
Goal
I would like to use the NuGet packages from my solution and have them stored locally as part of source control, without having to rely on an internet connection or a NuGet package server on my build and production servers.
Question
How can I tell MSBuild to properly deploy these packages, or is this the default behavior of NuGet?
Scott Hanselman has written an excellent article entitled How to access NuGet when NuGet.org is down (or you're on a plane). If you read through this article, you'll see at the end that the suggestions he makes are primarily temporary-type solutions and he goes out of his way to say that you should never need the offline cache except in those emergency situations.
If you read at the bottom of his article, however, he makes this suggestion:
If you're concerned about external dependencies on a company-wide scale, you might want to have a network share (perhaps on a shared builder server) within your organization that contains the NuGet packages that you rely on. This is a useful thing if you are in a low-bandwidth situation as an organization.
This is what I ended up doing in a similar situation. We maintain a share with the latest versions of the various packages that we rely on (of course, I'm assuming you're on some type of network). It works great and requires just a little work to update the packages on a semi-regular basis (we have a quarterly update cycle).
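In practice, pointing the NuGet command-line client at such a share is a one-time setup per machine (a hedged sketch; the share and solution names are made-up examples):
nuget sources add -Name CompanyShare -Source \\buildserver\nuget-packages
nuget restore MySolution.sln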
Another article that may also be of help to you (was to me) is: Using NuGet to Distribute Our Company Internal DLLs
By default, NuGet puts all your dependencies in a packages/ folder. You can simply add this folder to your source control system, and NuGet will not need to download anything from the internet when you do your builds. You'll also want to ensure that NuGet Package Restore isn't configured on your solution.
You'll have to make a decision; either you download/install the packages at build time (whether it be using package restore, your own script, or a build tool that does this for you), or you put the /packages assemblies in source control as if they were in a /lib directory.
We've had so many problems with using package restore and NuGet's Visual Studio extension internally that we almost scrapped NuGet completely because of its flaws, despite the fact that 1 of our company's 2 products is a private NuGet repository.
Basically, the way we manage the lifecycle is by using a combination of our products BuildMaster and ProGet such that:
ProGet caches all of our NuGet packages (both packages published by ourselves and ones from nuget.org)
BuildMaster performs both the CI and deployment aspect and handles all the NuGet package restoration so we never have to deal with massive checked-in libraries or the solution-munging nightmare that is package restore
If you were to adopt a similar procedure, it may be easiest to create a build artifact in your first environment which includes the installed NuGet package assemblies, then simply deploy that artifact to your production without having to repeat the process.
Hope this helps,
-Tod
I know this is an old discussion, but how in the world is it bad to store all the files required to build a project, just because of their size?
The idea that you should simply replace a library if it becomes unavailable is crazy. Code costs money, and since you don't control the libraries on Git or on NuGet, a copy should be available.
One requirement that many companies have is an audit. What if a library were found to be stealing your data? How would you know for sure, if the library has been removed from NuGet and you can't even build the code to double-check?
The one-size-fits-all NuGet and Git ways of the web are not OK.
I think the way NuGet worked in the past, where the files were stored locally and optionally placed in source control, is the way to go.

TFSBuild 2010 Package only contains sources - not binaries

The packages created by a TFS 2010 build only contain our sources, not the binaries. When this is (automatically) deployed to IIS, the site does not run because it is missing DLLs that are created during the build process.
We have a Web Project created in VS2010. If I select "Build Deployment Package" from a right-click in VS, we get a zip file in the obj\Release\Package folder that contains the fully built site.
However, if we ask our TFS build process to create the package by adding "/p:CreatePackageOnPublish=true /p:DeployOnBuild=true" to the MSBuild arguments (as advised, amongst other places, here), we get a zip file in _PublishedWebsites\_Package\.zip that only contains the sources.
My best guess is that the CopyAllFilesToSingleFolderForPackage is picking up the files from the wrong place.
I notice a similar issue asked here - TFS 2010 and creating a package - although his workaround is not appropriate in many cases, I'd guess.
My concern is that this is using a built-in, but poorly documented feature of MSBuild/TFS so when it doesn't work you're a little in the wilderness.
It seems that DeployOnBuild runs some "package"-like target on each of the projects. If you have built the projects into a separate directory (which TFS 2010 does by default), the packaging won't pick up the compiled files.
One solution is to get rid of the custom output folder for the MSBuild command within the TFS build workflow. This will cause the compiled files to be located in-situ and be included in the package.
The rest of the TFS workflow will then require some changes, because it will be expecting to transfer the files from the output directory, and they won't be there.
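For reference, the packaging invocation that needs to see the in-situ compiled output looks something like this (a hedged sketch; the project path and configuration are made-up examples, while the two /p: packaging switches are the ones quoted in the question):
MSBuild.exe MyWebApp\MyWebApp.csproj /p:DeployOnBuild=true /p:CreatePackageOnPublish=true /p:Configuration=Release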

How to efficiently do cross platform builds

I am setting up the build system for a team that produces APIs used on several platforms and architectures. There has been a lot of work already spent on setting up Ant to build all of the Java code, so I would prefer to stick with Ant if possible.
Where I am stumped is how to build the C++ software. Here are the platforms and languages I need to support:
Java - Linux - 32bit & 64bit: Ant
Java - Windows - 32bit & 64bit: Ant
C++ - Linux - 32bit & 64bit: Ant w/CppTasks (question #1)
C++ - Windows - 32bit: (question #2)
Note: C++ on Windows is MS Visual Studio C++ projects.
I think the answer to question #1 is CppTasks because that seems to be the standard way to build C++ from Ant.
For question #2, I could also use CppTasks, but the developers will be compiling in Visual Studio, so it seems beneficial to use their Visual Studio project for building, which means calling MSBuild from Ant.
Has anyone tried this before and has a good solution for building Java & C++ on both Linux and Windows?
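Concretely, the "call MSBuild from Ant" step would boil down to Ant's exec task running a plain MSBuild command like this (a hedged sketch; the solution name and property values are made-up examples):
msbuild MySolution.sln /p:Configuration=Release /p:Platform=Win32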
Do you use a Continuous Build System like Jenkins?
With Jenkins, your builds can be automatically triggered by check in/commit, time of day, and/or on command. The great thing about Jenkins is that you can have it automatically build all of the various versions of your software.
For example, you'll probably need to run make for C++ on Linux but use MSBuild on Windows systems, and you'll need to trigger a build on a Linux machine and one on a Windows machine. Jenkins can be set up to do this automatically. When a commit happens, all your various builds on all of your systems can be triggered at once. Then, you can store the builds you need on Jenkins and have your users pull the build they need from the project they need.
There are so many ways this could be set up, but the easiest is to simply create four separate jobs (one each for Java 32-bit, Java 64-bit, C++ Linux, and C++ Windows). You don't necessarily need a separate Windows Java build (at least in theory), but there's nothing stopping you.
You can have a single Jenkins server run "slave" jobs on other build systems, so you could have Jenkins live on the 64-bit Linux system, but use a 32-bit Linux system as a slave to do the 32-bit build, and call a Windows slave to do the Visual Studio build. That way, all of your jobs are located in a central place, but you can use the environments you want.
If you've never used a Continuous Build system, download Jenkins and play around with it. It's free and open source, and very, very easy to use. You can run it on any machine that has a JDK or JRE 1.6. If you download the Windows version, it even comes with the JRE already built in.
Your best bet is to use a continuous build system and allow it to handle the mess. By the way, there are also Bamboo, CruiseControl, and Hudson (from which Jenkins was forked a few months ago).
TeamCity should fit the bill very well. It supports Ant and MSBuild natively and has a pretty good cross-platform story (written in Java, but with excellent integration with e.g. Windows).
I don't see any benefit in wrapping your Windows MSBuild-based builds in yet another build system.
The list for this looks a little bit different (in my opinion):
Java - Maven for all platforms
C++ - maybe Maven as well (check http://duns.github.com/maven-nar-plugin/).
