How to call another FAKE build script from within a build script? - f#

I have a collection of repos under the same root directory. Each repo contains a build.fsx to compile, test, etc.
I want to create one FAKE build.fsx in the root directory that can trigger the build.fsx scripts in the sibling repo directories.
I'm not worried about looping over the directories, but about how best to call another build.fsx from within the originating build.fsx.
I am sure Shell.Exec("./packages/tools/FAKE.exe", "./otherdir/build.fsx") would work, but is there a more seamless approach built into FAKE?

I don't think this is built into FAKE, so if you want the 'sibling' build scripts to be used independently of the main script then your Shell.Exec approach is likely a very good one (and what I would use). That said ...
A slight variation on that approach would be to load the sibling .fsx files and then compose their build targets in the 'parent' build script:
#load "Sibling.fsx"
"LocalTarget"
==> "SiblingTarget"
You will get errors if the target names conflict, and it could be confusing, so a naming convention would be smart (e.g. "Build_Sibling_1", "Clean_Sibling_2", etc.).
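Going back to the Shell.Exec route, here is a minimal sketch of what the root build.fsx could look like (FAKE 4-style API; the FakeLib path and the sibling directory names are assumptions, not from the question):

// root build.fsx - iterate over the sibling repos and run each build
#r "./packages/FAKE/tools/FakeLib.dll"
open Fake

Target "BuildSiblings" (fun _ ->
    [ "./repo1"; "./repo2" ]                        // hypothetical sibling repos
    |> List.iter (fun dir ->
        // Shell.Exec returns the child process's exit code
        let code = Shell.Exec("./packages/tools/FAKE.exe", dir + "/build.fsx")
        if code <> 0 then failwithf "Build of %s failed with exit code %d" dir code))

RunTargetOrDefault "BuildSiblings"

Failing fast on a non-zero exit code keeps the root build red whenever any sibling build breaks.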

Related

Bazel Starlark: how can I generate a BUILD file procedurally?

After downloading an archive through http_archive, I'd like to run a script to generate a BUILD file from the folder structure and CMake files in it (I currently do that by hand, and it is easy enough that it could be scripted). I can't find anything on how to open, read, and write files in the Starlark documentation, but since http_archive itself is loaded from a .bzl file (I haven't found the source of that file yet, though...) and generates BUILD files (by unpacking them from archives), I guess it must be possible to write a wrapper around http_archive that also generates the BUILD file?
This is a perfect use case for a custom repository rule. That lets you run arbitrary commands to generate the files for the repository, along with some helpers for common operations like downloading a file over HTTP using the repository cache (if configured). A repository rule is conceptually similar to a normal rule, but with much less infrastructure, because it runs during the loading phase, when most of the Bazel machinery doesn't apply yet.
The Starlark implementation of http_archive is in http.bzl. The core of it is a single call to ctx.download_and_extract; your custom rule should do that too. http_archive then calls workspace_and_buildfile and patch from util.bzl, which do what they sound like. Instead of workspace_and_buildfile, call ctx.execute to run the command that generates your BUILD file. You could call patch if you want, or skip that functionality if you're not going to use it.
The repository_ctx page in the documentation is the top-level reference for everything your repository rule's implementation function can do, if you want to extend it further.
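Putting those pieces together, a minimal sketch of such a repository rule might look like this (the rule name, its attributes, and the generator script are placeholders of mine, not part of Bazel):

def _generated_archive_impl(ctx):
    # download and unpack the archive, as http_archive's core does
    ctx.download_and_extract(
        url = ctx.attr.url,
        sha256 = ctx.attr.sha256,
        stripPrefix = ctx.attr.strip_prefix,
    )
    # run the user-supplied script to write a BUILD file from the unpacked tree
    result = ctx.execute([ctx.path(ctx.attr.generator)])
    if result.return_code != 0:
        fail("BUILD file generation failed: " + result.stderr)

generated_archive = repository_rule(
    implementation = _generated_archive_impl,
    attrs = {
        "url": attr.string(mandatory = True),
        "sha256": attr.string(),
        "strip_prefix": attr.string(default = ""),
        "generator": attr.label(allow_single_file = True, mandatory = True),
    },
)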
When using http_archive, you can use the build_file argument to create a BUILD file. To generate it dynamically, I think you can use the patch_cmds argument to run external commands.
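For example, something along these lines (the URL, hash, and generator script are placeholders, and this assumes the archive itself ships the script):

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "some_dep",
    urls = ["https://example.com/some_dep.tar.gz"],
    sha256 = "<archive hash>",
    # patch_cmds runs after extraction, so the commands can inspect the tree
    patch_cmds = ["./generate_build_file.sh > BUILD.bazel"],
)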

Saving external dependencies to projects repository

"(new_)git_repository" and "(new_)http_archive" workspace rules deal with external projects in such way that any external dependency is copied to temporary directory linked to workspace as ${WORKSPACE}/bazel-workspace/external/${EXTERNAL_DEP_NAME} on build or prefetch.
I'd like to save external dependencies locally in my repo, so if remote repository vanishes i'd have copy of dependency even on a new machine, where it wasn't cached.
Can I somehow change default behaviour without writing custom workspace rule?
Bazel does have a flag you could use for this: --experimental_repository_cache. It is designed to be a system-wide cache so that multiple projects on one machine don't have to re-download dependencies, but you could use it per-repository. Basically you'd say:
bazel build --experimental_repository_cache=$PWD/my_cache //foo
Then all external repositories would be downloaded to the my_cache directory in your project.
This is a cache keyed by the hash of your external dependencies' content, so it's not going to be very human-readable, but it would let you keep your external dependencies in your VCS fairly easily.
(Theoretically you could even check in a .bazelrc file to specify this option by default, but --experimental_repository_cache only takes an absolute path right now, so it's a bit impractical. I filed a bug to handle the relative path use case.)
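For illustration, the checked-in .bazelrc entry would look something like this (the absolute path here is a placeholder):

# .bazelrc - the cache path must currently be absolute
build --experimental_repository_cache=/home/ci/myproject/my_cache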
I might be wrong, but it sounds like you just want to check it into the VCS. If we're talking about an HTTP archive, then download it manually, stick it under the relevant third_party subfolder with the BUILD file you craft for it, and you're done.
If you want to use Bazel mechanisms to download and check in the external dependencies, then this isn't currently supported.
Maybe you should open an issue.

TFS Online/VSO Build with Common Assemblies

I was wondering if anyone could help. We have the following project structure in our company:
Code/Common
Code/Project1
Code/Project2
etc...
When the Common project builds, it has a post-build event that copies all the relevant files into the Code/Common/Binaries folder. All the other projects then reference the Common components in this folder.
However, what we are struggling with is that when TFS Online checks out the solution, it does so to c:\a\src, and the Common binaries are placed in c:\a\src\Binaries. When the other projects (Project1, etc.) do their builds, they cannot find the Common assemblies: not only are the files removed, but the paths differ from what the projects expect (c:\a\src\Common\Binaries rather than c:\a\src\Binaries).
Is there any way to tell the build server not to delete the files in the "Binaries" directory, and to specify the folder location to check out to? Or how would one go about solving such a problem?
Thanks very much
A build server is a transient thing; you cannot rely on files being there.
You need to either create NuGet packages for your common output and then consume these in your other projects (the 'proper' way), or check your dependencies into source control after each build so you can reference them in subsequent builds (the 'really frowned upon' way).
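To illustrate the NuGet route, a minimal .nuspec for the Common output might look like this (the package id, version, and target framework are placeholders); pack it with "nuget pack Common.nuspec" and let the other projects consume the package instead of the Binaries folder:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyCompany.Common</id>
    <version>1.0.0</version>
    <authors>MyTeam</authors>
    <description>Shared Common assemblies.</description>
  </metadata>
  <files>
    <!-- pick up the post-build output described in the question -->
    <file src="Binaries\*.dll" target="lib\net45" />
  </files>
</package>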

TFS MSBuild Copy Files from Network Location Into Build Directory

We are using TFS to build our solutions. We have some help files that we don't include in our projects as we don't want to grant our document writer access to the source. These files are placed in a folder on our network.
When the build kicks off we want the process to grab the files from the network location and place them into a help folder that is part of source.
I have found an activity in the XAML for the build process called CopyDirectory. I think this may work, but I'm not sure what values to place into the Destination and Source properties. After each successful build, the output is copied out to a network location. We want to copy the files from one network location into the new build directory.
I may be approaching this the wrong way, but any help would be much appreciated.
Thanks.
First, you might want to consider your documentation author placing his documents in TFS. You can give him access to a separate folder or project without granting access to your source code. The advantages of this are:
Everything is in source control. Files dropped in a network folder are easily misplaced or corrupted, and you have no history of changes to them. The ideal for any project is that everything related to the project is captured in source control so you can lift out a complete historical version whenever one is needed.
You can map the documentation to a different local folder on your build server such that simply executing the "get" of the source code automatically copies the documentation exactly where it's needed (see the sketch after this list).
The disadvantage is that you may need an extra CAL for him to be able to do this.
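To illustrate that second point, the workspace mappings in the build definition could look something like this (the server paths are placeholders; $(SourceDir) is the build agent's source folder):

Source Control Folder            Build Agent Folder
$/TeamProject/Code               $(SourceDir)
$/TeamProject/Docs               $(SourceDir)\Help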
Another (more laborious) approach is to let him save to the network location, and have a developer check the new files into TFS periodically. If the docs aren't updated often this may be an acceptable compromise.
However, if you wish to copy the docs from the network during your build, you can use one of the MSBuild Copy tasks (as you are already aware), or you can use Exec. The Copy tasks are more complicated to use because they are often populated with filename lists generated from the outputs of other build targets, and are usually used with solution-relative pathnames. But if you're happy with DOS commands (xcopy/robocopy), then you may find it much easier just to use Exec to run an xcopy/robocopy command. You can then "develop" and test the xcopy command outside the MSBuild environment and then just paste it into the MSBuild script with confidence that it will work - much easier than trialling copy settings as part of your full build process.
Exec is documented here. The example shows pretty well how to do what you want, but in your case you can probably just replace the Command attribute with the entire xcopy/robocopy command (or even the name of a batch file) you want to use, so you won't need to set up the ItemGroup etc.
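In that spirit, a minimal sketch of the Exec approach (the share and target folder are placeholders, and this assumes MSBuild 4.0's BeforeTargets):

<!-- pull the help files from the network share before compilation -->
<Target Name="CopyHelpFiles" BeforeTargets="BeforeBuild">
  <Exec Command="xcopy \\fileserver\docs\*.* &quot;$(MSBuildProjectDirectory)\Help&quot; /S /Y /I" />
</Target>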

TeamCity Artifacts and checkout rules and TFS (Oh my!)

Having real problems creating artifacts in TeamCity 6.5 (using TFS & MSBuild as the build runner, if it makes any odds, which it probably does, as any examples I find seem to use SVN...).
The build works, so long as I enter no checkout rules.
If I understand it, I'll need to set up some artifacts that themselves rely on checkout rules(?).
I have two builds that are identical other than the way they are kicked off.
One is initiated on check-in
One is initiated manually from within TC. This build is the Test Build
Assembly version numbers come from a single versioninfo.cs file that is a linked file in all projects in the solution. This method is detailed here : http://www.codeproject.com/Articles/328977/The-Right-Way-to-Version-Your-Assemblies and holds the version number thus:
[assembly: AssemblyFileVersion("9.1.0.0")]
Ultimately, I'm unable to copy the output of the test build to another location.
As it stands, the only output of a build is in the teamcity data directory, for example :
C:\TeamCity\buildAgent\work\ceaaf65dc87ff856\Project1\bin\Debug
C:\TeamCity\buildAgent\work\ceaaf65dc87ff856\Project2\bin\Debug
etc
I'd like to copy the output files (exes and DLLs) to an output folder that has the build number of the build in its name.
For argument's sake, let's say that for the version number above this would be
c:\BuildServer_Output\SolutionName\9.1.0.0
Currently I have not been able to create artifact paths that actually do anything, i.e. that copy anything anywhere.
For instance, I have a couple of artifact paths, but nothing ever gets put into C:\BuildServer_TestBuilds:
+:Accounts\bin\debug* => C:\BuildServer_TestBuilds
+:BackOffice\bin\debug* => C:\BuildServer_TestBuilds
Am I getting no artifacts (and my artifact paths therefore ignored) because I have no checkout rules?
Any help would be appreciated.
I am pretty sure artifacts and checkout rules are completely independent. Artifacts just deal with what has been built; checkout rules tell TeamCity how to react to, and check out, changes in the VCS.
It looks like your artifact paths begin with absolute paths. I have always found it easier to use relative paths with wildcards; that way I don't need to worry about where TeamCity put the build. We use the following to get all DLLs and EXEs into one folder:
**\bin\Debug\*.*=>deploymentdir
Our build configuration page has an artifacts link and when we open it it will have things like
deploymentdir\common\bin\debug\common.dll
deploymentdir\common\bin\debug\common.pdb
deploymentdir\runner\bin\debug\runner.exe
In one of our other builds we use an MSBuild script to flatten our output before putting it through the artifact process.
We do use checkout rules but we have not had to change our artifact paths to accommodate them.
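As for getting the build number into the output folder: artifact paths can reference build parameters, so something like the line below would publish the binaries under a per-build directory in TeamCity's own artifact storage (the target folder name is my own; %build.number% is the built-in parameter, though it's worth checking whether 6.5 already supports parameter references in artifact paths). Note that artifacts go to the server's artifact repository, not to an arbitrary local path like c:\BuildServer_Output.

**\bin\Debug\*.* => SolutionName/%build.number%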
