Build a .NET class library into multiple files

I have a class library project that I don't want to split into more than one project; however, I would like to compile it into multiple DLL files instead of one big file. How could I do this?

This is called a multi-module assembly. Visual Studio cannot produce one directly, so it involves writing a custom makefile and performing the build with nmake (or driving the command-line tools yourself some other way).
The Multi-Module Assemblies article on eTutorials gives clear, straightforward instructions on how to accomplish this.
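As a rough sketch of what such a build does (assuming two C# source files and the .NET Framework SDK tools csc and al on the PATH; the file names are placeholders), the compiler emits .netmodule files and the assembly linker then ties them into one multi-file assembly:

    csc /target:module /out:CoreTypes.netmodule CoreTypes.cs
    csc /target:module /addmodule:CoreTypes.netmodule /out:Helpers.netmodule Helpers.cs
    al /target:library /out:MyLibrary.dll CoreTypes.netmodule Helpers.netmodule

Each .netmodule remains a separate file on disk; MyLibrary.dll holds only the assembly manifest that ties them together, which is what makes it a single assembly spread over multiple files.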

Write your own compiler? I don't think this is possible...
You could have a post-build step to copy and rename your dll so you would have two different dlls, but then they'd just be the same...
How would you specify which files get built into which dll?
That is kind of the whole point of having separate projects.
You can have the projects reference each other so they can use the code from the other dlls.

Related

Using large non-bazel dependencies in a bazel project

I would like to use a very large non-Bazel system in a Bazel project. Specifically, ROS2. This dependency provides a large number of Python, C, and C++ libraries which are built using its own hand-rolled build system. Obviously, I would like to avoid having to translate the entire build system over to Bazel.
Broadly, what's the best way of doing this? My instinct was to use a custom repository rule to download the source (since it's split across many repositories), then use a genrule to call the ROS2 build system, and then write simple cc_import and py_library rules for each of the individual components that I need.
However, I'm having trouble with the bit where I need to call the foreign build system. It seems that genrules require a list of output files to be specified, while I would like to make an entire build directory available.
Before I spend any more time on this, I thought I'd ask whether I'm on the right lines, since I'm new to Bazel. Is this a good strategy? How would you approach this problem? Are there any other projects that mainly use Bazel but call other build systems in this way that I could look at?
More recently, you can use rules_foreign_cc to build native CMake or configure/make style projects.
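For example, a BUILD file along these lines wraps an external CMake project so the rest of the workspace can depend on it like any cc_library (a minimal sketch: the target name, the @ros2_source repository, and the output library name are placeholders, and the exact load path depends on the rules_foreign_cc version):

    load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")

    # Runs the project's own CMake build inside the Bazel sandbox and
    # exposes the resulting static library as an ordinary cc_* dependency.
    cmake(
        name = "rcutils",
        lib_source = "@ros2_source//:all_srcs",
        out_static_libs = ["librcutils.a"],
        visibility = ["//visibility:public"],
    )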

TFS Build custom activity requiring more assemblies than needed

I've just written the first version of a workflow activity that will run Resharper's Code Issues on the projects and parse the output to display the issues as build warnings and errors.
At first, I was going to just call ReSharper's command line and parse the resulting XML manually. After fiddling with the DLLs in ReSharper's SDK (through disassembly mostly), I found a way to parse the results using its own public classes, which I figured was a much more elegant and safe way to do this.
The first problem I have is that the NuGet package is absolutely huge. There are 140 MB of files in there, which to me is absurd for a single, unpartitioned package. There seems to be such heavy coupling between them that by using just a few model classes and the parser class, I have to drag a dozen or so of those DLLs along, some of which seemingly have nothing to do with the main DLLs I need. This is not a show stopper though; I'm struggling with something else now:
In the end, I managed to track down the dependencies I needed to 41 assemblies (which is, again, insane, but alas). Initially, I tried removing everything and adding the missing references one by one, but this turned out to be unreliable, still missing some indirect references, even after compiling successfully. Then, I decided to code a small console application to find all referenced assemblies in the main Resharper assemblies I used, which gave me the 41 references I mentioned. This is the code I used to find every dependency.
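A minimal sketch of such a dependency walker (illustrative, not the original code) takes a root assembly path as an argument and walks GetReferencedAssemblies() transitively:

    using System;
    using System.Collections.Generic;
    using System.Reflection;

    class DependencyWalker
    {
        // Collects the transitive closure of referenced assembly names,
        // starting from one root assembly given as a file path.
        static void Main(string[] args)
        {
            var seen = new HashSet<string>();
            var queue = new Queue<Assembly>();
            queue.Enqueue(Assembly.LoadFrom(args[0]));

            while (queue.Count > 0)
            {
                foreach (AssemblyName reference in queue.Dequeue().GetReferencedAssemblies())
                {
                    if (seen.Add(reference.FullName))
                    {
                        // Loading may fail for framework or native references;
                        // those are still reported but not walked further.
                        try { queue.Enqueue(Assembly.Load(reference)); }
                        catch (Exception) { }
                    }
                }
            }

            foreach (string name in seen)
                Console.WriteLine(name);
        }
    }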
Since these are custom activities we are talking about, I decided to create a unit test project to validate them. Using these 41 references only, everything works correctly.
When I added the activity to the build workflow, though, and pointed the build controller to the source control folder containing the required assemblies, every time I schedule a build the process fails, stating that it needs one extra DLL from ReSharper's SDK. For example, this is the first one it asks for:
Could not load file or assembly 'AsyncBridge.Net35, PublicKeyToken=b3b1c0202c0d6a87' or one of its dependencies. The system cannot find the file specified. (type FileNotFoundException)
When I add this specific assembly to the TFS folder, I get another similar error for another dll, and this keeps going on and on.
What I wanted to know is how I can know exactly which assemblies a workflow XAML will need in order to run correctly. My custom activity DLL has two specific CodeActivities and a XAML-only activity that uses these two. This XAML activity is what I'm directly using in the modified workflow template.
I see that besides the references in my project, the XAML activity also contains a TextExpression.ReferencesForImplementation section, with some assembly names. I've run my dependency finder program on those dependencies too, and the results are the same 41 assemblies already in the TFS folder.
Meanwhile I'll go with having the whole SDK in the custom assemblies folder, but I would really like to avoid this in the future, since it contains such an enormous amount of large, unneeded DLLs.
First, we had requests for our command line tool to support a workflow activity, and we decided to implement a plain MSBuild task instead, which is universal and works in TFS too. The task and targets files are included in ReSharper CLT 8.2.
Second, if you still want to implement a workflow activity, it's pretty easy to do with the new API in the CLT, designed specifically for custom processing of found issues - http://confluence.jetbrains.com/display/NETCOM/Custom+InspectCode+Issue+Logger.
And last, but not least, you do not need to put the binaries of the ReSharper SDK package in VCS.
Use NuGet's restore package functionality.
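For example (assuming nuget.exe is available on the build agent and the solution references the SDK package in the usual way):

    nuget.exe restore YourSolution.sln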
If you have any other questions I'll be glad to answer them.
A custom activity is loaded and run by the .NET CLR like any other .NET program. If the stack trace reports a missing file, then it's required by the CLR, and you can't change this fact without refactoring your code.
Having the references of an entire SDK in the custom assemblies folder doesn't make sense. I would prefer GAC deployment over a huge binaries folder in source control. Or maybe consider having these activities run as pre/post-build scripts in MSBuild or PowerShell.
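For instance, deploying a strongly named assembly to the GAC can be scripted with the Windows SDK's gacutil tool (the file name here is a placeholder):

    gacutil /i MyCustomActivities.dll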

What exactly is "building" from source and how does it work

So I really can't understand how this works, but let me explain. First, just in case you need it, I am running Ubuntu 12.04 64-bit on a laptop.
As a build tool I am using CMake. I want to load into my project OpenCV, MRPT (http://www.mrpt.org/) and libfreenect. All of them have "source code". What I don't understand is what they mean by "build from source". How do I make a project with all of them?
Do I need to build each one individually and somehow put them in my project, OR do I download the source code and build them all together at once? As you can see, I'm really confused about what I have to do... do I run the CMakeLists.txt from each source code and then run one CMakeLists.txt that includes all the other CMakeLists.txt files?
In fewer words, if I want to build two or more libraries from source, how do I do that?
I would like a general answer (how this "build from source" works) and an answer specifically on the ones I mentioned (CMake, OpenCV, MRPT, libfreenect). I hope I made clear what I don't really understand.
It depends on the 'master' project. In general, in the C/C++ universe, your project must know how to invoke the build process of each subproject/library, OR your project needs to know how to include and link the results after building each external project yourself.
You can also mix the two approaches if needed, but I think it is cleaner to stick to one if possible.
In the first case, if all the subprojects offer CMake build files (CMakeLists.txt), you may try to add_subdirectory() each of them and see if there are any conflicts. For example, Google Test can easily be included this way, and it gives your project some global variables that ease linking later.
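A minimal sketch of that approach, assuming the Google Test sources have been cloned or copied into external/googletest:

    cmake_minimum_required(VERSION 3.10)
    project(MyProject)

    # Reuse googletest's own CMakeLists.txt as part of this build;
    # this defines the gtest and gtest_main targets for us.
    add_subdirectory(external/googletest)

    add_executable(my_tests test/main.cpp)
    target_link_libraries(my_tests gtest gtest_main)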
Alternatively, if the above approach causes problems (for example, targets conflicting with your own project) or a subproject doesn't provide a CMakeLists.txt, you can use ExternalProject_Add(). It takes more work and you have to handle include paths and linking in your project manually, but it keeps the subproject more independent.
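A rough ExternalProject_Add() sketch for one of the libraries from the question (the repository URL and install prefix are illustrative):

    include(ExternalProject)

    # Downloads, configures, builds and installs libfreenect as an
    # isolated external project at build time.
    ExternalProject_Add(libfreenect_ext
        GIT_REPOSITORY https://github.com/OpenKinect/libfreenect.git
        PREFIX ${CMAKE_BINARY_DIR}/external/libfreenect
        CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/external/libfreenect/install
    )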
The last approach involves building and installing the subprojects separately, then using configuration variables in your project to point at the subprojects' include and library paths. Check CMake: How To Find Libraries for details.
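For a library that has already been built and installed, OpenCV for instance ships CMake configuration files, so using it reduces to something like this (a sketch, assuming a system-wide OpenCV install):

    find_package(OpenCV REQUIRED)

    add_executable(my_app src/main.cpp)
    target_include_directories(my_app PRIVATE ${OpenCV_INCLUDE_DIRS})
    target_link_libraries(my_app ${OpenCV_LIBS})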

In Delphi, should I add shared units to my projects, to a shared package, or neither?

This question is similar to this one, but not a duplicate because I'm asking about issues not discussed in that question.
I have a client-server project in Delphi 7 with the following directory structure:
\MyApp
\MyClientApp
\MyServerApp
\lib
There are 2 actual Delphi projects (.dpr), one each in the MyClientApp and MyServerApp folders.
The lib folder has .pas units that have common code to the client and server apps. What I'm wondering is if I should include those .pas files in the client and server projects? Or should I create a package in the lib folder which includes those units? Or should I just leave the .pas files sitting in the lib folder and not add them to any app/package?
What are the pros/cons of each approach? Which way is "best"? Is there any issue with having those units from the lib folder be included in more than one project?
Right now the units in the lib folder are not a part of any app/package. One disadvantage of this is that when I have my client app open in Delphi, for example, and I want to search in all files in the project for something, it doesn't also search in the units in the lib folder. I get around this by opening those units and doing a find in all open files, or using grep search (but I'd prefer a better solution).
I would also greatly prefer a solution where I will not have to go and open some separate package and recompile it when I make changes to those files in the lib folder (is this where I should use a project group?).
Sharing units between applications always carries the risk that an incompatible change made for one application breaks the other. On the other hand, making copies of these units is even worse, so your approach of moving them to their own subdirectory at least adds a psychological barrier to changing them without considering the other programs.
As for adding them to the project files: I usually add some units which I frequently access (either for expanding or for reference) from the IDE to the project, and leave the others out for the compiler to pick up via the search path. I do that on a per-project basis; that means some units may be part of several projects, and why not?
Putting them into a package only makes sense if you actually want to create a package-based application; otherwise, why bother?
For more info on how I organize my projects and libraries, see http://www.dummzeuch.de/delphi/subversion/english.html
I dislike having files shared by projects. All too often, you'll be tempted to edit one of the shared files, and you'll either break something in the other project, or you'll forget that you have to rebuild the other project at all.
When the shared files are instead separated into their own library (package), then there's a little extra barrier to editing them. I consider that a good thing. It's a light reminder that you're switching from project-specific code to shared code. You can use project groups to keep everything together in a single IDE instance. Arrange the library projects ahead of the executable projects; the "build all" command will then build everything in order, starting with the first project.
Keep your DCU files separate from your PAS files. You can do this easily by setting the "DCU output directory" project option to send your package's units to some other location. Then put that destination directory on your other projects' "search path." They'll find the DCU, but they won't find the PAS file, and so no other project will accidentally recompile a unit that isn't really a member.
Having a separate package also discourages use of project-specific conditional defines. Those cause all sorts of trouble when you're sharing units between projects. Find a way to instead keep all project-specific options within the respective projects. A shared library shouldn't require project-specific modifications. If a library needs to act differently based on who's using it, then employ techniques like callback functions that the library user can set to modify the library's behavior.
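A sketch of that callback pattern in a shared unit (the names are illustrative):

    unit SharedLogging;

    interface

    type
      TLogProc = procedure(const Msg: string);

    var
      { Each host application assigns its own handler at startup;
        the shared unit itself stays free of project-specific defines. }
      OnLog: TLogProc = nil;

    procedure Log(const Msg: string);

    implementation

    procedure Log(const Msg: string);
    begin
      if Assigned(OnLog) then
        OnLog(Msg);
    end;

    end.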
I would need to have a very good reason to add shared code to a package. If you just have a few shared files, stick them all in a directory called Shared. This should make it obvious that the files are shared between projects.
Also use a good build tool to do automated builds so you will find out soon enough if you break something.
.bpl files are fine for components, but they bring in serious added complexity for things like this, unless you have a huge number of shared files.
I usually create a package with all the shared units, and just use the units.
If you do not explicitly mark "Build with runtime packages", the package contents (all the used DCUs) will be linked into your project like any other unit.
I would only use runtime packages if you actually had two binaries that were supposed to run on the same physical machine and that shared some code. Keep in mind that runtime packages are mostly an all-or-nothing approach. Once you decide to use them you will also no longer be able to link the RTL and VCL units straight into your projects and will instead have to deploy those separately as well.
However, packages might still be a good solution to your problem when combined with project groups which is exactly what I'm doing. I hate having shared units included in multiple projects. Including the shared units in a package (but not compiling your actual projects with runtime packages) allows you to add that package to your project group so you (and the IDE!) will always have them easily accessible yet nicely separated from the project-specific code. Strictly speaking you don't even ever have to compile those packages. They can merely serve as an organisational unit in the project manager.
Note that for Find in Files, you can also specify "in all files in project group".

Managing/Using libraries with Debug builds vs Release builds

I'm curious about everyone's practices when it comes to using or distributing libraries for an application that you write.
First of all, when developing your application do you link the debug or release version of the libraries? (For when you run your application in debug mode)
Then when you run your app in release mode just before deploying, which build of the libraries do you use?
How do you perform the switch between the debug and release versions of the libraries? Do you do it manually, do you use macros, or something else?
I would first determine what requirements are needed from the library:
Debug/Release
Unicode support
And so on..
With that determined, you can then create configurations for each combination required by yourself or other library users.
When compiling and linking, it is very important that you keep the libraries and the executable consistent with respect to the configurations used, i.e. don't mix release and debug when linking.
I know that on the Windows/VS platform this can cause subtle memory issues if debug and release libs are mixed within an executable.
As Brian has mentioned, in Visual Studio it's best to use the Configuration Manager to set up how you want each required configuration to be built.
For example our projects require the following configurations to be available depending on the executable being built.
Debug+Unicode
Debug+ASCII
Release+Unicode
Release+ASCII
The users of this particular project use the Configuration Manager to match their executable requirements with the project's available configurations.
Regarding the use of macros, they are used extensively to implement compile-time decisions, such as whether the debug or release version of a function is to be linked. If you're using VS, you can view the preprocessor definitions attribute to see how the various macros are defined, e.g. _DEBUG and _RELEASE; this is how the configuration controls what is compiled.
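A trivial sketch of such a compile-time switch (_DEBUG is predefined by VS Debug configurations; _RELEASE stands for whatever your release configurations define):

    #include <iostream>

    void Frobnicate()
    {
    #ifdef _DEBUG
        // Only compiled into Debug builds; stripped from Release.
        std::cout << "Frobnicate: debug diagnostics enabled\n";
    #endif
        // ... the actual work, common to both configurations ...
    }

    int main()
    {
        Frobnicate();
    }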
What platform are you using to compile/link your projects?
EDIT: Expanding on your updated comment..
If the Configuration Manager option is not available to you, then I recommend using the following project properties:
Linker->Additional Library Directories or Linker->Input
Use the macro $(ConfigurationName) to link with the appropriate library configuration e.g. Debug/Release.
    $(ProjectDir)\..\third-party-prj\$(ConfigurationName)\third-party.lib
Build Events or Custom Build Step configuration property
Execute a copy of the required library file(s) from the dependent project before (or after) the build occurs.
    xcopy $(ProjectDir)\..\third-party-prj\$(ConfigurationName)\third-party.dll $(IntDir)
The macro $(ProjectDir) will be substituted for the current project's location and causes the operation to occur relative to the current project.
The macro $(ConfigurationName) will be substituted for the currently selected configuration (default is Debug or Release) which allows the correct items to be copied depending on what configuration is being built currently.
If you use a regular naming convention for your project configurations it will help, as you can use the $(ConfigurationName) macro, otherwise you can simply use a fixed string.
I use VS. The way that I do it is that I add the libraries I need through the references of the project, which basically just says in what folder to look for a specific library at project load time. I develop my libraries to be as project-independent and reusable as possible, therefore they are all projects of their own. So for the libraries that I need for a specific project, I create a "3rdParty" or "libs" folder at the same level as my "src" folder in my SVN folder tree. I tend to only use release libraries, but when I hit some unknown issue and want to switch to debug, I manually copy a debug version of the files into the "lib" folder and reload the project.
I am unsure whether I should be keeping both debug and release versions in my SVN tree. Although, since they are projects of their own, keeping them in the SVN tree of another project doesn't feel right. They can be rebuilt without a hitch at any moment.
And then I wanted to find a way of making the switch more... hmmm... well, basically automatic if you will, but that's not what I really mean. It just feels that switching the files manually between release and debug isn't right. Maybe I haven't found it yet, but what I would like is an option that would do something like:
For library "stack.dll" look in "......\3rdParty\" for release and "......\3rdPartyD\" for debug.
Anything that does something like that, really. What do you suggest?
Remember that libraries are external projects; their built files live somewhere else entirely. In fact, think of it as having to check out another project, build it, and copy the built library if you want another copy. How would you set that up?
