I like the way F# requires you to organize files and code in order of dependency, because it discourages mindless coupling.
At the moment I have a flat list of source files (simplified):
Common.fs
Workflow1.fs
Workflow2.fs
Now I want to go one step further and organize the files like this (VS 2017 can't move folders up and down, so you have to edit the .fsproj by hand, but that is a different story - at least it is possible):
Common.fs
Workflow1\Impl1.fs
Workflow2\Impl2.fs
I expected Impl1.fs and Impl2.fs to be fully isolated from each other because their folders are not in a parent/child relationship, but Impl2.fs can easily see the types and functions from Impl1.fs: just open the module and they are available.
Are there any techniques to keep them isolated? It's possible to split the project into three, but I prefer to keep the DLL as the unit of deployment: the workflows are small, and ideally I want to avoid having many tiny DLL files.
It sounds like you have a good understanding of what your options are. You are not missing anything else. Consequently, the answer to your question is, "no."
You can't isolate the folders that way, as far as I know. However, if you want to keep them isolated at compile time but deploy them as a single unit, you can create separate projects and use Fody or ILMerge to combine the assemblies post-compilation.
Related
I have a large application to manage consisting of three or four executables and as many as fifty .dlls. Many of the source code files are shared across many of the projects.
The problem is a familiar one to many of us - if I change some source code I want to be able to identify which of the binaries will change and, therefore, what it is appropriate to retest.
A simple approach would be to compare file sizes. That is an 80% acceptable solution, but there is at least a theoretical possibility of missing something. Secondly, it gives me very little indication of WHAT has changed; it would be ideal to get some form of report on this so I can then filter out irrelevant changes (e.g. dates, versions, copyrights, etc.).
On the plus side:
all my .dcus are in a row - I mean they are all built into a single folder
the build is controlled by a script (.bat)(easy, for example, to emit .obj files if that helps)
svn makes it easy to collect together any (two) revisions for comparison
On the minus side:
There is no policy to include all used units in all projects; some units get included because they are on a search path.
Just knowing that a changed unit is used/compiled by a project is not sufficient proof that the binary is affected.
Before I begin writing some code to solve the problem I would like to ask the panel what suggestions they might have as to how to approach this.
The rules of StackOverflow forbid me from asking for software recommendations, but if anyone has had positive experiences with continuous integration tools that would help - great.
I am open to any suggestion or observation that is relevant in this context.
It seems to me that your question boils down to knowing which units are contained in your various executables. Since you are using search paths, it will be hard for you to work this out ahead of time. The most robust way to find out is to consult the .map file that the compiler emits. This contains a list of all units contained in your executable.
Once you know which units are contained in each executable, you need to know whether or not anything has changed in those units. That information is contained in your revision control system. Put this all together and you have the information that you need.
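If you want to automate that cross-referencing, a rough sketch of the first half might look like the following. It is only a sketch: the "M=" token parsing assumes the segment-detail layout that Delphi map files commonly use, and the exact format varies between compiler versions, so verify it against your own .map output.

program ListUnitsFromMap;
// Rough sketch: list the unit names recorded in a Delphi .map file.
// Assumes the segment-detail lines carry tokens like "M=UnitName";
// check this against the map files your compiler version produces.

{$APPTYPE CONSOLE}

uses
  SysUtils, Classes;

procedure CollectUnits(const MapFileName: string; Units: TStrings);
var
  Lines: TStringList;
  Line, UnitName: string;
  I, P, E: Integer;
begin
  Lines := TStringList.Create;
  try
    Lines.LoadFromFile(MapFileName);
    for I := 0 to Lines.Count - 1 do
    begin
      Line := Lines[I];
      P := Pos(' M=', Line);
      if P > 0 then
      begin
        UnitName := Trim(Copy(Line, P + 3, MaxInt));
        E := Pos(' ', UnitName);
        if E > 0 then
          UnitName := Copy(UnitName, 1, E - 1);
        if Units.IndexOf(UnitName) < 0 then
          Units.Add(UnitName);
      end;
    end;
  finally
    Lines.Free;
  end;
end;

var
  Units: TStringList;
begin
  Units := TStringList.Create;
  try
    CollectUnits(ParamStr(1), Units);  // e.g. ListUnitsFromMap MyServerApp.map
    Writeln(Units.Text);
  finally
    Units.Free;
  end;
end.

Diffing that per-executable unit list against the units your revision control reports as changed between two revisions then gives you the retest candidates.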
Of course, just because the source code for a unit has changed, you might argue that re-testing is not needed. Perhaps the only change made was the version, or the date in a copyright label or some such. But it is asking too much to expect a computer to make such a judgement. At some point you need a human to step up and take responsibility.
What is odd about this though is that you are asking the question at all. It seems to me to be enormously risky to attempt partial testing. I cannot understand why you don't simply retest the entire product.
After using it for more than 10 years for commercial in-house and freelance work on large projects, I can recommend trying Apache Ant. It is a build tool which supports dependencies, and it has many very helpful features.
Apache Ant also integrates nicely with CI tools such as Hudson/Jenkins, Bamboo etc.
Another suggestion - based on experience with Maven - is to design the general software architecture to be as modular as possible. If modules (single or multiple source or DCU files in one directory) carry a version number in the directory name, it is possible to control exactly how applications are composed from these modules.
If you want to program such a tool yourself, the approach would be something like this:
First you need to detect whether any changes were made to individual source files. As you already figured out, comparing file sizes is a bad idea, as the size can stay the same despite lots of changes (as long as a .pas file contains the same amount of text, its size won't change). Instead you could check the last modification time of each file, or compute a hash value such as an MD5 hash for comparison (which can be quite slow).
Then you need to generate a dependency tree which tells you which files are used by which project/subproject.
Finally, based on the changes detected in individual files, you check the dependency tree to see which projects need to be recompiled.
The problem with this approach is that you would probably have to update the dependency tree manually each time a new unit is added to a project or an existing one is removed.
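A minimal sketch of that idea, assuming a hand-maintained mapping from projects to their unit files (exactly the weakness just mentioned) and using file timestamps recorded from a previous run; the project and file names are made up:

program CheckChangedProjects;
// Minimal sketch: flag a project for rebuild/retest when any of the units in
// its (manually maintained) list has a timestamp differing from the one
// recorded in a previous run. Project and file names are hypothetical.

{$APPTYPE CONSOLE}

uses
  SysUtils, Classes;

function AnyUnitChanged(const UnitFiles: array of string;
  PreviousAges: TStrings): Boolean;
var
  I: Integer;
  Recorded: string;
begin
  Result := False;
  for I := Low(UnitFiles) to High(UnitFiles) do
  begin
    // FileAge returns -1 if the file is missing, so new or removed
    // units also count as changes.
    Recorded := PreviousAges.Values[UnitFiles[I]];
    if (Recorded = '') or (StrToIntDef(Recorded, -1) <> FileAge(UnitFiles[I])) then
    begin
      Result := True;
      Exit;
    end;
  end;
end;

var
  PreviousAges: TStringList;
begin
  PreviousAges := TStringList.Create;  // lines like "Common.pas=39512345" from the previous run
  try
    if FileExists('LastRun.txt') then
      PreviousAges.LoadFromFile('LastRun.txt');
    if AnyUnitChanged(['Common.pas', 'MainForm.pas'], PreviousAges) then
      Writeln('Project1 needs a rebuild and retest');
  finally
    PreviousAges.Free;
  end;
end.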
But the best way would be to use version control software instead of reinventing the wheel. I myself like the way Git works, and I believe that a proper integration of Git into the project manager itself could be quite powerful, thanks to Git's support for branching/sub-branching (each project is its own branch, and each version of your software can be its own sub-branch).
The latest version of Delphi does have Git integration, but it is done through the SVN layer, which unfortunately limits some of Git's best functionality. So if you decide to integrate Git support directly into Delphi, I'm first in line to use it.
This is a continuation of the discussion I started here. I would like to find the best way to modularize Delphi source code, as I'm not experienced in this field. I will be grateful for all your suggestions.
Let me post what I have already written there.
The software developed by the company I work for consists of more than 100 modules (most of them being something like drivers for different devices). Most of them share the same code - in most cases classes. The problem is that those classes are not always put into separate, standalone PAS units. I mean that the shared code is often put into units containing code specific to a module. This means that when you fix a bug in a shared class, it is not enough to copy the PAS unit it is defined in into all software modules and recompile them. Unfortunately, you have to copy and paste the fixed pieces of code into the proper unit and class in each module, one by one. This takes a lot of time, and this is what I would like to eliminate in the near future by choosing a correct approach - please help me.
I thought that using BPLs distributed with the EXEs would be a good solution, but it has some downsides, as some of you mentioned during the previous discussion. The worst problem is that if each EXE needs several BPLs, our technical support people will have to know which EXE needs which BPLs and then provide end users with the proper files. As long as we don't have a software updater, this will be a real burden for both our technicians and end users. They will certainly get lost and angry :-/.
Also, compatibility issues may occur - if one BPL is shared by many EXEs, a modification of that BPL can be good for one EXE and bad for some other ones.
What should I do then to make bug fixes quicker in so many projects? I think of one of the following approaches. If you have better ideas, please let me know.
Put shared code into separate and standalone PAS units, so when there is a bug fix in one of them, it is enough to copy it to all projects (overwriting the old files) and recompile all of them. This means that each unit is copied as many times as there are projects that use it.
This solution seems to be OK as far as rarely modified code is concerned. But we also have .pas units with general-purpose functions and procedures, which often undergo modifications. It would be impossible to repeat the same procedure (copying and recompiling so many projects) every time someone adds a new function to such a file.
Create BPLs for all the shared code, but link them into EXEs, so that EXEs are standalone.
For me it seems the best solution now, but there are some cons. If I make a bug fix in a BPL, each programmer will have to update the BPL on their computer. What if they forget to do that? However, I think it is a minor problem. If we take care of informing each other about changes, everything should be fine. What do you think?
And the last idea, suggested by CodeInChaos (I don't know if I understood it properly): sharing PAS files between projects. It probably means that we would have to store the shared code in a separate folder and make all projects search for that code there, right? And whenever it is necessary to modify a project, it would have to be downloaded from SVN together with the shared-files folder, I guess. Each change in the shared code would then require recompilation of each project that uses that code.
Please help me choose a good solution. I just don't want the company to lose much more time and money than necessary on bugfixes, just because of a stupid approach to software development. So far nobody has cared about it and you can imagine how many problems it causes.
Thank you very much.
You say:
Create BPLs for all the shared code, but link them into EXEs, so that EXEs are standalone.
You can't link BPLs into an executable. You are simply linking in the separate units that are also in the BPL. That way you don't actually use or even need the BPL at all.
BPLs are meant to be used as shared code, i.e. you put the code that is shared into one or several BPLs and use that from each of the .exes, .dlls or other .bpls. Bugfixes (if they don't change the public interface of the BPL) merely require the redistribution of that one fixed BPL.
As I said, decide on the public interface of a DLL and then don't change it. You can add routines, types and classes, but you should not modify the public interfaces of any existing classes, types, interfaces, constants, global variables, etc. that are already in use. That way, a fixed version of the BPL can easily be distributed.
But note that BPLs are highly compiler version dependent. If you use a new version of the compiler, you will have to recompile the BPL too. That is why it makes sense to give BPLs suffixes like 100, 110, etc., depending on the compiler version. An executable compiled with compiler version 15.0 will then be told to use the BPL with suffix 150, and an executable compiled with version 14.0 will use the BPL with suffix 140. That way, different versions of the BPLs can peacefully co-exist. The suffix can be set in the project options.
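For illustration, a runtime package source that sets such a suffix directly in the .dpk might look roughly like this (OurSharedLib and SharedClasses are made-up names; the suffix can just as well be set in the project options, as described above):

package OurSharedLib;

{$LIBSUFFIX '150'}  // produces OurSharedLib150.bpl when built with compiler version 15.0
{$RUNONLY}          // runtime-only package, not installed into the IDE

requires
  rtl;

contains
  SharedClasses in '..\Common\SharedClasses.pas';

end.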
How do you manage different versions? Make a directory with a structure like I have for my ComponentInstaller BPL (this is the expert you can see in the Delphi/C++Builder/RAD Studio XE IDE under menu Components -> Install Component):
Projects
  ComponentInstaller
    Common
    D2007
    D2009
    D2010
    DXE
The Common directory contains the .pas files and resources (bitmaps, etc.) shared by each version, and each of the Dxxxx directories contains the .dpk, .dproj, etc. for that particular version of the BPL. Each of the packages uses the files in the Common directory. This can of course be done for several BPLs at once.
A versioning system might make this a lot easier, BTW. Just be sure to give each version of the BPL a different suffix.
If you actually want standalone executables, you don't use BPLs and simply link in the separate units. The "Build with runtime packages" project option governs this.
From my point of view, in trying to manage artifacts like Delphi units, libraries and executable files, you are looking in the wrong place. I suggest you turn things around and start with refactoring the code, based on design patterns.
E.g. all common functions can be placed into one Singleton class, instances of common classes can be constructed with an Abstract Factory, classes can interact through Delphi's native support for interfaces instead of using each other directly, and so on. You could even implement a Facade for all the common parts of the projects.
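As a small, hypothetical illustration of the interface-plus-factory idea (all names invented for this example), a shared unit would expose only an interface and a factory function, so the modules never depend on the concrete class:

unit SharedLogging;
// Hypothetical example: modules program against ILogger and obtain an
// instance from the factory, never from the concrete class directly.

interface

type
  ILogger = interface
    ['{8A3C2D14-6C7B-4E9F-9B1A-2F4D5E6A7B8C}']
    procedure Log(const Msg: string);
  end;

function CreateLogger: ILogger;  // abstract-factory style entry point

implementation

uses
  SysUtils;

type
  TConsoleLogger = class(TInterfacedObject, ILogger)
  public
    procedure Log(const Msg: string);
  end;

procedure TConsoleLogger.Log(const Msg: string);
begin
  // Sketch only: a real implementation would write to a file, event log, etc.
  Writeln(FormatDateTime('hh:nn:ss', Now), ' ', Msg);
end;

function CreateLogger: ILogger;
begin
  Result := TConsoleLogger.Create;
end;

end.

Fixing a bug or swapping the implementation then touches only this one unit, while every module keeps compiling against the unchanged ILogger interface.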
Of course, the concrete choice of patterns and the implementation details depend on the specifics of the project, and only you can decide what is applicable in your case.
I suppose that after looking at the project in this vein you will find more natural ways of organizing the code and a solution for your problems.
Some other things:
Of course, you should follow @CodeInChaos's suggestion and share one copy of each source file between all projects instead of copying it into each project manually. It may be useful to adopt a standard for the build environment that is mandatory for all developers (same folder structure, location of libraries, environment settings).
Try to analyze the build and deployment process: to me it looks abnormal when a solution is not built with the latest version of the code and not tested before deployment (this refers to your "If I make a bug fix in a BPL, each programmer ..." remark).
The variant with standalone executable files looks better, because it significantly simplifies the organization of the testing environment and project deployment. Just choose adequate conventions for versioning.
This question is similar to this one, but not a duplicate because I'm asking about issues not discussed in that question.
I have a client-server project in Delphi 7 with the following directory structure:
\MyApp
\MyClientApp
\MyServerApp
\lib
There are 2 actual Delphi projects (.dpr), one each in the MyClientApp and MyServerApp folders.
The lib folder has .pas units containing code common to the client and server apps. What I'm wondering is whether I should include those .pas files in the client and server projects? Or should I create a package in the lib folder which includes those units? Or should I just leave the .pas files sitting in the lib folder and not add them to any app/package?
What are the pros/cons of each approach? Which way is "best"? Is there any issue with having those units from the lib folder be included in more than one project?
Right now the units in the lib folder are not a part of any app/package. One disadvantage of this is that when I have my client app open in Delphi, for example, and I want to search in all files in the project for something, it doesn't also search in the units in the lib folder. I get around this by opening those units and doing a find in all open files, or using grep search (but I'd prefer a better solution).
I would also greatly prefer a solution where I will not have to go and open some separate package and recompile it when I make changes to those files in the lib folder (is this where I should use a project group?).
Sharing units between applications always carries the risk of incompatible changes done in one application that break the other. On the other hand, making copies of these units is even worse, so your approach of moving them to their own subdirectory at least adds a psychological barrier to changing them without considering other programs.
As for adding them to the project files: I usually add the units I frequently access (either for expanding or for reference) from the IDE to the project, and leave the others for the compiler to pick up via the search path. I do that on a per-project basis; that means some units may be part of several projects - why not?
Putting them into a package only makes sense if you actually want to create a package-based application; otherwise, why bother?
For more info on how I organize my projects and libraries, see http://www.dummzeuch.de/delphi/subversion/english.html
I dislike having files shared by projects. All too often, you'll be tempted to edit one of the shared files, and you'll either break something in the other project, or you'll forget that you have to rebuild the other project at all.
When the shared files are instead separated into their own library (package), then there's a little extra barrier to editing them. I consider that a good thing. It will be a light reminder that you're switching from project-specific code to shared code. You can use project groups to keep everything together in a single IDE instance. Arrange the library projects ahead of the executable projects; the "Build All" command will then build everything in order, starting with the first project.
Keep your DCU files separate from your PAS files. You can do this easily by setting the "DCU output directory" project option to send your package's units to some other location. Then put that destination directory on your other projects' "search path." They'll find the DCU, but they won't find the PAS file, and so no other project will accidentally recompile a unit that isn't really a member.
Having a separate package also discourages use of project-specific conditional defines. Those cause all sorts of trouble when you're sharing units between projects. Find a way to instead keep all project-specific options within the respective projects. A shared library shouldn't require project-specific modifications. If a library needs to act differently based on who's using it, then employ techniques like callback functions that the library user can set to modify the library's behavior.
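A minimal sketch of that callback technique (the unit and all names here are invented): the shared unit exposes a procedure variable with a sensible default, and each project assigns its own handler at startup instead of the library being compiled with project-specific defines:

unit SharedReports;
// Sketch of replacing project-specific $IFDEFs with a callback that the
// consuming project assigns. All names are hypothetical.

interface

type
  TGetOutputDirFunc = function: string;

var
  // Each executable may assign its own implementation at startup;
  // the shared code itself never needs project-specific defines.
  GetOutputDir: TGetOutputDirFunc;

procedure SaveReport(const FileName, Contents: string);

implementation

uses
  SysUtils, Classes;

function DefaultGetOutputDir: string;
begin
  Result := ExtractFilePath(ParamStr(0));  // fallback: the exe's own folder
end;

procedure SaveReport(const FileName, Contents: string);
var
  Lines: TStringList;
begin
  Lines := TStringList.Create;
  try
    Lines.Text := Contents;
    Lines.SaveToFile(IncludeTrailingPathDelimiter(GetOutputDir()) + FileName);
  finally
    Lines.Free;
  end;
end;

initialization
  GetOutputDir := DefaultGetOutputDir;

end.

A project that needs different behaviour just assigns SharedReports.GetOutputDir in its own startup code, and the shared DCU stays identical everywhere.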
I would need to have a very good reason to add shared code to a package. If you just have a few shared files, stick them all in a directory called Shared. This should make it obvious that the files are shared between projects.
Also use a good build tool to do automated builds so you will find out soon enough if you break something.
.bpl files are fine for components, but bring in serious added complexity for things like this, unless you have a huge amount of shared files.
I usually create a package with all the shared units, and just use the units.
If you do not explicitly enable "Build with runtime packages", the package content (all used DCUs) will be linked into your project like any other unit.
I would only use runtime packages if you actually had two binaries that were supposed to run on the same physical machine and that shared some code. Keep in mind that runtime packages are mostly an all-or-nothing approach. Once you decide to use them you will also no longer be able to link the RTL and VCL units straight into your projects and will instead have to deploy those separately as well.
However, packages might still be a good solution to your problem when combined with project groups which is exactly what I'm doing. I hate having shared units included in multiple projects. Including the shared units in a package (but not compiling your actual projects with runtime packages) allows you to add that package to your project group so you (and the IDE!) will always have them easily accessible yet nicely separated from the project-specific code. Strictly speaking you don't even ever have to compile those packages. They can merely serve as an organisational unit in the project manager.
Note that for Find in Files, you can also specify "in all files in project group".
We create via "Tools | Options | Environment Variables" Variables like that:
$(Sources) = D:\Sources\Delphi
$(OurLib) = $(Sources)\OurLib\Src
$(OurApp1) = $(Sources)\Applications\App1\3.x
$(ThirdParty) = $(Sources)\ThirdPartyComponents
We use these variables in the project search path like this:
$(OurApp1)\Src\Core;$(OurApp1)\Src\GUI;$(OurApp1)\Src\Plugins;$(ThirdParty)\JVCL
But this has been broken since Delphi 2009 (meanwhile fixed), as these variables are no longer evaluated completely (see QC #73276), so the files in those directories are not found by the compiler. A workaround: use only complete directory paths in the environment variables.
We use this approach because on all developer machines and the build servers the files can be found and we only have to point $(Sources) to the right place.
We don't have anything in our global library path (except the Delphi defaults), because that wouldn't be in the version control and isn't reflected on other developers or build machines.
One problem is: if one unit in $(OurLib) starts using another new unit, perhaps in a new path, all projects break because they can't find this new unit. Then we have to go through all projects and add the search path. (BTW: I really hate the search path editor... wouldn't a simple memo field be much better to edit than this replace/add/delete logic?)
Another thing we do is not add many units to our projects - especially not everything from $(OurLib) - but we often have units like plugins which add functionality merely by being included. For different editions of our products we want to include different units. As Delphi always messes up $IFDEFs in the uses clause of the .dpr, we help ourselves by including units named something like "IncludePlugins", which then pull in the actual units depending on IFDEFs.
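Such an "IncludePlugins" unit might look roughly like this (the edition define and the plugin unit names are invented for the example):

unit IncludePlugins;
// Hypothetical example: the .dpr only uses IncludePlugins, and the set of
// plugin units is selected here with conditional defines instead of
// cluttering the .dpr's uses clause with $IFDEFs.

interface

implementation

uses
  PluginCore
  {$IFDEF PRO_EDITION}
  , PluginReporting
  , PluginExport
  {$ENDIF}
  ;

end.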
But not including units in the project makes navigation a pain. The units don't appear in the project, they are not found by Ctrl+F12 (Show Units), they are not shown in code completion, etc.
Does anybody have a better way to cope with these problems?
We use only relative paths; any libraries are always below the libs subdirectory, while the project source code resides in the src subdir. So our search paths always look like:
..\libs\library1;..\libs\library2\common;
etc.
All libraries are added as svn:external to each project, so checking out the project will automatically check out the libraries as well and the search path will always point to the correct version of the library for that project.
Not perfect, but it works most of the time.
I have to agree about the search path editor; it is even worse for relative paths, because you must not use the "..." buttons, otherwise Delphi will insert an absolute path.
We use standard drive mappings.
Our current project is always on W:, regardless of whether it is a network drive or a substituted (SUBST) drive.
This works great.
When you need to work on a different project, swap the W: and you can continue.
You can copy the search path out to an editor, modify it and then copy it back.
Your search path is much too big. It should contain only the things you want Delphi to recompile with your project. You don't really want to recompile the Jedi VCL every day, do you?
I create a single directory where all compiled units go. Say, C:\dcu. Specify that as the "unit output directory" in all packages. My "search path," then, is always just this:
$(Delphi)\Lib;C:\dcu
The compiler finds everything it needs, and it never finds any source code. The only source code it ever sees is in the files that directly belong to whatever project I'm compiling. The project's own source directories don't need to be on the search path because all of those files are already direct members of the project. The compiler knows exactly where they are.
For me, all a project's source files go in a single directory. If you want separate directories for different parts, like Core and GUI, then I would put those in separate packages so I could work on them and compile them separately. Even if the final program doesn't use the resultant BPLs, packages are still a good way of segmenting your project and defining dependencies.
When compiling units for one project doesn't automatically compile units for all the other projects, you're forced to change active projects. It takes a moment of your time, but it also serves as a mental reminder that you're "changing hats," too.
Although you're producing just one product, that doesn't mean you should have just one project in Delphi. You should have at least one project for each executable module (EXE, DLL, BPL) in your product. Use project groups to manage multiple projects in a single IDE session. No unit should be a member of more than one project.
I don't understand your part about plug-ins and different editions of your project. When you say "plug-in," I assume you're talking about separate executable modules, like DLLs or packages, that the customer can choose to include or not. Couldn't you turn your different editions' features into plug-in modules that you simply don't include in the lesser editions? Then you don't have to worry about conditional compilation of your project; just have several different installer packagers that grab different sets of plug-ins.
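If the plug-ins are runtime packages, the host executable could, for example, load whatever the installer shipped, so each edition simply contains a different set of .bpl files. A rough sketch (the unit and folder names are invented; this assumes the exe is built with runtime packages and each plugin registers its features from its initialization sections):

unit PluginLoader;
// Rough sketch (hypothetical): load every plugin package found in a Plugins
// subfolder next to the exe, so lesser editions just ship fewer .bpl files.

interface

procedure LoadAllPlugins;

implementation

uses
  SysUtils;

procedure LoadAllPlugins;
var
  SR: TSearchRec;
  PluginDir: string;
begin
  PluginDir := ExtractFilePath(ParamStr(0)) + 'Plugins\';
  if FindFirst(PluginDir + '*.bpl', faAnyFile, SR) = 0 then
  try
    repeat
      // LoadPackage runs the package's initialization sections, where each
      // plugin can register itself with the host application.
      LoadPackage(PluginDir + SR.Name);
    until FindNext(SR) <> 0;
  finally
    FindClose(SR);
  end;
end;

end.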
I have always found it odd that this has never been addressed adequately. I suggested recently to David I that Delphi should allow the user to set up some sort of preferred development structure, and that third-party library publishers could be made aware of this so that they could automatically adjust their installers to install correctly into the preferred development framework. If the preferred development structure was stored in an XML file or similar, it could then be copied from one computer to another on a development team.
As an alternative, it could make an interesting project to create a Delphi application that would allow a user to "refactor" their library installation in a high level way. You specify which folders on your system contain source or compiled components or whatever and where you want to keep source files or compiled units, hit Go and your system gets rearranged for you, while updating your Delphi environment so that when you start Delphi, it finds everything it should.
I've just recently discovered a way to have project-specific environment variables in Delphi builds using XE6. It's not quite as good as a full-blown #define like in C, but at least I can now have consistent search paths across multiple projects and create some shared option sets.
What I've done is set up environment variables in the same manner as the original poster and then override them in the .dproj or option set.
The BuildPaths.optset added to the project looks like this:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
        <SVN_Root>..\..\..</SVN_Root>
        <SVN_Riemann>$(SVN_Root)\Riemann</SVN_Riemann>
        <SVN_Library>$(SVN_Root)\Library</SVN_Library>
        <SVN_ThirdParty>$(SVN_Library)\Third Party</SVN_ThirdParty>
    </PropertyGroup>
    <ProjectExtensions>
        <Borland.Personality>Delphi.Personality.12</Borland.Personality>
        <Borland.ProjectType>OptionSet</Borland.ProjectType>
        <BorlandProject>
            <Delphi.Personality/>
        </BorlandProject>
        <ProjectFileVersion>12</ProjectFileVersion>
    </ProjectExtensions>
</Project>
Maybe this applies to other Delphi versions too (I've only used 7). We've got our code broken up so that nearly every DLL in our fairly massive app is in a different folder.
99% of the open source stuff I've downloaded to plug into Delphi has had all its source munged into one folder.
It seems like this was an assumption that the developers of Delphi made about the coding practices of its users that may be non-obvious.
I don't think so. In fact, in more recent versions they've added features to the project manager to make it easier to deal with the fact that code is spread across different directories (such as the flatten-directories option), so I think it is accepted that this is how many people organize their code.
I suspect it's more to do with projects growing organically over time, and whether anyone takes the time to tidy up.
I for one definitely do not put all the sources into one directory, but rather keep them in groups that have something in common; e.g. I use Subversion externals quite extensively
(see http://www.dummzeuch.de/delphi/subversion/english.html , the section about externals).
I prefer different modules to be hosted in different folders, and then have a common folder for units that are shared among different modules; it makes management easy. E.g.:
myClientServerApp (parent)
  Client folder (child)
  Server folder (child)
  lib (child)
Back in Delphi 7 I also had all files in one folder. It was easy for small projects, but very hard for medium to big ones.
So I began to create a folder structure for all Delphi projects, small or big.
Over the years I have been trying to improve this folder structure, and with every new project I make a small improvement so that it is simpler, more logical, and better organized.
These days I am trying to make some parts of it sharable across several projects. It's a work in progress.
It would seem that having all the units in one folder would save you headaches with duplicate unit names. On the other hand, it might be handier to keep your projects in different folders when checking in and out of version control. Then again, it really doesn't promote code reuse to have them separated out like that.