First of all, please forgive me for not knowing the proper terminology, I'm sure there's a very common technical name for this which I could simply Google for help - but I can't find help if I don't know the term to begin with.
I'm building a modular system in Delphi 7. There are a few applications and a bunch of DLL's. All the applications share these DLL's, as well as some of the DLL's use other DLL's. The DLL's are currently saved in the same location as the application. I would instead like to put all these DLL's in a sub-folder (elsewhere from the EXE's) but of course Delphi won't know how to find them.
Is there a way I can direct my Delphi apps to look in a certain directory for DLLs? It can't be done with constants, because there will be an option to specify where the DLLs are stored.
Each DLL is just a plain collection of StdCall functions, nothing special.
EDIT:
To explain the reason why I want to keep the DLLs in their own folder: this system I'm building treats these DLLs as add-ons. By default, the system might not even have any add-ons. On the other hand, it will also allow various vendors to build other DLLs and include them as add-ons. Then each application requiring these add-ons will be told which folder to find them in.
The application itself will have its own DLLs, which will be in the same directory as the applications. But I would like to keep the vendors' DLLs separate.
As mentioned in the answers below, my best bet would be to implement the DLL import method, because A) I can specify a path for each DLL being imported, B) I can better control the use of each DLL (does it need to be loaded or not?), and C) each DLL can be in its own separate folder (vendors might want to build their own folder structure). This system is still very young, but I plan to add further flexibility to it.
If you are dynamically loading the DLLs in your code, you can store them wherever you want, since you have to pass the full path to LoadLibrary/Ex() anyway. If you are statically linking to the DLLs instead, then you can use SetDllDirectory() to designate an additional path to include in the OS's DLL search path.
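For illustration, here is a minimal sketch of the SetDllDirectory() route (the helper name is mine; Delphi 7's Windows unit does not declare SetDllDirectory, and the API needs Windows XP SP1 or later, so the sketch binds to it at run time):

type
  TSetDllDirectory = function(lpPathName: PChar): BOOL; stdcall;

// Needs Windows and SysUtils in the uses clause.
procedure AddDllFolderToSearchPath(const Folder: string);
var
  SetDllDir: TSetDllDirectory;
begin
  @SetDllDir := GetProcAddress(GetModuleHandle(kernel32), 'SetDllDirectoryA');
  if Assigned(SetDllDir) then
    Win32Check(SetDllDir(PChar(Folder)));   // adds Folder to the DLL search path
end;

Call it before the libraries in question are loaded.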
You can do this with PATH but I recommend you don't. It's a brutal and inflexible approach. And of course you need to change the system wide PATH for it to have any effect at executable load time.
You can load your DLLs explicitly with LoadLibrary and GetProcAddress. That's no fun if there are a lot of imports but it can be a good option otherwise. And remember that if you go down this route, every single DLL must switch to explicit linking.
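To make the explicit route concrete, a minimal sketch (the DLL path would come from wherever you store the setting, and the AddNumbers function is a made-up example):

type
  TAddNumbers = function(A, B: Integer): Integer; stdcall;

// Needs Windows and SysUtils in the uses clause.
function CallAddOn(const DllPath: string; A, B: Integer): Integer;
var
  Handle: HMODULE;
  AddNumbers: TAddNumbers;
begin
  Handle := LoadLibrary(PChar(DllPath));          // full path to the add-on DLL
  if Handle = 0 then
    RaiseLastOSError;
  try
    @AddNumbers := GetProcAddress(Handle, 'AddNumbers');
    if not Assigned(AddNumbers) then
      RaiseLastOSError;
    Result := AddNumbers(A, B);
  finally
    FreeLibrary(Handle);                          // unload when done
  end;
end;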
There is something called DLL redirection, but MS don't recommend you use that. They recommend that you use side-by-side components. Having said that, the Visual Studio team moved away from side-by-side components for the MSVC runtime in VS2010 because of the pain that side-by-side had caused in previous releases.
So, in spite of all the options, I really believe that the best solution is to put all the DLLs in the same directory as the executable. If you can get over the folder looking untidy then it will make life much simpler. It is a trivial no effort solution to the problem.
Update
The update to your question provides the extra information that these DLLs are optional add-ons. In this case you simply have no alternative but to use explicit linking with LoadLibrary and GetProcAddress.
I would highly recommend that you leave the DLL's in the same folder as the applications.
If you really want to go down the road of putting the DLLs in a separate folder, then you need to load the DLLs yourself with the LoadLibrary API, which lets you specify the full path. However, if the DLLs are statically linked then it is Windows that performs the search: it first looks in the application folder and then searches the Windows PATH. Also, since Delphi 7 only creates 32-bit applications, this can get messy under 64-bit Windows.
On Windows, there is a "DLL search order". One of those search paths is The directory from which the application loaded, which is why it works to have them in the same folder as the EXE.
If you are statically linked to the DLLs, they must be loaded when the EXE is loaded into memory. This is before your first line of code is executed. So, you're relying on the DLLs being in one of the search paths. In this case, you're stuck with setting the path, and you must set it prior to the program loading.
If you are dynamically linking to the DLLs, then you can use LoadLibrary/LoadLibraryEx to load the DLLs at run time, in your code. Using those functions, you must specify the path to the DLL, so the DLLs could be anywhere. In this case, I feel that it's valid to put the DLLs in a separate folder to keep things tidy. As long as you don't put the DLLs into a shared location like the Windows System32 folder, you'll avoid a lot of headaches.
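One extra detail, shown as a sketch under my own assumptions (the helper name is mine): if an add-on DLL in that separate folder depends on further DLLs sitting next to it, plain LoadLibrary will not find those siblings. Passing an absolute path to LoadLibraryEx with LOAD_WITH_ALTERED_SEARCH_PATH makes Windows resolve the add-on's own dependencies relative to the add-on's folder first.

// Needs Windows and SysUtils in the uses clause; FullDllPath must be absolute.
function LoadAddOn(const FullDllPath: string): HMODULE;
begin
  Result := LoadLibraryEx(PChar(FullDllPath), 0, LOAD_WITH_ALTERED_SEARCH_PATH);
  if Result = 0 then
    RaiseLastOSError;
end;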
A temporary workaround:
You can set your DLL path in your application's shortcut (in the "Start in" box).
Related
For over a decade, I have been stuck (lovingly) on Delphi 6 and have developed hundreds of thousands of lines of code in a logical (to me) unit structure, where a project tends to be a few hundred lines of code referencing high-level work routines in my "library". In trying to migrate to XE5, I just can't find a way to have all my library units compiled in one place on the search path, and then just used by the project (and recompiled if necessary), with the dcus kept with the library source rather than with each individual project.
I'm just starting to accept that "hello world" takes 2.5Mb in XE5, and I can't stand the idea that each library unit has to be separately compiled into dcus at the project level. In the "old" days, these unit dcus would sit next to the pas files and not be recompiled if nothing changed in the source file.
The obvious place to look is the Project options, but I can't find the right setting to make the project stop keeping copies of each dcu.
I am vaguely aware that multi-platform development will cause restructuring, but I can't help feeling that there is some compromise position.
There must be something big I am missing.
Starting in Delphi XE2, Delphi supports compilation for multiple platforms, as well as different build configurations. Because of this, Delphi needs to create DCU files for each combination. For example, Win32, Win64, and OS-X DCU files are saved in separate folders by default. Otherwise, if it weren't like this, the DCU files would overwrite each other, which you should avoid (if you use different configurations/platforms).
These settings can be changed in the Project Options, in the very first section, Delphi Compiler, by modifying the Unit output directory. By default this is .\$(Platform)\$(Config), which creates a subfolder for the platform, then another subfolder for the config, for example \Win32\Debug\. Be careful with the Target at the very top, which by default is set to your current platform/config; you would typically want to change it to All Configurations first. If you clear this field completely, you get the default behavior from older versions.
It sounds like you should create a package. This would allow you to group all your "library" units together in one place (a BPL). This package can then be installed into your IDE, and if you have any components, those components can then be installed into your component palette.
Or you can do without a package too. All the units from all these different projects should be moved to this central place though - a single folder containing all your "library" units. This way it's less maintenance, and you can just add that one folder to your global library path.
If you put your files in a central folder, and use those files from a project, the DCU files for both the project and this "library" will be saved for that project. Delphi doesn't know that these files are a "library", it just knows that you're using them, and since it can't find an already compiled version of those units, it creates one in your project. If you want the DCU files to be saved only once and in this central place, then you would need a package.
First, let me thank all the respondents to this question - all provided useful insight. I experimented with the various suggestions (including breaking XE5 so badly that I had to reinstall -- at least I learned some areas not to mess with.)
Important to me, but a known bad coding practice, is having individual projects edit shared library units (only my own units - I do not mess with code belonging to Delphi or third parties). This is critical to having multiple applications working on the same data, but in bite-sized pieces. The shared code lets me make high-level pieces of an app available to other projects. There might be better ways (I would love to hear about them), but this has worked for me for a long time.
The multi-platform model really requires the dcu structure used by default, so I will adapt to it: share the source code, but accept multiple compilations in individual projects. A good suggestion by JensG is just to clean up the dcus when a project is not actively being worked on. That should be a straightforward utility program.
The D6 -> XE5 migration (which will take months for some of the less used areas) requires me to know which units compile successfully, so I will maintain one project whose function is to include all units and recompile them all. This will make it practical to map library unit pas files to dcu files.
The AnsiString/AnsiChar <-> String/Char problem is the major migration problem area. Simply making edit-level changes may get the code past the compiler, but there is no guarantee that the code still works the same way. Especially troubling are the interface points to Windows calls and the like. My answer will be to make the units compilable first, and then write test code for the trouble areas. This is what will take the months - I need to get on with new stuff as well as fixing the old. I REALLY don't know yet if I will be able to substitute the XE5-compatible code back into Delphi 6 without another layer of testing. I THINK it should work, but it will take careful checking.
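To make the kind of boundary problem concrete (a hypothetical sketch of mine, not code from the actual migration): an import that took PChar under Delphi 6 was really taking PAnsiChar; under XE5, PChar means PWideChar, so both the declaration and the call site have to be pinned explicitly.

// 'legacy.dll' and SomeLegacyCall are made-up names for illustration.
function SomeLegacyCall(Name: PAnsiChar): Integer; stdcall; external 'legacy.dll';

procedure CallIt(const S: string);
begin
  // string is UnicodeString in XE5, so convert explicitly at the ANSI boundary
  SomeLegacyCall(PAnsiChar(AnsiString(S)));
end;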
A second major migration problem is 3rd party code such as VCLZip. XE5 has its own zip support, but I have a lot of places where I use VCLzip and the conversion will not be trivial. For this specific library, it may be possible to find XE5-level source and simply work it in. There are other pieces of code gotten from the internet that I used, but never needed to truly understand which will cause significant hassles.
Again - thank you to all. This has been an interesting 24 hours. Howard
I have a problem which I do not understand. I am using a DLL in my application. This DLL requires other DLLs, and I have all of them. If I put the libraries in my application folder everything works fine.
However, having a bunch of DLLs in application folder looks quite ugly so I wanted to move them to application\lib subfolder.
After this change I now get an external exception when I try to use some of its functions.
I've only changed one line of code:
The original code:
DLLHandle := LoadLibrary(PChar(ExtractFilePath(ParamStr(0)) + 'External.dll'));
The code after the change:
DLLHandle := LoadLibrary(PChar(ExtractFilePath(ParamStr(0)) + 'lib\External.dll'));
In both cases DLLHandle holds a valid handle after loading the library. I am also not getting any error after calling GetProcAddress(DLLHandle, '_SomeFunction@8').
No exceptions, and return value of GetLastError is always 0.
Do you have any idea what could be wrong?
Thanks.
Life is far easier if you keep the DLLs in the same folder as the executable. That's the first folder searched when libraries are loaded. To move all the DLLs into a sub folder of the executable directory requires cooperation from all DLLs.
Most likely you have secondary DLL dependencies that are not cooperating. So the exe loads A fine, but then A fails to load B. You can debug this further with Dependency Walker running in profile mode. It's quite possible that a secondary DLL is being loaded with implicit linking and that this throws an exception. Whatever the cause, Dependency Walker will lead you to the problem.
Whilst you can modify the PATH variable this is generally not advisable. If you do choose to go down this route then don't modify system wide, just modify the executable process environment at runtime before the first LoadLibrary. This is tenable so long as all your DLL linking is explicit using GetProcAddress.
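As a sketch of that idea (the helper name is mine; SysUtils.GetEnvironmentVariable has existed since Delphi 6):

// Needs Windows and SysUtils in the uses clause.
// Prepend the lib folder to this process's PATH, before the first LoadLibrary call.
procedure AddToProcessPath(const Folder: string);
var
  NewPath: string;
begin
  NewPath := Folder + ';' + SysUtils.GetEnvironmentVariable('PATH');
  Win32Check(SetEnvironmentVariable('PATH', PChar(NewPath)));
end;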
All accepted wisdom recommends that you put your DLLs in the same folder as your executable. I would echo this recommendation. If you did this then you would be able to use implicit linking which would greatly simplify your code.
Yet another option may be to abandon DLLs and link everything straight into your executable. Unless you have a plugin type architecture, a single big exe is by far the simplest approach.
The other DLL's that need to be loaded must be on the system path for Windows to find them. Your application can find the External.dll as you explicitly define the path. Try adding the lib folder to your system path.
The company I work for develops a system in Delphi that consists of dozens of exe modules, and to a certain degree they all contain identical source code. Sadly, nobody has ever cared about putting the shared code into libraries. This means that each time there is a bug to fix in the code all these modules share, a programmer has to make the correction in every one of them separately! It always takes so much time...
I decided to find a method to put the shared code into libraries. I considered DLLs and BPLs. In this case BPLs seemed much more programmer-friendly and much less troublesome, especially that the code is used only in our software and only in Delphi.
I put all the code shared by all the exe modules into BPLs and everything seems fine, but there are certain things I don't understand and would be grateful if you explained them to me.
What I expected after dividing the code into BPLs was that it would be enough to deploy exe files with the BPLs I created. But it turned out that they need an rtl100.bpl and vcl100.bpl as well. Why is it so? I want to deploy exes and my BPLs only. I don't want to provide end users with a whole bunch of libraries supplied by Borland and third party companies :). I want them to be compiled within exes as they used to be compiled before. Is it possible to do that?
What I did so far was:
I put all shared pas units into BPLs. Each BPL contains units belonging to the same category, so it is clear to programmers what code to expect in a given BPL.
Each BPL is a "runtime and designtime" library.
Each BPL is "rebuilt explicitly".
The latter two are the default project settings for BPLs.
And if it comes to the exe projects:
I deleted all units that I had earlier put to BPLs.
I installed my BPLs from the Tools->Install package menu in BDS 2006.
In my exe project settings I checked the option "build with runtime packages" and I listed all my BPL packages in the edit box below (only my packages, as I cleared all other ones that appeared there).
This is all I did. The exe projects compile properly, but I have no access to the source code of BPLs (I can't navigate into that code from my exe projects), even though all BPLs are stored together with their source code files. Why? It seems strange to me.
I always tend to write lengthy descriptions - sorry for that :). I will appreciate your help. I just need a few words of explanation to the points I mentioned: deploying exe with my BPLs only, the correctness of what I did as a whole, and the inability to navigate into BPL source codes. Thank you very much in advance!
Thank you all for the discussion. Some said the approach I chose was not a good idea. Our software consists of more than 100 modules (most of them being something like drivers for different devices). Most of them share the same code - in most cases classes. The problem is that those classes are not always put into separate, standalone pas units. I mean that the shared code is often put into units containing code specific to a module. This means that when you fix a bug in a shared class, it is not enough to copy the pas unit it is defined in into all software modules and recompile them. Unfortunately, you have to copy and paste the fixed pieces of code into each module, one by one, into a proper unit and class. This takes a lot of time and this is what I would like to eliminate, choosing a correct approach - please help me.
I thought that using BPLs would be a good solution, but it has some downsides, as some of you mentioned. The worst problem is that if each EXE needs several BPLs, our technical support people will have to know which EXE needs which BPLs and then provide end users with the proper files. As long as we don't have a software updater, this will be a big burden for both our technicians and end users. They will certainly get lost and angry :-/.
Also compatibility issues may happen - if one BPL is shared by many EXEs, a modification of one BPL can be good for one EXE and bad for some other ones - @Warren P.
What should I do then to make bug fixes quicker to make in so many projects? I think of one of the following approaches. If you have better ideas, please let me know.
Put shared code into separate and standalone pas units, so when there is a bug fix in one of them, it is enough to copy it to all projects (overwrite the old files) and recompile all of them.
This solution seems to be OK as far as rarely modified code is concerned. But we also have pas units with general-use functions and procedures, which often undergo modifications - we add new functions there whenever necessary, but in single projects. So imagine that you write a new function in one of the 100 modules and put it into its general-use unit. After a month or two you modify a different module and you think you need the same function you wrote two months ago. You have to find the module (which is difficult if you don't remember which one it was) and copy the function to your code. And obviously, the general-use units become completely different in each module as long as they are stored in each project separately. And then, if there is a bug fix to do... the whole story repeats.
Create BPLs for all the shared code, but link them into EXEs, so that EXEs are standalone.
For me it seems the best solution now, but there are several cons. If I do a bug fix in a BPL, each programmer will have to update the BPLs on their computer. What if they forget? But still, I think it is a minor problem. If we take care of informing each other about changes, everything should be fine.
@CodeInChaos: I don't know if I understood you properly. Do you mean sharing pas files between projects? How to do that? We store source code in SVN. This means that we would have to store shared code in a separate folder and make all projects search for that code there, right? And download from SVN a project and all the folders it depends on...
Please, help me choose a good solution. I just don't want the company to lose much more time and money than necessary on bugfixes just because of a stupid approach to software development.
Thank you very much.
Even though this question has an accepted answer I'm going to take a stab at it.
The title asks how to divide a project into bpls but the real question appears to be:
"What's the best way to share code between projects?"
There are a few ways to do this:
Shared units
Dlls
BPLs
Regardless of which direction you go you will likely need to restructure your projects. From your description it sounds like each project is developed in relative isolation. Code is shared using copy/paste, which quickly gets out of sync and results in a lot of duplicated effort. So let's examine each of the techniques for sharing code.
Shared units
This is the most straightforward approach. You create a shared location and place code you would like to reuse among your projects into this location. The units are statically linked into your projects so you don't need to worry about deploying extra dependencies along with the main executables. Statically linked units are by far the easiest to troubleshoot and debug.
The compiler needs to be able to find your shared units. There are 4 ways to tell the compiler where to look.
1. Add them to the project - SHIFT+F11 - Adds a reference to the unit into the project files (dpr, dproj). The IDE will normally use relative paths if the unit is located under the same directory tree as the project files; otherwise it will use absolute paths, which can be problematic if developer machines aren't configured identically.
2. The project's Search Path - CTRL+SHIFT+F11, Delphi Compiler > Search path - Add a directory and the compiler will look there to find units mentioned in the uses clause of any unit in the project. It's best to use relative paths if you can. You can also use environment variables: $(MyPath)
3. Global Search Path - Tools > Options > Environment Options > Delphi Options > Library - Win32 > Library Path - Any paths listed here are available to all projects on a machine. This is machine dependent.
4. Command line - If you build from a script or build automation tool you can set the search path using dcc32's -U switch or msbuild's /property:UnitSearchPath= switch.
Options 1 and 2 will be the most useful.
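For example, the command-line form (option 4) might look like this - the library path is hypothetical:

dcc32 -U..\Library MyProject.dpr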
As far as your SVN repository goes you have a few options for organizing the projects and shared units. The simplest would be to place all projects under single trunk along with the shared units:
Projects
  trunk
    ProjectA
    ProjectB
    ProjectC
    Library (shared units)
If for some reason the above structure isn't possible you could try this alternative:
ProjectA
  trunk
    Library (branch of main library)
ProjectB
  trunk
    Library (branch of main library)
ProjectC
  trunk
    Library (branch of main library)
Library
  trunk (main library)
In this configuration changes made to each project's library folder would not be immediately available to the other projects. Each project would need to synchronize changes with the main Library project on a regular basis. A side effect of this is that changes that break other projects will be delayed until the other projects are synchronized. Whether you consider this a good or bad thing depends. On the one hand bugs are easier and cheaper to fix when the code they involve is still fresh in the developer's mind. On the other hand if you don't practice unit testing (which I highly recommend you do) or the code is very fragile or you just have developers prone to making reckless changes you may want to control how frequently those changes get pushed into other projects.
Dlls
Dlls allow you to share code by linking to it at runtime. They expose functions that can be called from a main executable or another dll.
While dlls are always linked at runtime, you decide whether they are loaded at application startup or only when needed. Loading at startup is called static loading and in Delphi is accomplished using the external directive. The vast majority of the rtl/vcl classes that wrap system api calls use static loading. Dynamic loading lets you delay the loading of a dll until it is required. This uses the WinAPI functions LoadLibrary and GetProcAddress. A corresponding call to FreeLibrary will unload a dll.
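A minimal sketch of the two styles side by side ('addon.dll' and GetAddOnVersion are made-up names):

// Static (load-time) linking: the dll must be findable when the exe starts.
function GetAddOnVersion: Integer; stdcall; external 'addon.dll';

// Dynamic (run-time) linking to the same function.
function TryGetAddOnVersion(out Version: Integer): Boolean;
type
  TGetAddOnVersion = function: Integer; stdcall;
var
  H: HMODULE;
  Fn: TGetAddOnVersion;
begin
  Result := False;
  H := LoadLibrary('addon.dll');
  if H = 0 then
    Exit;
  try
    @Fn := GetProcAddress(H, 'GetAddOnVersion');
    if Assigned(Fn) then
    begin
      Version := Fn;
      Result := True;
    end;
  finally
    FreeLibrary(H);
  end;
end;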
Unfortunately standard dlls limit what kinds of data types can be passed. If you need to access a dll from non-Delphi projects you will need to limit yourself to c style data types. If you will only be using a dll with Delphi projects you can safely use Delphi strings and dynamic arrays as well, provided you use the ShareMem unit in both the dll and any projects that use it.
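For instance, a sketch of a dll that returns a Delphi string safely (the library name and function are hypothetical); ShareMem must also be the first unit in the consuming exe's project file, and borlndmm.dll has to be deployed alongside:

library StringAddOn;

uses
  ShareMem,   // must be the FIRST unit in this uses clause
  SysUtils;

function GetGreeting: string; stdcall;
begin
  Result := 'hello';   // a Delphi string can cross the dll boundary only with ShareMem
end;

exports
  GetGreeting;

begin
end.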
You can safely use objects within the dll without problems, but if you want to pass objects between the dll and the application you'll need to extract the object's data, pass it as primitive types, and reassemble it into an object on the other end. This is called (de)serialization or marshalling, and there are much easier ways to do this than rolling your own.
COM (Component Object Model) is well supported in Delphi but it has a bit of a learning curve. Consuming COM objects is pretty straightforward but designing one will take time if you're not familiar with COM. COM has the advantage that it is language neutral and is supported in the majority of languages targeting the Windows platform (including languages targeting the .NET framework).
Bpls
Bpls (also called simply "packages") are specially formatted dlls that make working with objects a lot easier. Like standard dlls they are linked at runtime and can be statically or dynamically loaded. They are easier to learn and use than COM dlls and provide more seamless integration into your projects than COM. Packages are composed of two parts: the bpl and the dcp. The dcp is like the dcu files generated when you compile a normal unit file, except it contains a whole bunch of units in it. Using a class that is compiled in a bpl is as simple as adding the dcp to the project's package list and then adding a unit to the uses clause of one of the project's units.
When you deploy the app you'll need to install the bpl as well. As others have noted, you have to include the rtl package at a minimum, and most likely the vcl package if you use any forms. There is a way around deploying Borland-supplied bpls with your projects: you can create a "mini" rtl package that contains only the units your project needs. The difficulty is in determining which units to include.
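To make the structure concrete, a minimal runtime package source (.dpk) might look like this - the package and unit names are made up:

package MySharedLib;

{$RUNONLY}                 // runtime-only package

requires
  rtl;                     // at minimum; add vcl if the contained units use forms

contains
  SharedUtils in 'SharedUtils.pas',
  SharedClasses in 'SharedClasses.pas';

end.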
Summary
From the description you've given creating a library of shared unit files to statically link against may be the most expedient route. I would also suggest trying out a program called Simian. It will help you track down duplicate code in your code base for inclusion in your shared library. It doesn't directly support pascal but it does a decent enough job using the plain text parser with a little tweaking of its configuration.
Also I can't stress enough the value of unit testing. Especially if you're moving toward shared libraries. A suite of well written unit tests run on a frequent basis will give you instant feedback when a developer changes a class and it breaks an unrelated project.
Imagine you have a project with an EXE and two different BPL modules, and somewhere in that codebase, there's a line that says if MyObject is TStringList then DoSomething;. The is operator works by examining the object's class metadata, stored in the VMT, and then following a chain of VMTs through the ClassParent pointer, to see if any of them match the class reference (also a VMT pointer) for TStringList. In order to make sure that this will work correctly, there needs to be one single VMT for TStringList that's the same throughout your entire program, no matter how many BPLs it's divided up into, which means it has to be in its own package. That's why system runtimes like rtl*.bpl and vcl*.bpl are necessary, and there's not much you can do about that. It's part of the price of using BPLs.
As for not being able to debug, you need to make sure that the BPLs are built with debug info enabled and that the debugger knows how to find the folder where the DCP (the file containing the debug info for the BPL) is located. And you won't be able to trace into system BPLs, because debug-enabled DCPs weren't shipped with your version. They were added pretty recently, I think in XE but it might have been in D2010.
Why can't I browse my source code? Is there a way to fix this?
You cannot browse the source code of the units included in the packages because they are not in your project, your library path, or your search path.
The way I solve this is by adding the directories to the project search path. This way the compiler does not know about those files (and does not try to recompile them) but the IDE lets you browse their content and debug into them.
"In my exe project settings I checked the option "build with runtime packages"
That is why you cannot deploy without the BPLs etc. This option confuses a lot of developers: "build with runtime packages" means that you will need the bpls present at runtime. Uncheck that option and the packages will be linked into your exe at compile time. (Your exe will g-r-o-w in size.) The idea behind "build with runtime packages" is to keep the size of exes down and allow several apps to share common bpls, because they are NOT linked into the exe at compile time - that's the upside. The downside you are now experiencing: you must distribute your bpls with your exe.
I am trying to start making my own libraries available as packages prior to compiling my apps with these packages, hence modularising my code. For years I've 'sort of' understood packages, breathing a sigh of relief when I load a component package and click on 'Install' and it works. I understand that the process of installing a component (or components) is via the creation of a BPL which is then registered with the IDE.
Where I begin to get lost is how to make files available so that I can compile with EITHER a package OR pre-compiled dcu's (like the third party vendors do) and without pointing my project at the source code all the time. I can create a package with the following settings:
where I've specified that all my output will go into 'c:\scratch\wow'. After a build I find TEST.BPL, TEST.DCP and lots of DCUs. Now, when I point another project at this folder to use the DCUs, I get a missing DFM error (one of the units is a form). Should I be manually copying the needed DFMs into this output folder? The DPK knows about this form, so why is the DFM not copied for me? I presume that the TEST.BPL file contains everything, but I wish to work in both modes. Of course I can get around this by including the source folder in my project search path to find the DFM, but third party libraries seem to already have the DFMs in their output folder. Did they install them there using the installer?
Thanks
As others say you could use post-build events to copy your DFM files into place. Other people use a one-time external batch file that copies the DFMs to the DCU folder.
Personally I see very little benefit to making packages for things which are not developed also as reusable components. I also see very little benefit in partitioning an existing application into packages, when you don't reasonably need to use the same subsection or package more than once, or at designtime.
Things I would put into packages:
Delphi visual and non-visual components.
Things which absolutely must be plugged in at runtime, or left out. For example, supposing I sell MetaWare Light and MetaWare Pro, and instead of using compiler IFDEFs to build a different binary, I prefer for some reason to simply not ship ADVANCEDFEATURE.BPL with my systems.
Things to beware of with packages:
I have run into a lot of compiler bugs when combining packages with generics. I have also run into IDE crashes and lockups, in Delphi 2009, 2010, XE and XE2. (I believe XE3 is better)
You should learn a bit about BorlandMM.dll and shared memory management in the BPL world before you move to it. There are some subtleties.
Packages limit the ability of the linker to decide what to remove. In fact, it pretty much destroys it. Packages contain everything that is linked into them, and nothing publicly accessible can be removed.
Once you've created a binary package and shipped it to even one customer, you have a contract that is pretty difficult to modify (this BPL contains a particular signature, or application binary interface), and you have to be careful in the future never to change them or mix and match them. Beware of DLL hell, even among your own customers, and be prepared to use versioning on your packages. Just as Delphi packages have a version suffix, I recommend you use version suffixes in your own packages right off the bat, and bump them whenever binary compatibility has changed (see the sketch after this list).
Delphi handles build dependencies between packages about as well as could be hoped, which is less well than a single monolithic application. In applications of mine that make heavy use of packages, I find that project groups containing a bunch of packages that depend on each other are very difficult to manage and build quickly. In fact, I have found that both compiles and builds are slower and more frustrating than they would be in a single 750K-line megaproject.
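As a sketch of the version-suffix idea mentioned above (the package name and suffix are made up), the LIBSUFFIX directive in the package source controls the generated BPL file name, so differently versioned packages can coexist:

package MyLib;             // produces MyLib170.bpl because of the suffix below

{$LIBSUFFIX '170'}

requires
  rtl;

end.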
I really wonder, if you're not that into the package area of Delphi (you breathe a sigh of relief whenever a Delphi component actually builds and installs without issue?), whether you really want to move into the packages world totally. By all means experiment. But I wouldn't bet the farm on it yet. Learn some more first.
Yes, you should copy the .dfm to the directory with the compiled units (.dcus), if that is the only directory you want in your search path. The BPL will of course contain the .dfms, and you need a .dcp to be able to link a BPL with your app.
Third party tools must have put the .dfms together with the .dcus in the directory using their installer, indeed.
Instead of copying *.DFM manually you can use a Post-Build Event (Project / Options / Build Events), for example:
copy "$(PROJECTDIR)\Unit1.DFM" "c:\Scratch\wow\Unit1.DFM"
I found a way to do this without moving .dfm files to the directory of .dcu files, so you can have a directory for .dcu files only one for .dcp files only and another for .bpl files only.
All you need to do is create another directory in your directory structure, as I do. The directory is called RES, and in it should be placed all the resource files (.res files, not .dcr files) that are used by applications compiled using your packages (components). In the Delphi library path you must include, in addition to the DCU directory (which you should already have), this RES directory.
On your component (at design time) do everything you want with the form (design it, put other components on it, etc.). In the source code of the unit, replace {$R *.dfm} with {$R UnitName.dfm}. Having done that, save all and close the DPK. Now move the .dfm file (do not copy, move!) to the RES folder (the .dfm file is a resource file as far as Delphi is concerned; the {$R} directive is proof of that!) and after that open the DPK again to see what has changed.
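A minimal sketch of what the unit looks like after that change (the unit name is hypothetical):

unit MyAddonFrame;

interface

uses
  Forms;

type
  TMyAddonFrame = class(TForm)
  end;

implementation

{$R MyAddonFrame.dfm}      // was {$R *.dfm}; the .dfm now lives in the RES folder

end.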
First, notice that you can no longer open the form (F12) from its unit, although Delphi does not raise any "DFM missing" error.
Now do a Build on your package and then install it. Notice again? No errors are displayed! This happens because you have the location of the .dfm file (the RES directory) on the Delphi library search path.
Done! You can use your component and dfm will be found when your component is included in an application.
Many of you may now say that this way I will no longer be able to visually edit the form inside the component at design time. Yes, that is true, but if you think about it, why would I want to edit a form inside a component so often when, in practice, it should only be used and rarely changed? Draw your own conclusions ;)
I am working on a Delphi IDE expert. To avoid dependency problems, I was thinking of rebuilding this expert as a DLL expert, as was suggested in one of these answers. My expert (compiled as a BPL) accesses the Screen and Application global variables (the Delphi IDE's instances), so I was wondering: if I compile my expert as a DLL, can I still access these variables? I would also like to know the main differences between a BPL Delphi expert and a DLL Delphi expert.
Should I compile my wizard as a DLL or a Package? Packages are easier to load and unload without restarting the IDE (and hence easier to debug), but they can create unit naming conflicts in the IDE. Conflicts happen when the name of a wizard's unit matches the name of a unit in another loaded design-time package. In this case, both packages cannot be loaded at the same time. The recommended workaround is to prefix all of your unit names with a "unique" prefix. GExperts, for example, uses "GX_" as the name prefix for its units.
From this very good source about OTA: GExperts
When you access a global variable, it would be a variable that is global to your DLL, not global to the main BDS.exe. I am not sure, but I think your DLL would have its own Screen and Application global variables if you linked in Forms and the core of the VCL.
Those things which belong to the IDE itself are accessed through the Open Tools API (OTA). I believe that you do not normally share any objects between the IDE and your expert anyway, and if you were to try to do so, it would be problematic. Anything at all that you do that bypasses the OTA is going to be vulnerable to breaking in strange ways, especially in future versions of the IDE.
Dependency problems are of course a big reason to not use BPL based packages, but I think an even bigger reason is to maintain a complete separation between your tool's internals, and the internals of the IDE.
Remember that a DLL target, like an executable target, is statically linked. That is the core of the difference. If your expert provides functionality that uses the legal public documented OTA interfaces only, then moving to a DLL should be problem free. If you use some back door hacks that are possible with BPLs, then I can't advise you further.