Should I be doing any periodic cleanup on my IDE (Embarcadero RAD Studio)? - delphi

It seems like over time Embarcadero RAD Studio (I use XE4 Starter for programming Delphi) has gotten more and more sluggish. I know that with other IDEs I've used, you sometimes end up with the bloat and detritus of temp files, cache files, and other unneeded things the IDE generates, and it can help to periodically go in and clean those up manually.
Is there any particular file removal, or other IDE maintenance, a programmer should do periodically to keep RAD Studio running smoothly at peak efficiency?

There are many angles to how IDE performance is affected, and essentially two levels to look at it on: what's installed in the IDE, and what's required by your immediate project.
IDE Maintenance
There's not necessarily anything you have to do to your IDE on a regular basis. However, you should make sure that your library path is up to date; that is one of the most common things I ever need to manage in the IDE.
Then there are the third-party libraries, components, add-ons and fix-packs that you can install into the IDE. Each one of these has the potential to slow things down, so make sure you only have installed what you actually use.
Project Maintenance
Performance of the IDE isn't usually hindered by lack of project maintenance. However, a very large, complex project (or project group) may wind up slowing it down. Cleaning the DCU files of a project shouldn't be necessary on any regular basis, but it can help to forcefully re-compile anything which the IDE may have neglected to keep up to date; I've seen issues which were solved by deleting the DCU files and recompiling. Remember, DCU files are essentially compiled code, and exist mostly to cache compilation so that only the units which have changed since the last build need to be compiled again. So cleaning up the DCUs will make the next compilation take a little longer than usual, but that's a small price to pay.
Conclusion
There's a batch file I commonly use to clean up unnecessary files in different project folders. The batch file is shown below; it simply deletes any temporary files I don't need to keep around. It's only needed on the rare occasions when I'm cleaning up the source, not on any schedule.
rem compiled units - the compiler recreates these on the next build
del *.dcu
rem legacy project options, desktop layout and per-user IDE state
del *.dof
del *.dsk
del *.identcache
del *.local
rem editor backup files
del *.~*
rem command-line compiler configuration
del *.cfg
rem debug symbols and other IDE-generated files
del *.dsm
del *.rsm
del *.otares
Note that the same or similar can be accomplished by choosing the "Clean" option in the IDE for any given open project.
Edit
You can also keep IDE performance up by keeping track of how your project interacts with the IDE: for example, components or controls which perform heavy actions at design time (such as database connections with data-aware controls).
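For instance, a connection component left active in the DFM makes the IDE hit the database every time the form is opened in the designer. A minimal sketch of the usual workaround, with hypothetical component names:

procedure TCustomerForm.FormCreate(Sender: TObject);
begin
  // Keep Connected/Active set to False at design time and open everything
  // at runtime instead, so the IDE never holds a live database connection.
  ADOConnection1.Connected := True;
  CustomersTable.Active := True;
end;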

OP hasn't characterized the nature of HOW the IDE "has gotten more and more sluggish," so anything anybody says here is little more than random speculation.
First off, I've never used the Starter Edition of Delphi. Aside from that, I've never noticed any particular slow-down on different versions, other than what you'd expect from normal stuff. That is, if you load bigger files, it can take longer to do stuff in the IDE. More installed libs and components take longer to load at startup, but impose negligible impact inside the designer. Deleting files like those above can slow down builds because the compiler needs to reproduce them; but if you're not using that project, then there shouldn't be any impact whether they're present or not. And poorly designed components can take a toll on IDE performance.
It's entirely possible there may be some optimizations turned off in the Starter Edition, or things missing that might result in performance hits over time. It's not as "feature complete" as the full IDE, although I've never investigated the differences. I just stick to full released products because I use them for production work.
Also, maybe it's time you upgraded your computer. You could be dealing with limitations in virtual and physical memory that have nothing to do with Delphi, which is a rather large app in its own right. I run Delphi inside of a VMWare VM on a MacBook Pro, and it works fine unless the memory gets tight, then it can slow way down. (Again, without more specifics, it could really be tons of different things.)

Related

Building VB6 projects without registering

We have an old VB6 project that uses ActiveX controls, some of which we build and others we get from third-party vendors.
Currently, we use a .csproj project which does the following:
Execute regsvr32 to register the OCXs
Execute vb6 to build the VB6 project
Execute regsvr32 to unregister the OCXs
This registering/unregistering is ugly and is a bit of a pain for local developer builds with UAC enabled. Is it at all possible to build a VB6 project without having to register any controls?
I apologize if this has already been asked before. The only similar questions I was able to find were about how to build VB6 projects, and answers to these mention the same solution of register, build, unregister.
It sounds like these people are merely working on clients of these OCXs rather than modifying and recompiling the OCXs themselves.
If so, you should be administering the installation of these libraries just as you administer the VB6 development system itself. This means each workstation needs to have the control suites you are using installed once (well, and maintained when new releases are placed into use). Installers for developer libraries deploy things like .DEP files as well as design-time license key registry entries, so using regsvr32 shouldn't be considered a viable strategy anyway.
If you set the developer workstations up properly and maintain them there isn't any reason to be registering and unregistering such things.
It means the original developers probably did not set "binary compatibility" correctly, which means the VB6 DLLs get a new COM GUID every time they are built.
Which means your original VB6 developers were probably a bunch of hacks.
You can read the section here on Binary Compatibility.
http://support.microsoft.com/kb/161137
Get in a time machine and go back and punch the person in the face who said "We don't need to work out the binary compatibility issues now, we'll just unregister and re-register the components... Easy Peezey!"
If I'm wrong, please let me know. But every time I've seen "unregister the COM" and "re-register the COM", it goes back to that brainiac decision.
Here is a longer discussion on it:
http://www.techrepublic.com/article/demystifying-version-compatibility-settings-in-visual-basic/5030274
EDIT:
If the OCXs are not changing, then you should only have to register them on the build machine once.
The direct answer is no, it is not possible to compile a VB6 project with OCX dependencies without those dependencies being registered.
Furthermore, the act of compilation itself involves VB6 attempting to register what it has just built (unless you are compiling to an EXE). This generally requires the VB6 IDE and/or its compiler to run with admin permissions, so the permissions issue is hard to avoid regardless.
I believe these issues can be obscured by the fact that VB6 itself (the IDE and/or the runtime) will sometimes try to automatically register certain things for you, but will keep silent when it does so.
You should probably create a different process for setting up a development PC than the build process you use for deployment. This may "feel" wrong, especially if you have experience with other programming environments, but I would stress that VB6 can be very painful and problematic to work with, so pragmatism is generally in order.
On the development PCs: set up all the unchanging dependencies once (and document them) and then leave them alone (as noted in another answer). When weird dependency problems occur, verify the PC is set up correctly before doing anything else.
If you have all the sources to your dependencies, then I would consider if you can actually run them all in a VB6 project group (VBG) and not compile them at all. (A VBG is akin to a .NET solution though far less powerful.) I do this often and it cuts out a lot of wasted time. Developers don't necessarily need code compiled to EXE / DLL / OCX - they often just need to be able to run it in the IDE.
On the build PC: if you can always start with a clean environment, like in a virtual machine, then I think it's actually a good idea to register everything from scratch in an automated fashion, as this helps to verify nothing is missing or mismatched. Re-using the same build environment without doing this can mask problems when some dependency has changed in source control but still exists on the build machine. On a VM, permissions generally aren't a limiting factor.
Notes:
If you are building an EXE, VB6 does not require any elevated permissions, as far as I can recall.
Running code in the VB6 IDE does not either.
[Caveat 1]:
It may technically be possible to create a side-by-side application manifest file for VB6.exe itself and include in that manifest whatever dependencies you need, thereby avoiding having to register them.
But this would fall well outside of the normal ways to use VB6 tools - it's a hack - and possibly not worth the potentially large effort. I don't think I've ever seen a working example, so I don't recommend this as a practical solution, but mention it for completeness.
Maybe in some locked-down corporate IT scenario this could pay off... maybe. In that scenario doing dev work in a VM might be a better option though.

Delphi 6 - Bugs disappear when I compile multiple times

My Delphi installation has been going downhill for the past few months. It seems though that every so often when I build a release it has strange errors in it which are resolved if I build, then compile, then build, compile, etc.
I've talked to another developer who thinks that this is a compiler error. This sort of degrading performance over time has happened on other computers to us too.
What does Stack Overflow think could be the problem?
What I've seen most is a case where multiple versions of the same units/DCUs exist in different folders/paths, and depending on almost insignificant variations, the compiler/linker uses a different path and picks different versions of the units to build the EXE.
I would do a big spring clean-up: scrutinize the lib/search paths, remove all DCUs, and make sure there are no duplicate versions of any unit.
And, agreed, reinstalling Delphi could help start with a clean state.
I agree with @François about the DCUs, but also want to point out an observation: sometimes it matters what was built prior to what you're building. That is, if you have several projects whose source code results in various .dcu/.bpl files being created in a common directory, but the project you're concerned with doesn't explicitly call for them to be rebuilt, then you're going to end up with whatever is there. If you clear the DCUs/DCPs prior to building and then find that your project doesn't build, you are missing a uses/requires clause somewhere. Every project should be able to build on a "clean slate", and not rely on leftover binaries.
That's not much to go on, but it sounds like a classic case of "bit rot". Too many things interacting in too many ways for too much time under a poorly-designed OS, leading to strange forms of data corruption.
First thing I'd do is uninstall Delphi and reinstall. If that doesn't work, try reinstalling Windows. (If it's been around long enough for this to be happening, you're probably due for an OS reinstall anyway.) And if that doesn't work, contact Embarcadero tech support.

Grouping DLL's for use in Executable

Is there a way to group a bunch of DLLs and still use them at run time (not zipped up)? Sorry this question sounds terse and stupid, but I'm not sure what more to ask.
I'll explain the situation though:
We've had two standalone Windows Applications and now one of our Applications has swelled to such ungainly proportions that the other application cannot run outside of the scope of the first app. We want to maintain some of the encapsulation we had while letting the smaller program in on some of the bigger program's features.
There is no problem in running the application, other than that we don't want to send out all the 20-30 DLLs that the smaller project has.
It is possible to do this by adding startup code which checks if the DLLs are present on the target system and if not then extracts them from the resources section (or simply tagged onto the end of the exe). A good example of this being done is Process Explorer - it's distributed as a single binary, but when run it extracts and installs a driver.
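A minimal sketch of that approach in Delphi, assuming the DLL was linked in as an RCDATA resource (the resource and file names here are hypothetical):

uses Windows, Classes, SysUtils;

procedure LoadEmbeddedDll;
var
  Res: TResourceStream;
  Target: string;
begin
  Target := ExtractFilePath(ParamStr(0)) + 'MYLIB.DLL';
  // Extract the DLL next to the EXE only if it isn't already there.
  if not FileExists(Target) then
  begin
    Res := TResourceStream.Create(HInstance, 'MYLIB', RT_RCDATA);
    try
      Res.SaveToFile(Target);
    finally
      Res.Free;
    end;
  end;
  // Now the DLL can be loaded as usual.
  if LoadLibrary(PChar(Target)) = 0 then
    RaiseLastOSError;
end;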
If you have a situation where most, or all, of those assemblies have to be kept together, then I would highly recommend just merging the code files into the same project and recompiling. This would leave you with one assembly.
Of course there are other considerations like compile time, overall size of the final dll, how often various pieces change, and whether each component is deployed without the others.
One example of a company that did this is Telerik. Their dev components are all compiled into the same assembly. This makes deployment an absolute breeze. Contrast that with DevExpress, which puts just about each control into its own assembly. Because of this, just maintaining, much less deploying, a DevExpress project is not something for the faint of heart.
(I don't work for either of those companies. However, I have a lot of experience with both toolkits.)
You could store the DLLs as Resources, and use BTMemoryModule, which essentially allows you to LoadLibrary on a Stream.
That way you could compile the multiple DLLs straight into the EXE or into a single resource DLL.
see http://www.jasontpenny.com/blog/2009/05/01/using-dlls-stored-as-resources-in-delphi-programs/
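A rough sketch of what that looks like, assuming the commonly circulated BTMemoryModule unit (whose load function really is spelled BTMemoryLoadLibary) and a hypothetical RCDATA resource MYDLL exporting a procedure DoWork:

uses Windows, Classes, SysUtils, BTMemoryModule;

procedure RunFromResource;
var
  Res: TResourceStream;
  Module: PBTMemoryModule;
  DoWork: procedure; stdcall;
begin
  Res := TResourceStream.Create(HInstance, 'MYDLL', RT_RCDATA);
  try
    // Map the DLL directly from memory; nothing is written to disk.
    Module := BTMemoryLoadLibary(Res.Memory, Res.Size);
    if Module = nil then
      raise Exception.Create('Could not load in-memory DLL');
    try
      @DoWork := BTMemoryGetProcAddress(Module, 'DoWork');
      if Assigned(DoWork) then
        DoWork;
    finally
      BTMemoryFreeLibrary(Module);
    end;
  finally
    Res.Free;
  end;
end;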

Why does building with runtime packages make the EXE file smaller?

I have a query about the option in Delphi to build with or without runtime packages (Project->Option->Packages).
The executable file size seems to be smaller (389 KB) when I check the box "Build with runtime packages" compared to when I uncheck it (3,521 KB). Why is that the case?
I am having so much trouble building an installation disk for it and can't figure out what files should be included in the installation. I wonder if this might have anything to do with it, but I have tried both options already.
When you build with runtime packages, the VCL and RTL are loaded from the packages and so their code doesn't have to be linked into your EXE. So the EXE gets smaller, but the total installation gets larger since you can't use smart linking to reduce the size of the packages.
As you've already noticed, using packages causes trouble for memory leak tracing, and it also causes trouble for debugging. It's generally only worthwhile to use them if you're using plugins that will also need runtime packages.
The answers so far miss one crucial point: Runtime packages are useful in the same way as DLLs are useful if you have a suite of applications that work together and are installed together. You could of course link the VCL and third party libraries into all of them by building them without packages, but depending on the number of applications and used libraries the size of these applications combined will be larger than the size of them built with runtime packages plus the size of those runtime packages. This will make for larger setup packages, which isn't the big issue it once was.
But using all these applications at the same time will also bring a much higher load for the system. Since every application uses its own copy of the VCL and the other libraries all these need to be loaded from disc into memory, which causes more I/O. And then there will be several copies of them in memory, each taking up space for the code. When runtime packages are used each application will have its own memory area for data, but they will all share the same copy of the packages' code in memory.
For a single self-contained application without any special needs, definitely build without packages.
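To put rough, purely illustrative numbers on that tradeoff (using the question's own sizes for a single EXE, and guessing ~8 MB for the shared RTL/VCL packages since they can't be smart-linked):

1 app,  no packages:   1 x 3.5 MB             =  3.5 MB
1 app,  with packages: 1 x 0.4 MB + 8 MB BPLs =  8.4 MB  (worse)
5 apps, no packages:   5 x 3.5 MB             = 17.5 MB
5 apps, with packages: 5 x 0.4 MB + 8 MB BPLs = 10.0 MB  (better)

The break-even point depends on how much of the RTL/VCL each application would otherwise smart-link in.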
Regarding your question "what files should be included in the installation": you can use Dependency Walker to track down the library dependencies.
One of the main reasons for using runtime packages is when you need module granularity to deploy or update over a medium that does not handle large files well, like a low-bandwidth connection.
Because the runtime packages remain the same until you change your Delphi version - like forever for those still on D7 ;-) - it allows you to deploy new versions or new applications without the payload of the RTL/VCL.
But like with DLLs, you have to be careful with the versioning.
Don't know about D2010, but in D2006 there is an option in the project menu called "Information for ProjectName".
This will show you which packages are included after you compile.
However, as Mason has stated, there is little advantage to using run time packages, and quite a few disadvantages.

Delphi: How to organize source code to increase compiler performance?

I'm working on a large Delphi 6 project with quite a lot of dependencies. It takes several minutes to compile the whole project. The recompilation after a few changes is sometimes much longer, so that it is quicker to terminate Delphi, erase all DCU files and recompile everything.
Does anyone know a way to identify what makes the compiler slower and slower? Any tips on how to organize the code to improve compiler performance?
I have already tried following things:
Explicitly include most of the units in the dpr instead of relying on the search path: It didn't improve anything.
Use the command line compiler dcc32: it isn't faster.
Try to see what the compiler does (using Process Explorer from SysInternals): apparently it spends most of its time in a function called 'KibitzGetOverloads'. But I can't do anything with this information...
EDIT, Summary of the answers until now:
The answer that worked best in my case:
The function "Clean unused units references" from cnpack. It almost automatically cleaned more than 1000 references, making a "cold" compilation about twice faster. ("cold" compilation = erase all dcu files before compiling). It gets the reference list from the compiler. So if you have some {$IFDEF } check that all your configurations still compile.
The next thing I would like to try:
Refactoring the unit references manually (possibly using an abstract class),
but it is much more work, since I first need to identify where the problems are. Some tools that might help:
GExperts adds a project dependencies browser to the Delphi IDE (but unfortunately it cannot show the size of each branch).
Delphi Unit Dependency Viewer V1.0 does about the same thing without Delphi. It can calculate some simple statistics (which unit is the most referenced, ...).
Icarus, which is referenced in a link in one of the answers.
Things that didn't change anything in my case:
Putting every file from my program and all components in one folder without subfolders.
Defragmenting the disk (I tried with a ramdisk)
Using a ramdisk for the code source and output folders.
Turning off the live scanning antivirus
Listing all the units in the dpr file instead of relying on the search path.
Using the command line compiler dcc32 or ecc32.
Things that didn't apply to my case:
Avoiding having dependencies on network shares.
Using DelphiSpeedUp, because I already had it.
Using a single folder for all dcu (I always do it)
Things that I didn't try:
Upgrading to another Delphi version.
Using dcc32speed.exe
Using a solid-state drive (I didn't try it, but I did try a ramdisk where I put all the source code; maybe I should have installed Delphi on the ramdisk too).
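Regarding the {$IFDEF} caveat above: a unit referenced only in one build configuration can be wrongly flagged as unused in another. A minimal sketch (the unit names are made up):

unit Logging;

interface

implementation

uses
{$IFDEF DEBUG}
  DebugTools,  // only referenced when DEBUG is defined; an automatic cleanup
{$ENDIF}       // run on a RELEASE build would consider this reference unused
  SysUtils;

end.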
Some things that could slow down the compiler
Redundant units in your uses clause. See this question for a link to CnPack.
Not explicitly adding units to your project file. You already seem to have covered that.
Changed compiler settings, most notably including TD32 debug info.
Try to get rid of unused units in your uses clause and see if it makes a difference.
Using Delphi 7 and 2009: last week I went from almost 2 minutes of compiling, plus another 45 seconds between hitting F9 and seeing the main form of my app, down to 20 seconds of compiling and running. This had been driving me crazy for about 6 months and nothing I tried seemed to work. Using Filemon from SysInternals, I realized that every unit (mostly components) the compiler required was searched for in every folder in the search path; yes, this produces a LOT of FileOpen, FileExists, FileNotFound, etc. What I did was put every DCU, DFM, RES, etc. from components into a single folder, and keep just this folder in the search path, plus a couple of other folders required by the project; the results were amazing. Another problem prior to the fix was debugging: it took almost 40 seconds on each F7 or F8 key press while debugging; this has been fixed too. Hope this info can help you. Greetings from Isla de Margarita, Venezuela. Excuse my English, if any error ;)
Check whether there are any paths in your search path that aren't on your local machine.
That is, don't link to binaries on network shares, and check that the search path doesn't reference any network shares.
I haven't seen the compiler get slower over time, but it's been a long time since we used Delphi 6.
It seems to be generally agreed upon in the Delphi community that, if you don't want to upgrade to the latest and greatest (Delphi 2007 or 2009), then Delphi 7 is the best/fastest/most stable. You might consider upgrading.
KibitzGetOverloads sounds like something from the kibitz compiler -- the "background" compiler that gives you code-completion, background error highlighting, code tooltips, etc. Sounds like you'd be better off checking the call stack of the command-line compiler, not the IDE; you'd get something more helpful.
I have never found compiles to be faster after deleting the DCUs. DCUs are there to make the build incremental, therefore faster. If you're seeing faster compiles after deleting all DCUs, check your hardware. Have you defragged your hard disk lately? How much free space do you have on the drive?
Have you set a single folder to receive the DCUs? If not, they will be scattered all over.
Put all the units and their implicitly called units (except installed components from the library path) in the dpr. To be sure you did not miss any, empty your search path; it should still compile.
After reducing the search path, you can try to reduce your library path by installing your components into fewer folders.
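For illustration, a stripped-down .dpr along those lines (unit names and paths are hypothetical); with every unit listed explicitly, it should compile even with an empty search path:

program MyApp;

uses
  Forms,
  MainForm in 'src\MainForm.pas' {FormMain},
  OrderUtils in 'src\common\OrderUtils.pas',
  DbLayer in 'src\db\DbLayer.pas';

{$R *.res}

begin
  Application.Initialize;
  Application.CreateForm(TFormMain, FormMain);
  Application.Run;
end.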
Although only partly relevant to your exact question, I hear that the use of a solid-state drive vastly improves compile times with Delphi - Nick Hodges said this himself on the Delphi Podcast a couple of weeks ago.
Brian
You can automatically get rid of unnecessary unit references, which is a very effective optimization for compiling speed.
In your situation, dividing your project into packages can improve compiling speed. That way, a recompilation only regenerates the modified package(s) instead of a single massive binary. Working with packages can also make deploying updates to your project easier.
Turn off your live scanning antivirus
We had the same (or similar) problem.
One of our packages had a compile time of about 12 minutes.
After the changes below, we are now down to 32 seconds.
After many tests we found that the "problematic situation" was the following:
In a single package:
Unit A uses a large number of units: U1, U2, U3, U4, ... U100 (uses in the interface section) in the same package. This is an important unit that centralizes all the initialization work.
All units of the package, U1, U2, U3, ..., U100, use unit A (uses in the implementation section).
This "circular reference" does not give compilation errors because the uses clauses are in different sections, but it caused a very long compile time.
SOLUTION:
Eliminate the reference to unit A from each of U1, U2, U3, ..., U100.
Now unit A still uses the large number of units U1, U2, ..., U100, but the units U1, U2, ..., U100 do not use unit A.
After this change the compile time went down drastically.
If you have a similar situation, you can try this.
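A minimal sketch of the before and after (unit names as in the description above; anything the Ux units needed from unit A should move to a small shared unit):

// BEFORE: unit A lists U1..U100 in its interface uses clause, and every Ux
// lists A in its implementation uses clause, closing the costly cycle.

unit UnitA;

interface

uses
  U1, U2; // ... up to U100

procedure InitializeAll;

implementation

procedure InitializeAll;
begin
  U1.Init;
  U2.Init;
end;

end.

unit U1;

interface

procedure Init;

implementation

uses
  UnitA; // <-- the back-reference to eliminate

procedure Init;
begin
end;

end.

// AFTER: delete "uses UnitA" from U1..U100; unit A keeps its uses of U1..U100.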
Excuse my bad English.
Greetings.
Neftalí -Germán Estévez-
I had the same problem and I can come up with two reasons it affected me.
Circular references. The gentleman who stated that one was correct. I had certain LARGE projects that would compile fast, and SMALL projects that compiled slowly; I could not figure it out until I restructured the code, and then I got the faster compile speeds. Lots of small units: it's easy to build monolithic units, but there are many penalties for it.
I've heard it a 1000 times, develop on a slow machine like your users might be using. Hey, that's for the testing department. I can't waste time with compiling, Delphi load speeds, packages, etc. I went out and bought a "GAMERS" computer (WOW) with the Solid State Drives (as mentioned earlier), 12GB RAM, OVERCLOCKED "i7" Intel chip, triple video cards (linked), all on Vista64 (Vista is not bad once it is finally running with all installed parts). It was a real pain to get it all set up. But, I am not waiting anymore on my computer. Pure compile speed, load speed, plus a new fresh machine without all of the crap that was installed on the last one over the last 2 years. I even unloaded DelphiSpeedUp. Did not need it. And I don't need to turn off AntiVirus, since I did that one as well, and got penalized with the internet crap. So AntiVirus stays on. Pure and simple, get a BALLS OUT machine. Your time is worth more than what you will spend on a new computer.
Try to install a ram disk and set your dcu output path to point there. This more than halved my compilation time with Delphi 2007 on top of DelphiSpeedUp.
The compiler will only compile units that have changed. If you have changed code in the interface section, all units that depend on the changed unit are recompiled. If only code in the implementation section is changed, the compiler will recompile only that unit, but presumably link all the modules. This implies a good design of interfaces up front, but if you restructure the code to restrict changes to the implementation sections, compile times might go down. I have no idea by how much. This fact is mentioned in the Delphi help files under "Multiple and indirect unit references" in Delphi 7 "Using Delphi".
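For example (a hypothetical unit): editing only the routine body below invalidates just this unit's DCU, while changing the declaration in the interface section forces every unit that uses OrderUtils to recompile as well:

unit OrderUtils;

interface

// Changing this declaration recompiles all units that use OrderUtils.
function OrderTotal(Price, Qty: Integer): Integer;

implementation

function OrderTotal(Price, Qty: Integer): Integer;
begin
  Result := Price * Qty; // changing only this body recompiles just this unit
end;

end.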
Do not compile on network drives. Seek time is dramatically worse.
Consider pointing your DCU ("unit output") directory to a ramdrive.
Limit the number of include/unit directories.
Try to avoid minor circular references that the compiler still accepts, especially for large units (e.g. generated ORM units for your OPF). They might cause large units to be compiled twice. (Does Delphi allow minor mutual circular references, or is that an FPC-only feature?)
I never tried it, but hardcoding all files with a full/relative path in the central .dpr might also help (with a script to regenerate/update it?). (You mention that above, but was it with the unit xx in '\path\yyy' notation?)
Other long shots:
Use Kylix (file/dir I/O under Linux is dramatically better in my experience (though that is from FPC experience)). Maybe we need a reversed cross-kylix :-)
Use a separate (Windows) build machine, and tweak NTFS over the registry to be less "safe" (which you don't care about, since everything is in a revision system to begin with). AFAIK these options can only be set globally for all filesystems, hence the separate system. Throw in a RAID array or a Raptor too.
Forget solid state. It's a nice buzz at the moment, but the high write ratio will kill it eventually (both lifetime and performance, once it gets fuller and can't allocate optimally anymore), and you need the expensive Intel ones to beat two $75 HDs in RAID.
P.s. Sorry for the FPC references. I do both, and I sometimes don't know anymore what belongs to what.
What I do is always make sure to have very few directories in the library path, holding most of the components and static code. I also make sure that NO source code is available in the library path, only .dcu/.res etc. Only the browse path has the source code, and special circumstances are handled through the search path for the project.
Just limit what you compile in any situation.
A few years later, I am struggling again with increasing compile times. I am currently using Delphi XE4 and I am at a point where I absolutely need to refactor the unit references. I thought about a new way to identify where the problems are:
I’m using Process Monitor from Microsoft/SysInternals to monitor the compiler:
I start Process Monitor with a filter to show only dcc32.exe (or bds.exe when working from the IDE).
I build my project from the command line.
At the end I look at the CreateFile operations in the log of Process Monitor.
For each unit there will be an entry for the .PAS file (when the compiler starts working on that unit) and one for the .DCU file (when the compiler is completely done with it). By working on the log with a text editor and/or with Excel I can extract this kind of information:
A kind of “tree”, where you recursively see in which order the units have been compiled.
For each unit the delay between “.PAS file opened“ and “.DCU file written”.
Then I try to interpret the results to find places where some refactoring would speed up the compile. It is not so easy, but I'm getting some encouraging results.
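To automate the text-editor/Excel step, here is a rough console sketch. It assumes the Process Monitor log was exported as CSV with the default columns (Time of Day, Process Name, PID, Operation, Path, Result, Detail); the column positions and the 'hh:mm:ss.fffffff' time format are assumptions to check against your own export:

program UnitCompileTimes;

{$APPTYPE CONSOLE}

uses
  SysUtils, Classes;

// Parse Process Monitor's 'hh:mm:ss.fffffff' time of day into seconds.
function TimeToSeconds(const S: string): Double;
var
  FS: TFormatSettings;
begin
  FS := TFormatSettings.Create;
  FS.DecimalSeparator := '.';
  Result := StrToInt(Copy(S, 1, 2)) * 3600 +
            StrToInt(Copy(S, 4, 2)) * 60 +
            StrToFloat(Copy(S, 7, Length(S) - 6), FS);
end;

var
  Lines, Fields, Opened: TStringList;
  I, J: Integer;
  Ext, UnitName: string;
  T: Double;
begin
  Lines := TStringList.Create;
  Fields := TStringList.Create;
  Opened := TStringList.Create; // unit name -> time its .pas was first opened
  try
    Lines.LoadFromFile(ParamStr(1)); // path to the exported ProcMon CSV
    Fields.StrictDelimiter := True;
    Fields.Delimiter := ',';
    for I := 1 to Lines.Count - 1 do // row 0 is the header
    begin
      Fields.DelimitedText := Lines[I];
      if (Fields.Count < 5) or (Fields[3] <> 'CreateFile') then
        Continue;
      Ext := LowerCase(ExtractFileExt(Fields[4]));
      UnitName := LowerCase(ChangeFileExt(ExtractFileName(Fields[4]), ''));
      T := TimeToSeconds(Fields[0]);
      if (Ext = '.pas') and (Opened.IndexOfName(UnitName) < 0) then
        Opened.Values[UnitName] := FloatToStr(T)
      else if Ext = '.dcu' then
      begin
        J := Opened.IndexOfName(UnitName);
        if J >= 0 then // elapsed time between .pas open and .dcu write
          Writeln(Format('%-40s %8.3f s',
            [UnitName, T - StrToFloat(Opened.ValueFromIndex[J])]));
      end;
    end;
  finally
    Opened.Free;
    Fields.Free;
    Lines.Free;
  end;
end.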
