C++ Builder 2009 Context Look Ahead Taking a Long Time

I'm using C++ Builder 2009 and find that the context look-ahead (Ctrl+Space) takes a very long time.
My dev machine has decent hardware and is running Windows 7. Eclipse's look-ahead is very quick.
My C++Builder environment does have a lot of third-party components installed, but it should not take 13 seconds to display suggestions.
I'm looking for tips/suggestions on how to speed this up.

Code Insight is a feature that has a few problems. It is quite slow, particularly in older releases of C++Builder. Newer releases have gotten better, mostly thanks to the precompiled header wizard.
The trick to speedy Code Insight is to use precompiled headers. Have the PCH include as much of the VCL and standard library as you can.
You won't get as fast as Eclipse, but it will get better. Other than that, upgrade to XE3 if you can.
If Code Insight interrupts you as you type, increase its delay from 0 ms to about 250 ms so that the automatic invocation does not kill your flow.

Should I be doing any periodic cleanup on my IDE (Embarcadero RAD Studio)?

It seems like over time Embarcadero RAD Studio (I use XE4 Starter for programming in Delphi) has gotten more and more sluggish. I know that with other IDEs I've used, you sometimes end up with bloat and the detritus of temp files, cache files, and other unneeded things the IDE generates, and it can help to go in periodically and clean those things up manually.
Are there any particular file removals or other IDE maintenance tasks a programmer should do periodically to keep RAD Studio running smoothly at peak efficiency?
There are many angles to how IDE performance is affected, and there are essentially two levels to look at it on: what's installed in the IDE, and what's required by your immediate project.
IDE Maintenance
There's not necessarily anything you should have to do on a regular basis to your IDE. However, you should make sure that your library path is up to date. This is one of the things I most often need to manage in the IDE.
Then there's the third-party libraries, components, add-ons and fix-packs that you can install onto the IDE. Each one of these has the possibility of slowing things down, so make sure you only have things installed that you actually use.
Project Maintenance
IDE performance doesn't usually depend on project maintenance. However, a very large, complex project (or project group) may wind up slowing it down. Cleaning a project's DCU files shouldn't be necessary on any regular basis, but it can help to forcefully re-compile anything which the IDE may have neglected to keep up to date. I've seen issues which were solved by deleting DCU files and recompiling. Remember, DCU files are essentially compiled code, and they exist mostly to cache compilation so that only the units which have changed since the last compilation need to be compiled again. So cleaning up the DCUs will make the next compilation take a little longer than usual; still, that is not much of a pain.
Conclusion
There's a batch file I commonly use to clean up unnecessary files in different project folders. The batch file, shown below, simply deletes any temporary files I don't need to keep around. But it's only needed on the rare occasions when I'm cleaning up the source, not on any schedule.
del *.dcu
del *.dof
del *.dsk
del *.identcache
del *.local
del *.~*
del *.cfg
del *.dsm
del *.rsm
del *.otares
Note that the same or similar can be accomplished by choosing the "Clean" option in the IDE for any given open project.
Edit
You can also get the best IDE performance by keeping track of how your project interacts with the IDE, for example components or controls which perform heavy actions at design time (such as database connections with data-aware controls).
OP hasn't characterized the nature of HOW the IDE "has gotten more and more sluggish," so anything anybody says here is little more than random speculation.
First off, I've never used the Starter Edition of Delphi. Aside from that, I've never noticed any particular slow-down on different versions, other than what you'd expect from normal stuff. That is, if you load bigger files, it can take longer to do stuff in the IDE. More installed libs and components take longer to load at startup, but impose negligible impact inside the designer. Deleting files like those above can slow down builds because the compiler needs to reproduce them; but if you're not using that project, then there shouldn't be any impact whether they're present or not. And poorly designed components can take a toll on IDE performance.
It's entirely possible there may be some optimizations turned off in the Starter Edition, or things missing that might result in performance hits over time. It's not as "feature complete" as the full IDE, although I've never investigated the differences. I just stick to full released products because I use them for production work.
Also, maybe it's time you upgraded your computer. You could be dealing with limitations in virtual and physical memory that have nothing to do with Delphi, which is a rather large app in its own right. I run Delphi inside of a VMWare VM on a MacBook Pro, and it works fine unless the memory gets tight, then it can slow way down. (Again, without more specifics, it could really be tons of different things.)

Signal Processing Algorithm PSOLA or WSOLA in Delphi

I've been trying for a long time to find the PSOLA (Pitch Synchronous Overlap-Add) or WSOLA (Waveform Similarity Overlap-Add) algorithm, which are acoustic/signal-processing algorithms.
I found an implementation in C++, but I have no experience with C++ and it is difficult for me to port it to Pascal. Does anyone have this code in Pascal, or know where to find a version I can use?
Something like this example, which is in C++:
http://sourceforge.net/projects/mffmtimescale/files/v3%20stable/v3.9/WSOLA.v.3.9.zip/download?use_mirror=ufpr
Try the SoundTouch DLL; it comes with a Delphi import unit, so you can use the DLL directly. It should not be too hard to compile it: just download the free VC++ Express from the MS download site and compile it with that, or ask someone with VC++ to compile it for you.
FWIW, who knows, with a few modifications it might also compile with C++Builder.
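For illustration, here is a minimal Delphi sketch of talking to the DLL by loading it dynamically. The DLL file name and the soundtouch_* export names are assumptions from memory; verify them against the import unit that ships with SoundTouch before relying on this:

program SoundTouchSketch;

{$APPTYPE CONSOLE}

uses
  Windows, SysUtils;

type
  // Assumed C exports of the SoundTouch DLL wrapper; the shipped Delphi
  // import unit declares the real signatures for you.
  TCreateInstance = function: Pointer; cdecl;
  TDestroyInstance = procedure(Handle: Pointer); cdecl;
  TSetPitchSemiTones = procedure(Handle: Pointer; NewPitch: Single); cdecl;

var
  Lib: HMODULE;
  soundtouch_createInstance: TCreateInstance;
  soundtouch_destroyInstance: TDestroyInstance;
  soundtouch_setPitchSemiTones: TSetPitchSemiTones;
  Instance: Pointer;
begin
  Lib := LoadLibrary('SoundTouchDLL.dll');  // assumed DLL file name
  if Lib = 0 then
    raise Exception.Create('SoundTouch DLL not found');
  try
    @soundtouch_createInstance := GetProcAddress(Lib, 'soundtouch_createInstance');
    @soundtouch_destroyInstance := GetProcAddress(Lib, 'soundtouch_destroyInstance');
    @soundtouch_setPitchSemiTones := GetProcAddress(Lib, 'soundtouch_setPitchSemiTones');
    if not Assigned(soundtouch_createInstance) then
      raise Exception.Create('Export not found - check the import unit');

    Instance := soundtouch_createInstance();
    try
      soundtouch_setPitchSemiTones(Instance, 2.0);  // shift pitch up two semitones
      // Feed audio with the put/receive sample routines declared in the import unit.
    finally
      soundtouch_destroyInstance(Instance);
    end;
  finally
    FreeLibrary(Lib);
  end;
end.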
About a decade ago I used Praat in FSeqEdit (a Delphi program) to do this type of calculation, and I think the same approach would still work fine today.
I wrote some Praat scripts and executed them via praatcons.exe (the console version of Praat). You can download the console version from this page:
http://www.fon.hum.uva.nl/praat/download_win.html
That works pretty well.
I usually take this approach:
I manually check what kind of conversions and calculations need to be done in the GUI version of Praat. Once I find what I need, I create a script for it and run that with the console version.
Praat is very powerful, so if you didn't know about it yet, make sure to check it out.
There's a page that shows how to work with PSOLA resynthesis here:
http://www.ling.ohio-state.edu/~kyoon/praat-tut/praat-tut2.html
Let me know if you want to see some example code on how to integrate it into your Delphi application (it's pretty straightforward actually).
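Since you mention example code: here is a minimal sketch of driving the console version from Delphi. The script name and parameters are placeholders, and the convention of passing the script file first, followed by its arguments, should be checked against the Praat scripting documentation:

unit PraatRunner;  // hypothetical unit name

interface

procedure RunPraatScript(const ScriptFile, InputWav, OutputWav: string);

implementation

uses
  Windows, SysUtils, ShellAPI;

procedure RunPraatScript(const ScriptFile, InputWav, OutputWav: string);
var
  Params: string;
begin
  // praatcons.exe <script> <arguments...> runs the script without a GUI window.
  Params := Format('"%s" "%s" "%s"', [ScriptFile, InputWav, OutputWav]);
  ShellExecute(0, 'open', 'praatcons.exe', PChar(Params), nil, SW_HIDE);
  // ShellExecute returns immediately; switch to CreateProcess plus
  // WaitForSingleObject if you need to wait for the output file to be written.
end;

end.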

How can I load a package and keep the debugger working?

I'm using TJvPluginManager in the JVCL to create and load BPL-based plugins for my program. Problem is, one of the plugins isn't loading properly, and I can't debug it. Every time I try to trace into the loading sequence, it gets as far as the LoadLibrary API call, and then the debugger seems to forget what it's there for. It completely loses the ability to associate program code with source lines, give meaningful data in a call stack, or display local variables. It will still stop at breakpoints, but it breaks to the CPU window, with all the inline source code stripped out.
This happens on Delphi 2007 and 2009, and it's driving me nuts. Does anyone know how to load a plugin without it breaking the debugger? Does anyone even know why it's breaking it in the first place?
NOTE: I'm not looking for alternative methods of debugging. I know all about tracing and logging and all the rest. What I want is to understand what's going wrong and how to fix it. Surely I'm not the only person who's ever used TJvPluginManager?
Not quite the answer to your question: Have you tried to debug the package project, by setting the host application and putting a breakpoint into the package's startup code?
I've found Ray Kanopka's (Raize) CodeSite to be invaluable for debugging in situations where the integrated debugger is acting up. Thinking about the things I want to monitor using CodeSite actually helps me focus on what's important - it enforces good habits.
Another alternative to CodeSite is Overseer, which is part of the Nexus project but stands alone, so it does not require you to use their framework. CodeSite is by far the better option, but in a pinch Overseer would work just as well.
I found that using packages for plugins can be problematic, and many years ago I switched to a completely COM-based implementation for plugins and never had any problems. COM-based plugins have other advantages: they don't require Delphi to write, they don't need to be recompiled when the main app switches to a new version of the compiler (my plugins compiled with Delphi 5 still run fine against the main application compiled in Delphi 2009!), and it is easier to write test applications to assist in debugging.
The only side effects I notice are that shared code ends up in both executables, and that the plugins require registration in the registry.
Hmmm... this is a stupid question, but I have to ask: does the initialization function have the EXACT same declaration syntax as the other plugins that work? (From your question I deduced that you have made others that do work.)
Check your dependencies. Make sure each unit is compiled into one package only. Whenever a package needs to reference a unit from another package, use the requires clause to do so. Watch for compiler warnings about implicitly linked units.
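For illustration, a plugin package's .dpk might end up looking something like this once the shared units are moved into their own runtime package (SharedInterfaces and PluginMain are placeholder names):

package MyPlugin;

requires
  rtl,
  vcl,
  SharedInterfaces;  // hypothetical runtime package owning the units both host and plugins share

contains
  PluginMain in 'PluginMain.pas';

end.

The idea is that any unit shared between the host and its plugins lives in exactly one package named in requires, so no second copy of it gets linked into the BPL.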

Delphi: How to organize source code to increase compiler performance?

I'm working on a large Delphi 6 project with quite a lot of dependencies. It takes several minutes to compile the whole project. Recompilation after a few changes is sometimes much longer, so much so that it is quicker to terminate Delphi, erase all the DCU files, and recompile everything.
Does anyone know a way to identify, what makes the compiler slower and slower? Any tips how to organize the code to improve compiler performance?
I have already tried the following things:
Explicitly include most of the units in the dpr instead of relying on the search path: It didn't improve anything.
Use the command line compiler dcc32: it isn't faster.
Try to see what the compiler does (using Process Explorer from SysInternals): apparently it spends most of its time in a function called 'KibitzGetOverloads'. But I can't do anything with this information...
EDIT, Summary of the answers until now:
The answer that worked best in my case:
The function "Clean unused units references" from cnpack. It almost automatically cleaned more than 1000 references, making a "cold" compilation about twice faster. ("cold" compilation = erase all dcu files before compiling). It gets the reference list from the compiler. So if you have some {$IFDEF } check that all your configurations still compile.
The next thing I would like to try:
Refactoring the unit references manually (possibly using an abstract class),
but that is much more work, since I first need to identify where the problems are. Some tools that might help:
GExperts adds a project dependencies browser to the Delphi IDE (but unfortunately it cannot show the size of each branch).
Delphi Unit Dependency Viewer V1.0 does about the same thing, but without Delphi. It can calculate some simple statistics (which unit is the most referenced, ...).
Icarus, which is referenced by a link in one of the answers.
Things that didn't change anything in my case:
Putting every file from my program and all components in one folder, without subfolders.
Defragmenting the disk (I tried with a ramdisk)
Using a ramdisk for the code source and output folders.
Turning off the live scanning antivirus
Listing all the units in the dpr file instead of relying on the search path.
Using the command line compiler dcc32 or ecc32.
Things that didn't apply to my case:
Avoiding having dependencies on network shares.
Using DelphiSpeedUp, because I already had it.
Using a single folder for all DCUs (I always do that).
Things that I didn't try:
Upgrading to another Delphi version.
Using dcc32speed.exe
Using a solid-state drive (I didn't try it, but I did try a ramdisk holding all the source code; maybe I should have installed Delphi on the ramdisk too).
Some things that could slow down the compiler
Redundant units in your uses clause. See this question for a link to CnPack.
Not explicitly adding units to your project file. You already seem to have covered that.
Changed compiler settings, most notably including TD32 debug info.
Try to get rid of unused units in your uses clause and see if it makes a difference.
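As a trivial illustration, pruning a uses clause is just a matter of deleting what the unit never references (the unit names below are made up; DBGrids and Printers stand in for whatever your unit no longer needs):

// Before: DBGrids and Printers are listed but nothing in this unit references them
uses
  SysUtils, Classes, DBGrids, Printers, OrderCalc;

// After: only what the unit actually needs
uses
  SysUtils, Classes, OrderCalc;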
Using Delphi 7 and 2009: last week I went from almost 2 minutes of compiling, plus another 45 seconds between hitting F9 and getting the main form of my app, down to 20 seconds of compiling and running. This had been driving me crazy for about 6 months and nothing I tried seemed to work. Using Filemon from SysInternals, I realized that every unit (mostly components) the compiler required was searched for in every folder in the search path; yes, this produces a LOT of FileOpen, FileExists, FileNotFound, etc. What I did was put every DCU, DFM, RES, etc. from the components into a single folder, and keep only that folder, plus a couple of other folders required by the project, in the search path; the results were amazing. Another problem before the fix was debugging: it took almost 40 seconds for each F7 or F8 key press while debugging, and this has been fixed too. Hope this info can help you. Greetings from Isla de Margarita, Venezuela. Excuse my English, if there are any errors ;)
Check whether there are any paths in the search path that aren't on your local machine.
I.e., don't link to binaries on network shares, and check that the search path isn't referencing any network shares.
I haven't seen the compiler get slower over time, but it's been a long time since we used Delphi 6.
It seems to be generally agreed upon in the Delphi community that, if you don't want to upgrade to the latest and greatest (Delphi 2007 or 2009), then Delphi 7 is the best/fastest/most stable. You might consider upgrading.
KibitzGetOverloads sounds like something from the kibitz compiler -- the "background" compiler that gives you code-completion, background error highlighting, code tooltips, etc. Sounds like you'd be better off checking the call stack of the command-line compiler, not the IDE; you'd get something more helpful.
I have never found compiles to be faster after deleting the DCUs. DCUs are there to make the build incremental, therefore faster. If you're seeing faster compiles after deleting all DCUs, check your hardware. Have you defragged your hard disk lately? How much free space do you have on the drive?
Have you set a single folder to receive the DCUs? If not, they will be scattered all over the place.
Put all the units and their implicitly called units (except installed components from the library path) in the dpr. To be sure you did not miss any, empty your search path; the project should still compile.
After reducing the search path, you can try to reduce your library path by installing your components into fewer folders.
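For example, a .dpr that names its units explicitly might look like the sketch below (project, unit and folder names are just illustrative); anything listed with an in clause no longer has to be found through the search path:

program BigApp;

uses
  Forms,
  MainForm in 'src\ui\MainForm.pas' {frmMain},
  OrderLogic in 'src\logic\OrderLogic.pas',
  OrderStorage in 'src\logic\OrderStorage.pas';

{$R *.res}

begin
  Application.Initialize;
  Application.CreateForm(TfrmMain, frmMain);
  Application.Run;
end.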
Although only partly relevant to your exact question, I hear that using a solid-state drive vastly improves compile times with Delphi - Nick Hodges said this himself on the Delphi Podcast a couple of weeks ago.
Brian
You can automatically get rid of unnecessary unit references, which is a very efficient optimization for compile speed.
In your situation, dividing your project into packages can also improve compile speed. That way, a recompilation only regenerates the modified package(s), not a single massive binary. Working with packages can also make it easier to deploy updates to your project.
Turn off your live scanning antivirus
We had the same (or similar) problem.
One of our packages had a compile time of about 12 minutes.
After the changes, we are now down to 32 seconds.
After many tests we found that the "problematic situation" was the following:
In a single package:
Unit A uses a large number of units: U1, U2, U3, U4, ... U100 (in its interface uses clause), all in the same package. This is an important unit that centralizes all the initialization work.
All units of the package, U1, U2, U3, ..., U100, use unit A (in their implementation uses clause).
This "circular reference" does not give compilation errors because the uses clauses are in different sections, but it caused a very long compile time.
SOLUTION:
Eliminate the reference to unit A from each of the units U1, U2, U3, ..., U100.
Now unit A still uses the large number of units U1, U2, ..., U100, but the units U1, U2, ..., U100 no longer use unit A.
After this change the compile time went down drastically.
If you have a similar situation, you can try this.
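A minimal sketch of the shape of the problem and of the fix (unit and procedure names are only illustrative):

// Before: UnitA lists U1..U100 in its interface, and each Ui lists UnitA
// back in its implementation, so a change to UnitA touches everything.
unit U1;

interface

procedure Setup;

implementation

uses
  UnitA;  // <-- this back-reference is the one to eliminate

procedure Setup;
begin
  // ... initialization for this unit ...
end;

end.

// After: U1 no longer uses UnitA at all; UnitA still lists U1..U100 in its
// interface and calls U1.Setup, U2.Setup, ... from its central routine.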
Excuse my bad English.
Greetings.
Neftalí -Germán Estévez-
I had the same problem, and I can come up with two reasons it affected me.
Circular references. The gentleman who stated that one was correct. I would have certain LARGE projects that would compile fast, and SMALL projects that compiled slowly. I could not figure it out until I restructured the code, and then I got the faster compile speeds. Lots of small units: it's easy to build monolithic units, but there are many penalties from doing so.
I've heard it a 1000 times, develop on a slow machine like your users might be using. Hey, that's for the testing department. I can't waste time with compiling, Delphi load speeds, packages, etc. I went out and bought a "GAMERS" computer (WOW) with the Solid State Drives (as mentioned earlier), 12GB RAM, OVERCLOCKED "i7" Intel chip, triple video cards (linked), all on Vista64 (Vista is not bad once it is finally running with all installed parts). It was a real pain to get it all set up. But, I am not waiting anymore on my computer. Pure compile speed, load speed, plus a new fresh machine without all of the crap that was installed on the last one over the last 2 years. I even unloaded DelphiSpeedUp. Did not need it. And I don't need to turn off AntiVirus, since I did that one as well, and got penalized with the internet crap. So AntiVirus stays on. Pure and simple, get a BALLS OUT machine. Your time is worth more than what you will spend on a new computer.
Try to install a ram disk and set your dcu output path to point there. This more than halved my compilation time with Delphi 2007 on top of DelphiSpeedUp.
The compiler will only compile units that have changed. If you have changed the code in the interface section, all units that depend on the changed unit are recompiled. If only code in the implementation section is changed, the compiler will recompile only that unit, but it will presumably still link all the modules. This implies designing the interfaces well up front, but if you restructure the code to restrict changes to implementation sections, compile times might go down. I have no idea by how much. This fact is mentioned in the Delphi help files under "Multiple and indirect unit references" in Delphi 7 "Using Delphi".
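A small illustration (the unit and the names in it are made up): in the unit below, editing only the routine body recompiles just this unit, while editing the declaration in the interface section forces every unit that uses it to be recompiled as well.

unit PriceRules;

interface

function NetPrice(Gross: Currency): Currency;  // changing this line recompiles every user of PriceRules

implementation

function NetPrice(Gross: Currency): Currency;
begin
  Result := Gross * 0.79;  // changing only this body recompiles PriceRules alone
end;

end.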
Do not compile on network drives. Seek time is dramatically worse.
Consider pointing your DCU ("unit output") directory to a ramdrive.
Limit the number of include/unit directories.
Try to avoid minor circular references that the compiler still accepts, especially for large units (e.g. generated ORM units for your OPF). They might cause large units to be compiled twice. (Does Delphi allow minor mutual circular references, or is that an FPC-only feature?)
I never tried it, but hardcoding all files with a full/relative path in the central .dpr might also help (a script to regenerate/update it?). (You mention that above, but was it with the xx in '\path\yyy' notation?)
Other long shots:
Use Kylix (file/dir I/O under Linux is dramatically better in my experience (though that is from FPC experience)). Maybe we need a reversed cross-kylix :-)
Use a separate (windows) build machine, and tweak NTFS over the registry to be less "safe". (which you don't care for, since everything is a revision system to begin with). Afaik these options can only be done global for all filesystems, hence the separate system. Throw in a raid array or Raptor too.
Forget solid state. Nice buzz atm, but the high write ratio will kill it eventually (both life and performance when it gets fuller and can't optimally allocate anymore), and you need the expensive intel ones to beat two $75 HD's in RAID.
P.s. Sorry for the FPC references. I do both, and I sometimes don't know anymore what belongs to what.
What I do is always make sure to have very few directories in the library path, containing most of the components and static code. I also make sure that NO source code is available in the library path, only .dcu/.res etc. Only the browse path has the source code, and special circumstances are handled through the search path for the project.
Just limit what you compile in any situation.
A few years later I am struggling again with increasing compiling times. I am currently using Delphi XE4 and I am at a point where I absolutely need to refactor the units references. I thought about a new way to identify where are the problems:
I’m using Process Monitor from Microsoft/SysInternals to monitor the compiler:
I start Process Monitor with a filter to show only dcc32.exe (or bds.exe when working from the IDE).
I build my project from the command line.
At the end I look at the CreateFile operations in the log of Process Monitor.
For each unit there will be an entry for the .PAS file (when the compiler starts working on this unit) and one for the .DCU file (when the compiler is completely done with this unit). By working on the log with a text editor and/or with Excel I can extract this kind of information:
A kind of “tree”, where you recursively see in which order the units have been compiled.
For each unit the delay between “.PAS file opened“ and “.DCU file written”.
Then I try to interpret the results to find places where doing some refactoring would speed the compile time. It is not so easy, but I’m getting some encouraging results.

Cross-platform development - Delphi 2011: How to make a Windows-tied library cross-platform?

As you perhaps already know, the next version of Delphi will most probably be cross-platform. There are also some polls on the matter.
While writing a cross-compiler isn't something that interests us very much now, porting a library which was/is Windows-tied to multiple platforms certainly does.
Think, for example, of the VCL (Delphi's standard library). While it was designed for Windows only, there is value in it, and, of course, there are huge codebases which depend on it.
The question is:
Which would be the best approach to make an application/library cross-platform aware, ensuring a smooth conversion/upgrade path (as much as possible, of course)?
I stress it again: we are not interested only in which is the best way to do cross-platform development (there have been questions on that theme). We are also interested in yet another requirement: management of the old code base and existing installations.
PS: Experiences and/or methodologies from similar situations with other languages (eg. C/C++) which are regarded as standard practices are welcomed.
Thanks in advance.
Visual component developer's perspective:
Add levels of functionality to your code, so as to be able to add another platform without changing the "Core" of the component.
The compiler will hopefully have a platform switch (preferably more than one, working in conjunction with each other, e.g. Windows/ARM, Windows/386, OSX/Cocoa/386, Linux/Gnome/386).
The Layout structure might look something like this.
ComponentJ.pas
Linux\ComponentJ.pas
Linux\Gnome\ComponentJ.pas
Linux\KDE\ComponentJ.pas
OSX\ComponentJ.pas
386\ComponentJ.pas
ARM\ComponentJ.pas
As an Application Developer:
I'll start by moving all Win API calls in my code into a group of libraries in a Windows directory, so as to be able to IFDEF them at the library level and translate them to another platform I'd like to support as soon as the compiler becomes available (but only as I come across them).
This will also make it easier to add adapters for the new platforms.
In any case, it is good practice to move such dependencies into a central place.
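A sketch of what one such adapter unit could look like (the unit name is a placeholder, and the non-Windows branch is just a stub until a real implementation exists):

unit PlatformPaths;  // hypothetical adapter unit kept in the Windows folder

interface

function GetTempFolder: string;

implementation

{$IFDEF MSWINDOWS}
uses
  Windows;

function GetTempFolder: string;
var
  Buf: array[0..MAX_PATH] of Char;
begin
  // The Win API call is isolated behind the adapter.
  SetString(Result, Buf, GetTempPath(MAX_PATH, Buf));
end;
{$ELSE}
function GetTempFolder: string;
begin
  Result := '/tmp/';  // placeholder until a real implementation exists for the target platform
end;
{$ENDIF}

end.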
IMHO you can't build a cross-platform Delphi and ensure a smooth transition for current VCL applications. It won't work. The VCL was (luckily, because it allowed for great applications) designed with Windows in mind, and trying to design a compatible library working on a different platform would just mean longer development cycles and lots of compromises. The outcome would be a library no one would wish to use. Look at what happened to VCL.NET: it was the wrong choice. And it was working on the same OS!
We know that targeting non-Windows platform with native applications needs a native GUI library. We don't care about creating a GUI from scratch, for our application it's the way to go, we don't need Windows GUIs with all their standards under a different OS using different standards - we need to be able to code a fully native GUI for the target OS.
Other applications may survive a GUI porting, but in the long run you don't get a real xplatform tool - you get a tool that may compile for other platforms but brings one platform paradigms to others - and it will also be not welcome by "native" developers on other platforms. If you're a Linux or Mac developer, why should you learn how to work with a library that carries its Windows inheritance to your platform? You'd find it a pain in the ass. If Embarcadero wants to sell XDelphi outside actual developers base, it has to offer much more than a new CLX.
I will pull from some ancient experience in making a code base cross-compilable between Windows and DOS (Delphi 1 / Turbo Pascal 7). The rule of thumb was to separate code into multiple units. Try to code WITHOUT using windows, messages or any visual components. If you find you need to make a call to one of these, then place that call in another unit and write a proxy (an abstract class that you descend from works well) to dispatch the calls through. When a cross-compatible version is released, all you should have to do is code the other side of the proxy for the new target.
If you're designing a form-based system, then try to stick with as many of the standard components as possible. NEVER implement any "business" rules directly in an event; instead, place them in another unit and call into that unit to perform the logic.
Now, there will definitely be changes required to get your final project cross compatible, but by following these simple patterns you should be able to greatly reduce the amount of work it will take.
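A rough sketch of the proxy idea mentioned above (all names are invented for the example); the Windows-specific descendant lives in its own unit, so the shared code never references Windows directly:

// --- UIProxy.pas: platform-neutral proxy, contains no Windows references ---
unit UIProxy;

interface

type
  TUIProxy = class
  public
    procedure ShowInfo(const AText: string); virtual; abstract;
  end;

var
  UI: TUIProxy;  // assigned at startup with a platform-specific descendant

implementation

end.

// --- WinUIProxy.pas: the Windows side of the proxy, kept in its own unit ---
unit WinUIProxy;

interface

uses
  UIProxy;

type
  TWinUIProxy = class(TUIProxy)
  public
    procedure ShowInfo(const AText: string); override;
  end;

implementation

uses
  Windows;

procedure TWinUIProxy.ShowInfo(const AText: string);
begin
  MessageBox(0, PChar(AText), 'Information', MB_OK);
end;

end.

At startup the Windows build would assign UI := TWinUIProxy.Create, and the shared code only ever calls UI.ShowInfo(...), so porting means writing another descendant rather than touching the callers.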
Experience so far has shown that the best way to get a Delphi app compatible with future versions is to stick to pure Delphi components, and use nothing third party. Such an app will probably suck, but that's how it seems to me. I use lots of third party components, and the apps are great and successful. But the chances of them moving to this future too are not certain, and that may cause problems with such changes, but I'd rather have a great app now and have the problem than have a poor app now and not need to worry about it.
Not too many compromises should be made to make the VCL compatible with Linux and Mac. Windows is the VCL's root. I would prefer a new and very clean GUI framework, even without any backward compatibility. Making the VCL fatter and fatter isn't a good idea!
make a cross-platform Pascal compiler
make a cross-platform RTL
put the QT on top
Well, look at freepascal and lazarus
I don't get it. All .NET apps look the same to me, provided we don't use any third-party controls.
Delphi using standard controls is already fully functional, but your app would look like thousands of others.
I think Embarcadero should go for PDAs, iPhone and Android, as the Windows desktop already eats about 98% of the market.
Mac is expensive and Linux costs nothing at all, so there is no use going for Mac and Linux. It is not worth the investment.
Well, aside from the things already said - thanks, all - I do think that we need some additional things:
we need tooling to do the necessary conversions
we need tooling to help us program against (some form of) an MVC pattern
Simply pick the latest Qt 4.6 and add good integration between Pascal and the Qt library.
They have done it before (in the Kylix times). Qt is so powerful these days.
I believe that Qt is even better than the VCL and at least 10 times more frequently updated and fixed.
So the plan is simple:
make a cross-platform Pascal compiler
make a cross-platform RTL
put the QT on top
and you will have first-class, native-looking applications on all platforms.
My opinion:
Make a cross-platform compiler (OS X / Linux / embedded solutions? / Symbian?). Maybe add the ability to compile/convert Pascal code into portable C/C++ code, to build it on embedded platforms.
The RTL has to be separated into a cross-platform layer and a native layer (as with the JCL).
Add new core components for cross-platform compatibility and native components for each supported platform (Qt, for example).
Add translation utilities to create/convert between each platform's components, for example to convert a pure Windows form into a Mac OS X Cocoa form.
The whole Windows hierarchy of components only has to be upgraded to support x64 with maximum backward compatibility. All cross-platform components have to live in a parallel hierarchy.
The next version of the cross-platform solution can then be refactored and can include a migration/conversion utility. Thanks to the minimal codebase of the cross-platform solution, its hierarchy and classes can be heavily changed from version to version to achieve the best architecture.
Sorry for my English; it is not my native language (Russian is)!
Make C/C++/Delphi compilers that target OSX/Linux
Make a C/C++ compiler that can use Boost
Write a new VCL-Presentation Foundation (VGScene/WPF-alike); it should not be backward compatible! The Delphi IDE should be written with such a VCL-PF
The component library should stay as it is (but with improved data binding)
Only provide a 64-bit VCL for Win64
Is this a problem?
