How can I use commands to check that the size of a new file is significantly smaller than the one it's replacing?
I have a batch file running every night which, amongst other things, calls an application UpdVMem.exe which creates a cut-down members database VMembers.Adt from the main file Members.Adt. This file is then moved to remote sites with the same script.
On a few sites, at periodic intervals, VMembers.Adt will be corrupted. I have no idea why, as we've ruled out the table being locked for editing (by my Delphi Membership software). It will often deceptively appear the same size but contain less than half the records.
Even better would be a set of commands which could detect this corruption or a failure in the execution of UpdVMem.exe, as the size alone is not the best indicator.
Thanks
You can check ERRORLEVEL in the batch file to test the exit code of UpdVMem.exe. If UpdVMem.exe is developed by you, then you can use the Halt procedure to return meaningful error codes to the batch file (e.g. when you catch an exception while creating the VMembers.Adt file).
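For illustration, here is a minimal sketch of the Halt approach, assuming UpdVMem.exe is (or can be wrapped by) a Delphi console program; the routine name and the record-count threshold below are hypothetical, not from your setup:

    program UpdVMem;
    {$APPTYPE CONSOLE}
    uses
      SysUtils;

    const
      MinimumExpectedRecords = 1000; // hypothetical sanity threshold

    // Placeholder for the real logic that builds VMembers.Adt from
    // Members.Adt and returns the number of records written.
    function BuildVMembersTable: Integer;
    begin
      Result := 0; // application-specific code goes here
    end;

    begin
      try
        if BuildVMembersTable < MinimumExpectedRecords then
          Halt(2); // suspiciously few records: treat as corruption
      except
        on E: Exception do
        begin
          Writeln(ErrOutput, 'UpdVMem failed: ', E.Message);
          Halt(1); // an exception was raised during the rebuild
        end;
      end;
      Halt(0); // success
    end.

The nightly batch file can then test the exit code with if errorlevel 1 and skip copying VMembers.Adt to the remote sites when the build looks suspect.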
I am using ShFileOperation to copy files to an SD card and it is working fine, almost!
I have some large files, 5 GB and greater. When the SD card is empty, this all progresses fine. But when I am updating the files on the SD card, ShFileOperation will check the remaining disk space, and if the file is larger than the free space it will show a "No room" dialog and abort.
The problem arises when the file will be overwriting an existing one and is probably only 3 MB or 4 MB larger with new stuff. ShFileOperation does not first check whether the destination file exists before checking for disk space.
I have checked all the available flags on the MSDN site, and the only one I can find is FOF_NOERRORUI, but that is a little too brutal and totalitarian for me: killing off all error messages just to overcome one problem.
Is there any way I can get ShFileOperation to not do that disk-space check, but still declare serious errors if they occur?
Thanks.
Is there any way I can get ShFileOperation to not do that disk-space check, but still declare serious errors if they occur?
You can use FOF_NOERRORUI to suppress the error UI, which is indeed exactly what you want. But then you need to provide UI for any errors yourself, since you asked the system not to. That flag essentially means, "let me take charge of reporting errors."
In this situation, I would suggest calling CopyFileEx() for each file and using its progress callback to update your own progress dialog as needed.
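As a rough illustration (a sketch, not drop-in code; console output stands in for the progress dialog), it could look like this in Delphi:

    uses
      Windows, SysUtils;

    // Progress callback invoked by Windows as the copy proceeds.
    function CopyProgress(TotalFileSize, TotalBytesTransferred, StreamSize,
      StreamBytesTransferred: LARGE_INTEGER; dwStreamNumber,
      dwCallbackReason: DWORD; hSourceFile, hDestinationFile: THandle;
      lpData: Pointer): DWORD; stdcall;
    begin
      if TotalFileSize.QuadPart > 0 then
        Writeln(Format('%d%% copied',
          [100 * TotalBytesTransferred.QuadPart div TotalFileSize.QuadPart]));
      Result := PROGRESS_CONTINUE; // PROGRESS_CANCEL would abort the copy
    end;

    procedure CopyWithProgress(const Src, Dst: string);
    var
      Cancel: BOOL;
    begin
      Cancel := False;
      // Unlike ShFileOperation, failures surface through the return value
      // and GetLastError rather than a system dialog, so you decide what
      // counts as a serious error and how to report it.
      if not CopyFileEx(PChar(Src), PChar(Dst), @CopyProgress, nil,
        @Cancel, 0) then
        raise Exception.Create('Copy failed: ' +
          SysErrorMessage(GetLastError));
    end;

Passing COPY_FILE_FAIL_IF_EXISTS in the last parameter would restore fail-on-overwrite behaviour if you ever need it; with 0, an existing destination file is simply overwritten.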
I am trying to use an application called CLUT.exe, an old MS-DOS application that can be used to reindex NTX files for DBF databases.
(This is not the main topic; I am just mentioning it in case someone wants to test the app and doesn't trust the content.)
The problem starts when trying to run the command-line version through the console (cmd.exe), and this error appears:
C:\>CLUT.exe [arg1] [arg2] [arg3]
run-time error R6009
- not enough space for environment
So, according to what I've searched, this could be a possible solution:
http://support.microsoft.com/default.aspx?scid=kb;en-us;230205
but it doesn't work, and every alternative I have found on the internet to solve this is the same.
Another alternative could be to right-click the .exe file, go to Properties, then the Memory tab, and increase the Initial environment memory from Auto to the maximum value, but that doesn't work either.
Well, I am stuck, and no "possible" solution is working for me. If someone is interested, knows more about this issue, and wants to test, you can download the application from here (click the green "Free Download" button):
http://www.filebasket.com/free/Development-Clipper-programming-language/clut-exe/13996.html
or directly from my DropBox:
https://dl.dropbox.com/u/15208254/stackoverflow/clut_214.rar
Just so you know, I am using Windows 7, and CLUT.exe is a Clipper-based app (an old programming language) that is meant to run under the Windows console (cmd.exe).
Wikipedia does mention other DOS emulators but, oddly, doesn't mention Bochs.
Reindexing NTX files is not a difficult thing to do, and it can be done with tools other than CLUT. For example, many of the utilities listed on this part of Download32 could be used. Otherwise, you could write your own using the Harbour Project or xHarbour. Or contact me off-list and I'll cook up something in Clipper 5.3.
LATER
If I read the README for CLUT correctly, it's a replacement for the DBU utility that comes with Clipper 5.x. I can supply you with a build of that if you're unsuccessful with other approaches.
I have two exe files and I want to run them at the same time. Is it possible to do this in Delphi 7? I've searched the internet but I couldn't find any answer...
Iman said in a comment:
NO, I run two exe files. The first one takes some files and produces an output for each file. The second exe runs at the same time and waits for the output of the first one, so when the first exe is working on the second input, the second exe is working on the first output of the first exe :) something like a pipeline
What I would do, then, is start the 2nd program first. It will then be ready for the output of the first one as soon as that output is produced, and there will be no delay.
Just execute one, and then the other. It's very difficult to make computers do anything at exactly the same time; a millisecond apart shouldn't kill you.
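For what it's worth, here is a minimal Delphi 7 sketch of that, using CreateProcess to start both programs back to back (the exe names are placeholders):

    program LaunchBoth;
    {$APPTYPE CONSOLE}
    uses
      Windows, SysUtils;

    procedure Launch(const CmdLine: string);
    var
      SI: TStartupInfo;
      PI: TProcessInformation;
    begin
      FillChar(SI, SizeOf(SI), 0);
      SI.cb := SizeOf(SI);
      // CreateProcess returns as soon as the new process has started;
      // it does not wait for it to finish, so both programs end up
      // running at (almost) the same time.
      if not CreateProcess(nil, PChar(CmdLine), nil, nil, False, 0,
        nil, nil, SI, PI) then
        RaiseLastOSError;
      // We don't wait on the process, but its handles must be closed.
      CloseHandle(PI.hThread);
      CloseHandle(PI.hProcess);
    end;

    begin
      Launch('Consumer.exe'); // start the waiting program first ...
      Launch('Producer.exe'); // ... then the one producing the output
    end.

Starting the consumer first follows the suggestion above: it is already waiting by the time the producer writes its first output.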
I have a project in Delphi 7. It is rather large, consisting of 40-odd forms and frames.
Recently, the compiler only allows me to compile the project once so I can run it; then on every re-compile the IDE hangs and I have to end the Delphi process. Before this occurs, my CPU goes to 50% (on a dual-core machine), so my deduction is that the compilation process has gone into an infinite loop. The executable it produces is not runnable and is usually at a fixed size after it hangs.
I was wondering how I can go about finding where this inconsistency in my project is. Other projects do not suffer from this same issue.
You can use Process Explorer to discover what the compiler is doing (reading a file, or ...).
Check the QC 3807 issue.
Check the system resources: free disk space, memory. Clean the temp folder. Check the disk for errors. Do you have an antivirus running? If yes, then try to turn it off.
Use "process of elimination", to see if it's something in your code.
First, make a backup of where you are, or commit to your CVS (you ARE using version control, right? RIGHT? Good.) Then revert your branch to an earlier version where it worked, and see if that compiles. If so, merge half of the changes from the present-day version. If that works, try the other half. Keep cutting things in half, and you'll find the code that causes the problem, by process of elimination.
Or, it may turn out to be something in the configuration. Carbonite may be your friend here.
You can either:
Enable "Compilation progress display" in the "Environment Options" window, in the "Preferences" tab.
Use the command-line compiler dcc32.exe to get detailed console output. (Note: dcc32.exe is the Delphi compiler; bcc32.exe is the C++ one.)
Both will let you know which file is hanging the compiler.
Take a look at the great DelphiSpeedUp tool, which allows you, for example, to abort Code Completion and HelpInsight with ESC or a mouse move.
This question requires a bit of backstory... At my company, we produce a set of PDF and HTML files. A very large set. The current build process (which I designed, in haste) is a Perl script that reads a set of files, where each file contains a new ant command to execute.
It is designed terribly.
Now, I'm trying to shift the entire project over to using ant for the majority of the tasks. Within a target, I can construct a list of files that need to be built, as either PDF or HTML. However, when I call the ant command to build each file, after about three builds (of, say, five), the entire process crashes with an OutOfMemoryError. Furthermore, my buildlog.xml ends up being something like 20 megs--it concatenates every ant command's output into one giant log, since they are all being called from a single target. With the earlier Perl solution, I was able to get a buildlog.xml for each ant command--simply save and rename the buildlog to something else.
Even if I set ant or Java heap sizes in my user.properties, I still fail with an OOM eventually. I wonder if an appropriate solution is to call <exec> to launch a script that does what I described: namely, call ant, rename the buildlog, and die--theoretically allocating and freeing up space better than one "giant" ant call. I am worried that I am heading down another "hacky" solution to a problem that's well-defined and can be entirely confined to ant. Then again, <exec> does exist for a reason, so should I not feel bad for using it?
As with most corporate software (at least software which has deadlines; if yours doesn't, please let me know where you work so I can try to get a job there), the first step is to get it working.
Then, worry about getting it working well.
For that first step, you can use any tool at your disposal, no matter how ugly you think it looks.
But you might want to make sure that the powers-that-be know that you've had to do all sorts of kludgy things to get it working for them, so that they will hopefully allow you to fix it up before maintenance has to start on it. You probably don't want to be maintaining a hideously ugly code base or design.
We've unleashed such wonders on the world as applications that shut themselves down nightly to avoid memory leaks (leaving the OS to restart them), "questionable" code placed at the other end of a TCP socket so that its crashing doesn't bring down the main application, and, I'm sure, many other horrors that my brain has decided to remove all trace of.