File operations conflicts - Delphi

I'm writing a program that continuously looks for new files in a directory. After it extracts data from each file and processes it, the file is moved to another directory that holds all scanned files.
Imagine I copy a new file into the scanned directory while my program is running. Can a file that has not finished copying be processed (and so produce unforeseen results), or is it locked by the system?
Now, imagine two instances of the program are running on two different computers, continuously scanning the same folder. What can happen if both instances try to move the same file?
Thank you for your help.

I have a project that does much the same thing. Another application is receiving data from a feed and writing files to a folder. My application is processing those files by opening them, acting on them in some way, writing them to another folder, then deleting them.
The strategy I used in the application that does the processing and deleting is to simply open them like this:
LStream := TFileStream.Create(AFileName, fmOpenRead or fmShareDenyWrite);
If the file being opened is still being written by another process, the call above will fail, and the file can usually be opened successfully on a later iteration.
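In sketch form, that open-and-skip strategy looks something like this (the folder arguments and the processing step are illustrative placeholders, not code from the original application):

uses
  System.SysUtils, System.Classes, System.IOUtils;

// Hedged sketch: open each file with a share lock; if another process is
// still writing it, TFileStream raises EFOpenError and we skip the file
// until the next scan. Paths and the processing step are placeholders.
procedure ScanFolder(const AInbox, ADone: string);
var
  LFileName: string;
  LStream: TFileStream;
begin
  for LFileName in TDirectory.GetFiles(AInbox) do
  begin
    try
      LStream := TFileStream.Create(LFileName, fmOpenRead or fmShareDenyWrite);
    except
      on EFOpenError do
        Continue; // still being written; retry on a later iteration
    end;
    try
      // ... extract data from LStream and process it here ...
    finally
      LStream.Free;
    end;
    // move the finished file to the folder of scanned files
    TFile.Move(LFileName, TPath.Combine(ADone, TPath.GetFileName(LFileName)));
  end;
end;

Any file that is skipped is simply picked up again on the next pass of the scan.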

Related

Avoid reading the same file multiple times using Telegraf and file input plugin

I need to read CSV files inside a folder. New CSV files are generated every time a user submits a form. I'm using the "file" input plugin to read the data and send it to InfluxDB. These steps are working fine.
The problem is that the same file is read multiple times every data collection interval. I was thinking of a solution where I could move each file that has been read to a different folder, but I couldn't do that with Telegraf's "exec" output plugin.
PS: I can't change the way the CSV files are generated.
Any ideas on how to avoid reading the same CSV file multiple times?
As you discovered, the file input plugin reads entire files at each collection interval.
My suggestion is to use the directory monitor input plugin instead. It reads files in a directory, monitors the directory for new files, and parses only the ones that have not been picked up yet. There are also configuration settings in that plugin that make it easier to control when new files are read.
Another option is the tail input plugin, which tails a file and reads only new updates as they arrive. However, I think the directory monitor is more likely what you are after in your scenario.
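A hedged example of what that configuration might look like (the paths and CSV parser options are assumptions, not taken from this thread):

[[inputs.directory_monitor]]
  ## folder to watch for newly generated CSV files (placeholder path)
  directory = "/data/csv_incoming"
  ## files are moved here once parsed, so they are never read twice
  finished_directory = "/data/csv_done"
  ## files that fail to parse are moved here instead
  error_directory = "/data/csv_errors"
  ## parse each file as CSV with a single header row
  data_format = "csv"
  csv_header_row_count = 1

The finished_directory setting is what gives you the "move the file after reading" behaviour you were trying to build with the exec output plugin.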
Thanks!

How to unlock a file and delete it?

I have a program that uploads files to an FTP server. If the upload completes, I delete the files. Occasionally the files are not deleted, and because the upload is automated I get an infinite loop where 100 copies of the file reach the server. If I try to delete the file manually from Explorer I'm able to do it, but for some reason the DeleteFile command from my app doesn't work.
I've tried RaiseLastOSError and I get nothing.
The files are stored on a mapped drive. Is there any workaround? If I upload 30 files to an FTP server, sometimes one or two of them cannot be deleted after they have been sent to the server.
Is there any way to close the file even if it is opened by another program? Something like Unlocker does?
How can I see if my application is locking the file? I haven't implemented any locking mechanism inside the app.
Thanks.
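For reference, a hedged sketch of how the underlying OS error can be surfaced when DeleteFile fails (the function name, retry count and delay are arbitrary placeholders):

uses
  Windows, SysUtils;

// Hedged sketch: the Win32 DeleteFile call returns False on failure and
// GetLastError says why (e.g. ERROR_SHARING_VIOLATION while another
// process, such as a scanner or the FTP component, still holds a handle).
function TryDeleteFile(const AFileName: string): Boolean;
var
  Attempt: Integer;
begin
  Result := False;
  for Attempt := 1 to 5 do // arbitrary retry budget
  begin
    if Windows.DeleteFile(PChar(AFileName)) then
    begin
      Result := True;
      Exit;
    end;
    // log the real reason instead of failing silently
    Writeln(Format('Delete of %s failed (%d): %s',
      [AFileName, GetLastError, SysErrorMessage(GetLastError)]));
    Sleep(200); // give the other process time to release the handle
  end;
end;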

How to avoid intermittent Errno::ETXTBSY exceptions?

During part of a request in a Rails application, I copy a directory from one place to another; think of it as a working area. Sometimes this copy operation results in Errno::ETXTBSY exceptions being raised. I can't seem to pin down the case that causes it; any tips on detecting the case or avoiding it altogether?
I've made sure the destination directory is uniquely named, so it shouldn't be a case of two processes attempting to write to the same place. Beyond that I'm out of ideas.
ETXTBSY means that you're trying to open for writing a file which is currently being executed as a program, or that you're trying to execute a file which is currently open for writing. Since you say you're copying files, not executing them, it seems likely it's the former, not the latter.
You say you're targeting a unique new destination, but my guess is that's not entirely true: you're actually targeting an existing directory, and one of the files you're attempting to overwrite is currently mapped as the executable text segment of a running process.
You haven't posted any code, so it's hard to comment specifically. I suggest you add enough logging that you know exactly which file(s) are being processed and, specifically, the source and destination paths that throw the exception. Then you can use lsof to see which process has that file open.
One way to avoid the problem, if you are overwriting a currently open executable, is to first unlink the target file. The running process will still have the old inode mapped and will proceed merrily with the deleted file, while your open for write then creates a new file which won't conflict.
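In Ruby that unlink-first copy is short; a minimal sketch (safe_copy is an illustrative name, not code from this thread):

require 'fileutils'

# Hedged sketch: unlink the destination before copying. A process executing
# the old file keeps its (now unlinked) inode alive, and the copy creates a
# brand-new file, so the open-for-write never hits ETXTBSY.
def safe_copy(src, dest)
  File.unlink(dest) if File.exist?(dest)
  FileUtils.cp(src, dest)
end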

How to ensure that a file is correctly written to the file system?

I have an application that reads a file from a ZIP archive and saves it to a file on the file system. After writing it I immediately start reading this file with a SAX2 reader. On bigger files (300+ MB) it sometimes happens that SAX2 stops parsing because of an unclosed tag. But when I check the file (or even try to read it again later) it works, so the file itself is OK.
FZipKit.ExtractToStream(LFileName, LStream);  // unzip the entry into a stream
LStream.SaveToFile(OutputFilename);           // write the stream to disk
SAX2.processUrl(OutputFilename);              // parse the file straight away
My assumption is that the file was not yet fully written to the file system when I started the parsing process.
Is there a way to ensure that the file has been written, or that the stream has been flushed to the file system?
thx
I'm going to assume, first of all, that the XML parser operates correctly. If it were incapable of reading files, well, the solution would be obvious.
Which leads us to look at how the file is created. When you call SaveToFile, the file is opened, written and closed, and the buffers are flushed. On a plain vanilla system, your XML parser will see the entire content of the file. The only conclusion is that something is interfering. The most likely suspect is your virus scanner. Many scanners, even the most respected ones, cannot properly handle a file being closed and then immediately re-opened.
The bottom line is that your code is fine and the problem almost certainly lies with your local environment.
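That said, if you want to defend against a scanner-style transient lock anyway, one hedged option (my suggestion, not part of the original answer) is to wait until the file can be opened exclusively before parsing:

uses
  SysUtils, Classes;

// Hedged sketch: try to open the freshly written file exclusively; if a
// virus scanner still has it open, wait and retry. The retry budget of
// 10 x 100 ms is an arbitrary placeholder.
procedure WaitUntilReadable(const AFileName: string);
var
  Attempt: Integer;
  LProbe: TFileStream;
begin
  for Attempt := 1 to 10 do
  begin
    try
      LProbe := TFileStream.Create(AFileName, fmOpenRead or fmShareExclusive);
      LProbe.Free; // nobody else holds the file; safe to parse
      Exit;
    except
      on EFOpenError do
        Sleep(100); // someone still has it open; try again shortly
    end;
  end;
  raise Exception.CreateFmt('%s is still locked after waiting', [AFileName]);
end;

Calling WaitUntilReadable(OutputFilename) between SaveToFile and processUrl would then block until no other process holds the file.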

Does Windows.CopyFile create a temporary local file while source and destination are network shares?

I have a D2007 application that uses Windows.CopyFile to copy MS Word and PowerPoint files from one network folder to another network folder. Our organization is migrating from Vista to Windows 7. One of my migrated users got an error message that displayed a partial local folder (C:\Users\(username)\...\A100203.doc) during the copy. Does the CopyFile function cache a local copy of the document when it is copying from one network folder to another, or is it a direct write? I have never seen this error before, and the application has been running for years on Win95, Win98, Win2000, WinXP and Vista.
Windows.CopyFile does NOT cache the file on your hard drive... instead, it instructs Windows to handle the copying of the file itself (rather than you managing the streams in your own program). The output file buffer (destination) is opened, and the input buffer is simply read and written. Essentially this means that the source file is spooled into system memory, then offloaded onto the destination... at no point is an additional cache file created (that would slow file copying down).
You need to provide more specific information about your error... such as either the text or an actual screenshot of the offending error message. This will allow people to provide more useful answers.
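As a hedged illustration (the paths and the wrapper name are placeholders), checking CopyFile's return value and GetLastError would capture the exact OS error for such a report:

uses
  Windows, SysUtils;

// Hedged sketch: Windows.CopyFile returns False on failure; GetLastError
// then identifies the precise OS error worth including in a bug report.
procedure CopyWithDiagnostics(const ASource, ADest: string);
begin
  // third argument False = overwrite the destination if it already exists
  if not Windows.CopyFile(PChar(ASource), PChar(ADest), False) then
    raise Exception.CreateFmt('Copy of %s to %s failed (%d): %s',
      [ASource, ADest, GetLastError, SysErrorMessage(GetLastError)]);
end;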
The user that launches the copy will require read access to the original and write access to the target, regardless of caching (if the user has read access to the file, then the file can be written to a local cache, so caching/no-caching is irrelevant).
It's basic security to disallow someone from copying files/directories between machines just because the security attributes of the two machines happen to be compatible.
There's little else to say without the complete text of the error message.
