.NET Core using TraceListener -- need to flush and close, then reopen and append to the log file

I'm writing Trace output to a file, but I can't leave it open because the system never hits a point where I am sure everything is finished; i.e., it hangs.
I don't want to keep opening new files with every TRACE.
Can't I reopen the TraceListener so I can append to the file?
Much appreciated.
Chuck

ANSWER: it turns out that Tracing is easier to work with than I thought. I wrote a little C# application so I could try every combination of uses.
Once you open the listener in the application, you can write to it at will. Flushing empties the buffers (in my case I write to a file) and all of the output goes into the file.
Now you can CLOSE the file, and at that point you can either OPEN it again at the same place with the same name (WITHOUT losing anything already in the file) or just go ahead and write to it again. Either way, you can flush whenever you like and everything is appended to the file, or you can Close again and it is appended then (Closing automatically does a flush).
It certainly answers all my questions. I hope it does yours as well.
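For anyone who wants to see that cycle in code, here is a minimal C# sketch of open / write / flush / close / reopen; the file name trace.log and the class name are just examples, and the reopen-appends behaviour is exactly what the answer above describes.

using System.Diagnostics;

class TraceCycleDemo
{
    static void Main()
    {
        // Open a listener on the log file (trace.log is just an example name).
        var listener = new TextWriterTraceListener("trace.log");
        Trace.Listeners.Add(listener);

        Trace.WriteLine("first batch of output");
        Trace.Flush();                 // buffers are emptied into the file

        Trace.Close();                 // Close implies a final Flush; the file handle is released
        Trace.Listeners.Remove(listener);

        // Reopen later with the same file name: new writes are appended,
        // nothing already in the file is lost.
        var reopened = new TextWriterTraceListener("trace.log");
        Trace.Listeners.Add(reopened);
        Trace.WriteLine("written after reopening");
        Trace.Close();                 // flushes and closes again
    }
}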

Related

In PhpStorm, is there any way to avoid saving all open documents instead of just the current active document? [duplicate]

My traditional workflow must be a little different to the PHPStorm default. I often work on multiple files at the same time and want to be able to save just one file when I've finished with it, without saving the others that I've modified.
I've managed to turn off the auto-save feature. Now, when I edit files I get stars on the ones I've edited and they stay like that until I hit 'save'. So far so good.
But when I press CTRL-S to save, expecting it to save just the one file I'm looking at so I can go back to the ones with asterisks and polish them off too, it saves ALL the other files as well.
I hope there's some way to change this behaviour or set up something to allow me to save just one file at a time!
Yes, you can .. but that still does not change a lot (e.g. if you change your settings, or run/re-run something, all files will be saved automatically anyway). Eventually (after a few weeks or a month of adaptation) you will get used to this behaviour and quite likely will love it. Yes, this means changing your working habits a bit, which is quite hard to do (it requires time) for some people and in some cases.
Anyway ... to enable "save single file" functionality:
Settings | Keymap
On that screen, in search box type "save"
The action you are after is called "Other | Save Document"
Assign whatever shortcut you want.
P.S.
This action will NOT ask for confirmation (same behaviour as standard save does).
P.P.S.
This action is available since PhpStorm v7 ONLY.

How to get the content of a file which is being written by an application?

This application always creates a file when you activate a function (let's say, a log file). This file cannot be opened while the application is running -- but I need its content before the application closes (another process uses it, so I can't even view it). Is there a way to "hook" it somehow?
I'm working with Delphi, but I'll accept any other solution.
So, in summary, I need to know which file the application created (it always creates a different one, but in the same directory) and the content it wrote. Any help appreciated.
I found a workaround:
copy the file, and operate on the cloned one:
http://www.howtogeek.com/howto/windows-vista/backupcopy-files-that-are-in-use-or-locked-in-windows/
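If you want to automate that workaround, here is a hedged C# sketch: the directory path and file names are placeholders, and it assumes the writing application allows shared reads. If the file is locked exclusively, you still need a VSS-based copy such as the tool described in the linked article.

using System;
using System.IO;

class LockedFileSnapshot
{
    static void Main()
    {
        // Watch the directory where the application drops its files.
        // C:\app\logs is a placeholder path.
        using var watcher = new FileSystemWatcher(@"C:\app\logs");
        watcher.Created += (_, e) =>
        {
            Console.WriteLine($"New file: {e.FullPath}");
            TrySnapshot(e.FullPath, e.FullPath + ".copy");
        };
        watcher.EnableRaisingEvents = true;
        Console.ReadLine();   // keep the process alive while watching
    }

    static void TrySnapshot(string source, string target)
    {
        try
        {
            // FileShare.ReadWrite means *we* tolerate the writer keeping the file
            // open; this only works if the writer did not open it exclusively.
            using var src = new FileStream(source, FileMode.Open,
                                           FileAccess.Read, FileShare.ReadWrite);
            using var dst = File.Create(target);
            src.CopyTo(dst);
        }
        catch (IOException ex)
        {
            // Exclusively locked: fall back to a shadow-copy (VSS) based tool
            // like the one in the linked article.
            Console.WriteLine($"Could not copy: {ex.Message}");
        }
    }
}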

How to avoid intermittent Errno::ETXTBSY exceptions?

During part of a request in a Rails application, I copy a directory from one place to another; think of it like a working area. Sometimes this copy operation results in "Errno::ETXTBSY" exceptions being thrown. I can't seem to pin down the case that causes it. Any tips to detect the case or avoid it altogether?
I've made sure the destination directory is uniquely named, so it shouldn't be a case of two processes attempting to write to the same place. Beyond that I'm out of ideas.
ETXTBSY means that you're trying to open for writing a file which is currently being executed as a program, or that you're trying to execute a file which is currently open for writing. Since you say you're copying files, not executing them, it seems likely it's the former, not the latter.
You say you're targeting a unique new destination, but my guess is that's not entirely true and you're actually targeting an existing directory and one of the files you're attempting to overwrite is currently open as an executable text segment of a running process.
You haven't posted any code, so it's hard to comment specifically. I suggest you add enough logging so you know exactly what file(s) are being processed and specifically, the source and destination path that throws the exception. Then you could use lsof to see what process may have that file open.
One way to avoid the problem, if you are overwriting a currently open executable, is to first unlink the target file. The running process will still have the old inode mapped and will proceed merrily using the deleted file, but your open for write will then create a new file which won't conflict.
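The question is about Ruby on Rails, but as a rough sketch of that unlink-then-create pattern, here is the same idea in C# (the language used elsewhere on this page); the SafeOverwrite class and its parameters are made up for illustration, and the unlink semantics apply on Linux, where File.Delete() maps to unlink().

using System.IO;

static class SafeOverwrite
{
    // Copy source over target without hitting ETXTBSY when target is a running
    // executable: unlink the old file first, then create a fresh one.
    public static void Copy(string source, string target)
    {
        if (File.Exists(target))
            File.Delete(target);    // unlink: the old inode lives on for any running process

        File.Copy(source, target);  // creates a brand-new file, so no write conflict
    }
}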

How to ensure that a file is correctly written to the file system?

I have an application that reads a file from a ZIP archive and saves it to a file on the file system. After writing it to the file system I immediately start to read this file with a SAX2 reader. On bigger files (300+ MB) it sometimes occurs that SAX2 stops parsing because of an unclosed tag. But when I check the file (or even try to read it again later) it works, so the file itself is OK.
FZipKit.ExtractToStream(LFileName, LStream);
LStream.SaveToFile(OutputFilename);
SAX2.processUrl(OutputFilename);
My assumption is that the file was not yet fully written to the file system when I started the parsing process.
Is there a way to ensure that the file has been written, or that the stream has been flushed to the file system?
thx
I'm going to first of all assume that the XML parser operates correctly. If it is incapable of reading files, well the solution is obvious.
Which leads us to look at how the file is created. When you call SaveToFile, the file is opened, written, closed and the buffers are flushed. In a plain vanilla system, your XML parser will see the entire content of the file. The only conclusion is that something is interfering. The most likely suspect is your virus scanner. Many scanners, even the most respected ones, cannot properly handle a file being closed and then immediately re-opened.
The bottom line is that your code is fine and the problem almost certainly lies with your local environment.
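The snippet in the question is Delphi, but for a belt-and-braces version of "make sure it is on disk and closed before you parse it", here is a rough C# sketch; the method, its parameters and the use of XmlReader are stand-ins, and the explicit Flush(flushToDisk: true) is normally redundant because closing the stream already flushes it.

using System.IO;
using System.Xml;

class ExtractAndParse
{
    // zipEntryStream and outputFilename stand in for the Delphi variables
    // (LStream / OutputFilename) in the snippet above.
    static void SaveThenParse(Stream zipEntryStream, string outputFilename)
    {
        using (var file = new FileStream(outputFilename, FileMode.Create,
                                         FileAccess.Write, FileShare.None))
        {
            zipEntryStream.CopyTo(file);
            file.Flush(flushToDisk: true);   // ask the OS to commit the data to disk
        }                                     // Dispose closes the handle before parsing starts

        using var reader = XmlReader.Create(outputFilename);
        while (reader.Read()) { /* consume the document */ }
    }
}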

Need help opening printer spool shadow file (.SHD) that is locked

I'm interested in some information inside a shadow file (.shd) located inside the Windows print spooling directory "C:\Windows\System32\spool\PRINTERS". Every time a print job is started, a spool file (.spl) and a shadow file (.shd) are created in that directory. So far I have been successful in detecting when a print job has started, and have been able to pause that print job. If you don't pause the job, the files eventually make their way to the printer and are then deleted by Windows.
My problem is: I cannot open the .SHD files because they are locked in such a way that you cannot read them while they are held open by the print spooler. I've even tried going to the file in Windows Explorer and simply copying it to another file, and that didn't work either. The .SPL spool files I can open, though: I simply wait, and fairly quickly the spooler releases that file. The shadow file, however, it holds on to permanently. Unfortunately, it's the one I need.
The line of code I'm using specifically to open the file is as follows:
m_spoolJobStream = new FileStream(spoolFilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
The IOException I get is:
The process cannot access the file 'C:\Windows\system32\spool\PRINTERS\FP00083.SHD' because it is being used by another process.
So yes, it is being used by another process: the Windows print spooler service. But I don't think there is anything I can do about that. All I want to do is read the file; I don't want to make any changes to it. Is there anything I can do here, or am I just screwed?
Check the option "Keep printed documents" (if you have an HP printer) and then look in your spool folder; both the shadow and spool files should be there.
Well, I did not find a way around this problem. I suspect there is no solution for this and it is by design. However I did find another way to get the information I wanted (at least it seems so thus far).
I'm using the FindNextPrinterChangeNotification() routine out of the winspool.drv library. It returns a pointer to a PRINTER_NOTIFY_INFO structure, which in turn contains an array of PRINTER_NOTIFY_INFO_DATA structures. Within that array there is an element with its "Field" member set to "JOB_NOTIFY_FIELD_DEVMODE". That element carries a fairly large structure of type DEVMODE, which Microsoft documents here: http://msdn.microsoft.com/en-us/library/dd183565%28v=vs.85%29.aspx . This structure looks like it contains what I'm looking for, and apparently it is wrapped up in the .SHD file anyway according to this page: http://www.undocprint.org/formats/winspool/shd.
I'd still like to know what else is in that .SHD file, but I can't open it because it's locked while the job is paused, and I suspect it stays locked until the job is complete. Oh well, I think my new solution is more elegant anyway.
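As a rough sketch of wiring that notification API up from C#, the P/Invoke declarations and wait loop might look like the following; the printer name is a placeholder, and the PRINTER_NOTIFY_OPTIONS structure you would need to actually receive JOB_NOTIFY_FIELD_DEVMODE data is omitted, so this version only reports that a job changed.

using System;
using System.Runtime.InteropServices;

class SpoolWatcher
{
    [DllImport("winspool.drv", CharSet = CharSet.Auto, SetLastError = true)]
    static extern bool OpenPrinter(string pPrinterName, out IntPtr phPrinter, IntPtr pDefault);

    [DllImport("winspool.drv", SetLastError = true)]
    static extern IntPtr FindFirstPrinterChangeNotification(
        IntPtr hPrinter, uint fdwFilter, uint fdwOptions, IntPtr pPrinterNotifyOptions);

    [DllImport("winspool.drv", SetLastError = true)]
    static extern bool FindNextPrinterChangeNotification(
        IntPtr hChange, out uint pdwChange, IntPtr pPrinterNotifyOptions, out IntPtr ppPrinterNotifyInfo);

    [DllImport("winspool.drv")]
    static extern bool FreePrinterNotifyInfo(IntPtr pPrinterNotifyInfo);

    [DllImport("kernel32.dll")]
    static extern uint WaitForSingleObject(IntPtr hHandle, uint dwMilliseconds);

    const uint PRINTER_CHANGE_JOB = 0x0000FF00;
    const uint INFINITE = 0xFFFFFFFF;

    static void Main()
    {
        // "HypotheticalPrinter" is a placeholder; use a real queue name.
        if (!OpenPrinter("HypotheticalPrinter", out IntPtr hPrinter, IntPtr.Zero))
            throw new InvalidOperationException("OpenPrinter failed");

        // Without a populated PRINTER_NOTIFY_OPTIONS we only learn *that* a job
        // changed; the per-field data (including the DEVMODE) is not delivered.
        IntPtr hChange = FindFirstPrinterChangeNotification(
            hPrinter, PRINTER_CHANGE_JOB, 0, IntPtr.Zero);
        if (hChange == IntPtr.Zero || hChange == new IntPtr(-1))
            throw new InvalidOperationException("FindFirstPrinterChangeNotification failed");

        while (true)
        {
            WaitForSingleObject(hChange, INFINITE);   // the handle is signalled on a change
            FindNextPrinterChangeNotification(hChange, out uint change, IntPtr.Zero, out IntPtr pInfo);
            Console.WriteLine($"Printer change flags: 0x{change:X}");
            if (pInfo != IntPtr.Zero) FreePrinterNotifyInfo(pInfo);
        }
    }
}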
Just make sure you pause the job in the spool on BOTH your box and the server, then you should be able to copy/open/move the shd file just like you can the spl file. Worked for me, anyway...
This works for me:
- Hang your printer (e.g. jam the paper)
- Print and observe .SHD and .SPL being created
- Stop Print Spooler
- Open the file
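If you would rather script steps 3 and 4 than do them by hand, a hedged C# sketch might look like this; it assumes an elevated process and the standard "Spooler" service name, and the .SHD path is just the example file name from the error message above.

using System;
using System.IO;
using System.ServiceProcess;   // System.ServiceProcess.ServiceController package on .NET Core

class ShdSnapshot
{
    static void Main()
    {
        var spooler = new ServiceController("Spooler");

        // Step 3: stop the Print Spooler so it releases its lock on the .SHD file.
        spooler.Stop();
        spooler.WaitForStatus(ServiceControllerStatus.Stopped, TimeSpan.FromSeconds(30));

        // Step 4: the file can now be opened or copied freely.
        File.Copy(@"C:\Windows\System32\spool\PRINTERS\FP00083.SHD",
                  @"C:\temp\FP00083.SHD", overwrite: true);

        spooler.Start();   // restart spooling once the snapshot is taken
    }
}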
The problem might be the FileShare.ReadWrite parameter. Note, though, that FileShare specifies the access you are willing to grant other processes, not what you are requesting for yourself; you could try a more restrictive value such as FileShare.Read, but if the spooler has the file open without sharing, the open will fail either way.
