I'm using SWFUpload to upload some Excel files to my server. When the uploaded file is in the old binary format (.xls, Excel 2003 and earlier), everything works fine: I can upload the file, redownload it, and confirm it is identical.
The problem occurs whenever I upload a file in the new Open XML format (.xlsx, Excel 2007 and later). When I redownload that file and open it, I get an error:
Excel found unreadable content in 'filename'. Do you want to recover the contents of this workbook? If you trust the source of this workbook, click Yes.
I checked on the server and confirmed the same error is present there as well.
Additional info:
The files are stored on the server file system (not DB BLOBs)
If I "recover" the file, the contents appear to be exactly the same as the originals
This same system works fine for Excel 97-2003 (.xls) files and for image files
I save the file on the server with File.WriteAllBytes(filePath, data), where data is filled by upload.InputStream.Read(data, 0, upload.ContentLength)
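One thing I'm not sure about: a single Stream.Read call isn't guaranteed to fill the buffer, so larger files could be silently truncated. A rough sketch of a more defensive save I could try, using the same upload and filePath as above:

// Minimal sketch; `upload` and `filePath` are the same objects as in my code above.
// Requires: using System.IO;
using (var output = File.Create(filePath))
{
    byte[] buffer = new byte[8192];
    int bytesRead;
    // Read may return fewer bytes than requested, so loop until end of stream
    while ((bytesRead = upload.InputStream.Read(buffer, 0, buffer.Length)) > 0)
        output.Write(buffer, 0, bytesRead);
}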
Related
I have an app that runs queries against my SQL database, and I need to download or open files from it. The file paths are saved in the database, so I need a way to download or open them in the app. The file types are PDF and JPG.
I want to upload a .lnk file (a shortcut to a large WMV file that exists in a network directory) to test attachments. The problem is that whenever I download this file from TFS it has the extension .download instead of .lnk, and I have to change the extension manually in order to watch the video.
The issue is related to your browser. I've tested in IE and Chrome, IE works while Chrome doesn't. You could try with IE.
I am looking at using this file upload process in my ASP.NET MVC website: http://www.codeproject.com/Articles/830704/Gigabit-File-uploads-Over-HTTP?fid=1870992
It's working great so far. I can see the file being uploaded in chunks (on the server). Essentially they are placed in my upload directory, in a subdirectory named after the file. In that directory I see the chunks, such as "LargeFile.00000001.csv.tmp", "LargeFile.00000002.csv.tmp", etc.
If the upload is canceled I can still see the chunks on the server. Question: how can I "resume" the upload later on?
I could look at the folder name and the chunk file names and work out where I left off, along the lines of the sketch below.
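This is a rough sketch of what I have in mind, assuming the chunk naming scheme shown above (the method name and the choice to re-send the last chunk are my own guesses):

// Rough sketch; assumes chunks named like "LargeFile.00000001.csv.tmp".
// Requires: using System; using System.IO;
static int GetResumeChunkIndex(string chunkDir, string fileName)
{
    // e.g. fileName "LargeFile.csv" -> search pattern "LargeFile.*.csv.tmp"
    string pattern = Path.GetFileNameWithoutExtension(fileName)
                   + ".*" + Path.GetExtension(fileName) + ".tmp";
    int count = Directory.GetFiles(chunkDir, pattern).Length;
    // The last chunk may have been cut off mid-write when the upload was
    // cancelled, so back up one chunk and overwrite it on resume.
    return Math.Max(0, count - 1);
}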
I have a program that uploads files to an FTP server; once an upload completes, I delete the local files. Occasionally the files are not deleted, and because the upload is automated I get an infinite loop where 100 copies of the same file reach the server. If I try to delete the file manually from Explorer I can do it, but for some reason the DeleteFile call from my app doesn't work.
I've tried raising the last OSError and I get nothing.
The files are stored on a mapped drive. Is there any workaround? If I upload 30 files to an FTP server, sometimes one or two of them cannot be deleted after being sent.
Is there any way to close the file even if it is opened by another program, something like Unlocker does?
How can I see if my application is locking the file? I haven't implemented any locking mechanism inside the app.
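One thing I could check myself, sketched below: whether an undisposed stream in my own process is holding the handle, and whether the lock is only transient. The retry count and delay are arbitrary, and this is C# purely for illustration, not necessarily my app's exact code:

// Illustrative sketch only; retry count and delay are arbitrary choices.
// Requires: using System.IO; using System.Threading;
static bool TryDelete(string path, int attempts = 5)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            File.Delete(path);  // throws IOException while another handle is open
            return true;
        }
        catch (IOException)
        {
            Thread.Sleep(500);  // give a transient lock time to clear
        }
    }
    return false;               // still locked after all attempts; log and move on
}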
I'm not very familiar with file handling in Ruby. A problem I've come across is that reading and writing a binary file doesn't produce exactly the same file.
clone = Tempfile.new(tempfile.original_filename)
FileUtils.copy_stream(tempfile, clone)
clone.flush
From the image below it is clear that this is not an exact copy; when I try to open the newly created file in an image viewer, it reports that the file is corrupt. I have tried copying the file in different ways, such as clone.write(tempfile.read), without success.
The file viewer also indicates the original is ANSI Dos/Windows while the clone is ANSI Macintosh, and the file sizes differ by about 200 bytes.
What I'm trying to accomplish is simply using a Tempfile twice. A file is uploaded via Rails and handed to me as a Tempfile. I want to submit it to two different RESTful services, but RestClient.post closes the file automatically. Another option would be to submit some sort of in-memory stream clone to RestClient so that it cannot close my file. If I submit File.open(tempfile.path) to RestClient it produces the same broken file, which indicates that the reading is the problem and not the writing. If I submit the original Tempfile object to RestClient it works perfectly, but then it is closed and deleted and I cannot send it again.
It would be much more helpful to see a hex view of these files instead of a text editor's interpretation. My guess is that at least one of the files is not opened in binary mode. In Ruby 1.9, try:
open(filename, 'rb')                      # open for reading in binary mode
open(filename, 'wb')                      # open for writing in binary mode
Tempfile.new(filename, :binmode => true)  # create the temporary file in binary mode
to open a file for reading or writing in binary mode, and to create a binary temporary file, respectively. (You can also call binmode on an existing Tempfile.) The ~200-byte size difference and the Dos/Windows vs. Macintosh line-ending report from your viewer are consistent with newline translation happening during a text-mode copy, which binary mode avoids.