I'm creating a system where I need to upload a file in 80 MB chunks; the total file size is 4 GB. I'm using jQuery and PHP, but I don't really understand how I can create the chunks and send them to the PHP uploader. I also have other form data to send with the file. I have tried using "plupload" and cannot get it to work.
I am sending a file from my Rails server to a microcontroller. The microcontroller is running out of memory because (we believe) the file is being sent in chunks that are too large - up to 16 KB at a time.
How can I take the StringIO object I have from S3 and send it to the requestor in 4 KB chunks?
My current implementation:
file_name = "#{version}.zip"
firmware_file = s3(file_name).get # fetch the object from S3; its body is a StringIO
response.headers['Content-Length'] = firmware_file.body.size.to_s
send_data firmware_file.body.read, filename: file_name # reads the entire body into memory at once
Rails has the ActionController::Live module, which helps you stream responses in real time. In your case, since you want to create smaller chunks and send them to the client (microcontroller), this feature might be useful.
The "File System Monitoring" section of the article Is It Live? by Aaron Patterson explains how changes in the file system can be pushed to the client in real time with ActionController::Live.
Hope this helps!
In short, you need to use ActionController::Live in order to stream response data to your client(s).
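A minimal sketch of that approach for this case, reusing the s3 helper from the question and assuming firmware_file.body is a StringIO (params[:version] is a guess at where version comes from):
class FirmwaresController < ApplicationController
  include ActionController::Live

  def download
    file_name = "#{params[:version]}.zip"   # assumed parameter name
    firmware_file = s3(file_name).get       # same helper as in the question
    response.headers['Content-Type'] = 'application/zip'
    response.headers['Content-Length'] = firmware_file.body.size.to_s

    # Write the body to the socket 4 KB at a time instead of all at once.
    while (chunk = firmware_file.body.read(4096))
      response.stream.write(chunk)
    end
  ensure
    response.stream.close   # Live streams must be closed explicitly
  end
end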
Since you are transferring zipfiles you can use the elegantly simple zipline gem. What I particularly like about this gem is that it supports a large number of streamable object types - so just about anything you can think of to throw at it, it will figure out how to stream it without much effort on your part.
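For example, a minimal zipline sketch (the s3 helper and the file names are carried over from the question as assumptions):
class FirmwaresController < ApplicationController
  include Zipline   # provided by the zipline gem

  def download
    # zipline takes an enumerable of [streamable, name-in-zip] pairs and
    # streams the resulting zip without buffering it all in memory.
    files = [[s3("#{params[:version]}.bin").get.body, 'firmware.bin']]
    zipline(files, "#{params[:version]}.zip")
  end
end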
Working with Grails 3.* Framework.
I am trying to upload a video file; the size will be greater than 10 MB.
Below is the code, which works perfectly for storing image files uploaded from the client browser on the server:
File fileIns = new File("/opt/a.jpg")
fileIns << params.image // append the uploaded bytes to the file
Can I use the same code for saving videos?
If so, params.video may consume a huge amount of memory; how can I optimize it?
Or is there any other method to stream the upload part by part and save it as a video?
How do WhatsApp and Telegram transfer video files to their servers?
Any suggestions are appreciated.
If you are dealing with large data, I'd suggest going full Java.
Check this link about using streaming.
Another option is using HTTPClient. Read this.
I have a question about how to efficiently store and retrieve large amounts of data to and from a blob column (data_type :binary). Most examples and code out there show simple assignments but that cannot be efficient for large amounts of data. For instance storing data from a file may be something like this:
# assume a model MyFileStore has a column blob_content :binary
my_db_rec = MyFileStore.new
File.open("#{Rails.root}/test/fixtures/alargefile.txt", "rb") do |f|
  my_db_rec.blob_content = f.read # reads the entire file into memory before the assignment
end
my_db_rec.save
Clearly this would read the entire file content into memory before saving it to the database. This cannot be the only way to save blobs. For instance, in Java and in .NET there are ways to stream to and from a blob column so you are not pulling everything into memory. Is there something similar in Rails? Or are we limited to storing only small chunks of data in blobs when it comes to Rails applications?
If this is Rails 4 you can use render stream: true. Here's an example: Rails 4 Streaming.
I would ask, though, what database you're using, and whether it might be better to store the files in a filesystem (Amazon S3, Google Cloud Storage, etc.), as this can greatly affect your ability to manage blobs. Microsoft, for example, has this recommendation: To Blob or Not to Blob.
Uploading is generally done through forms, all at once or multi-part. Multi-part chunks the data so you can upload larger files with more confidence. The chunks are reassembled and stored in whatever database field (and type) you have defined in your model.
Downloads can be streamed. There is a large tendency to hand off uploads and streaming to third-party cloud storage systems like Amazon S3. This drastically reduces the burden on Rails. You can also hand off upload duties to your web server. All modern web servers have a way to stream files from a user. Doing this avoids memory issues, as only the currently uploading chunk is in memory at any given time. The web server should also be able to notify your app once the upload is completed.
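For example, in Rails the download hand-off is commonly done with send_file plus the web server's X-Sendfile / X-Accel-Redirect support, so the app process never streams the bytes itself (the path below is illustrative):
# config/environments/production.rb
config.action_dispatch.x_sendfile_header = 'X-Sendfile'         # Apache mod_xsendfile
# config.action_dispatch.x_sendfile_header = 'X-Accel-Redirect' # nginx

# In a controller: Rails only emits a header; the web server reads the
# file off disk and streams it to the client.
send_file '/var/uploads/firmware.zip', type: 'application/zip'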
For general streaming of output:
To stream a template you need to pass the :stream option from within your controller, like this: render stream: true. Since the order in which templates and layouts render changes with streaming, pay attention to attributes like the page title; these need to be set with content_for, not plain yield. You can also explicitly open and close streams using the Live API; there you must close the stream yourself with response.stream.close, and you need the puma gem. In general, be aware that you need a web server that supports streaming. You can configure Unicorn to support streaming.
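A minimal sketch of template streaming (the model and action names are made up):
class PostsController < ApplicationController
  def index
    @posts = Post.all
    # The layout is sent to the client immediately; the template body
    # follows as it renders.
    render stream: true
  end
end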
I have seen various posts where developers have opted for the chunking option to upload files, particularly large files.
It seems that if one uses the chunking option, the files are uploaded and progressively saved to disk. Is this correct? If so, it seems there needs to be a secondary operation to process the files.
If the config is set to allow large files, should plupload work without chunking up to the allowed file size for multiple files?
It seems that if one uses the chunking option, the files are uploaded
and progressively saved to disk, is this correct?
If you mean "automatically saved to disk", as far as I know, that is not correct. Your MVC controller will have to handle as many requests as there are chunks, concatenate each chunk into a temp file, then rename the file after handling the last chunk.
It is handled this way in the upload.php example of plupload.
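The same concatenate-then-rename idea, sketched as a Rails-style action rather than PHP (chunk and chunks are plupload's default parameter names; the paths and field names are assumptions):
# One request arrives per chunk; append it to a temp file and rename
# the temp file once the last chunk has been written.
def upload
  chunk  = params[:chunk].to_i            # 0-based index of this chunk
  chunks = params[:chunks].to_i           # total number of chunks
  name   = File.basename(params[:name])   # sanitize the client-supplied name
  tmp    = Rails.root.join('tmp', "#{name}.part")

  File.open(tmp, chunk.zero? ? 'wb' : 'ab') do |f|
    f.write(params[:file].read)           # the uploaded blob for this chunk
  end

  FileUtils.mv(tmp, Rails.root.join('uploads', name)) if chunk == chunks - 1
  head :ok
end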
if so it seems there needs to be a secondary operation to process the
files.
I'm not sure I understand this (perhaps you weren't meaning "automatically saved to disk")
If the config is set to allow large files, should plupload work
without chunking up to the allowed file size for multiple files?
The answer is yes... and no. It should work, then fail with some combinations of browsers / plupload runtimes when the size gets to around 100 MB. People also seem to encounter problems setting up the config.
I handle small files (~15 MB) and do not have to use chunking.
I would say that if you are to handle large files, chunking is the way to go.
I have a large set of images, around 60 MB in total. I want to use these images offline in an HTML5 web application. Currently I am storing the image data in a SQLite table, but it seems iPad Safari does not support more than 50 MB of data.
Is there any way to store the data in folders on the client side (iPad) and then use them through JavaScript code, or is there any alternative way to do that?
You can use the File System API; however, the limitations may vary across browsers and OSs.
https://developer.mozilla.org/en-US/docs/WebGuide/API/File_System/Introduction