How to resume downloads of large files in iOS

I am using a downloadTask of URLSession to download a large file. The problem I am facing is how to provide pause and resume functionality. I have read that cancelling the downloadTask with resumeData hands back a resumeData blob which can be used next time to resume the download. But for a really large file, I suspect this resumeData can itself be very large, depending on the file size and the stage at which the download is paused. How do I persist this resumeData so that I can use it next time to resume the download?
Also, there can be multiple downloads at the same time, which compounds the problem.

The resume data blob does not contain the actual received data. If it did, you wouldn't be able to resume downloading a multi-gigabyte file on a 32-bit architecture.
What it contains is a bunch of metadata for the transfer:
A URL on disk where the temporary file is stored.
A time stamp indicating when the fetch began so it can ask the server if the file has changed since then.
The request URL.
The set of request headers.
Probably some UUIDs.
And it might just be the URL of a metadata file on disk that contains the things listed above. I'm not sure which.
Either way, you will not get a multi-gigabyte resumeData blob. That would be devastating. The whole point of a download task is that it downloads to disk, not to memory.
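To make that concrete, here is a minimal sketch of the pause/persist/resume cycle. The DownloadManager type, the per-URL task dictionary, and the resume-file naming are my own illustrative choices, not framework API; the delegate callbacks for progress and completion are omitted for brevity:

    import Foundation

    // A minimal sketch of pause/persist/resume with URLSessionDownloadTask.
    final class DownloadManager {
        private let session = URLSession(configuration: .default)
        private var tasks: [URL: URLSessionDownloadTask] = [:]

        func start(_ url: URL) {
            let task = session.downloadTask(with: url)
            tasks[url] = task
            task.resume()
        }

        // Pause: cancel the task and persist the (small) resume blob to disk.
        func pause(_ url: URL) {
            tasks[url]?.cancel(byProducingResumeData: { data in
                // data is nil when the server doesn't support resuming.
                guard let data = data else { return }
                try? data.write(to: self.resumeFile(for: url))
            })
        }

        // Resume: read the blob back and hand it to a fresh task.
        func resume(_ url: URL) {
            guard let data = try? Data(contentsOf: resumeFile(for: url)) else { return }
            let task = session.downloadTask(withResumeData: data)
            tasks[url] = task
            task.resume()
        }

        private func resumeFile(for url: URL) -> URL {
            // One small file per download, so several concurrent downloads
            // are no problem: each gets its own blob on disk.
            return FileManager.default.temporaryDirectory
                .appendingPathComponent("resume-\(url.lastPathComponent).dat")
        }
    }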

Related

Suspend and resume Alamofire download request - large file

I'm using Alamofire to download large files in my iOS project.
When I cancel a download request (that is currently writing a large file to disk), the request produces resumeData with the data downloaded so far, but what I would like is a responseURL with the URL of the file holding the partially downloaded data. I plan to use the responseURL serializer at the end so that the entire data is never loaded into memory, but if I want to suspend and resume downloads, this seems to force me to load the data into memory.
There's a fileURL in the download request, but the documentation states that it is only populated after the download has finished.
Any pointers/suggestions would be appreciated.
As explained on GitHub, resumeData only includes the metadata necessary to resume a download, not the actual downloaded data itself, so it's perfectly safe to keep it around in memory. You can parse the value to get the URL of the partially downloaded data, but its encoding isn't documented or guaranteed, so it's not really appropriate for Alamofire to parse it directly.
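For what it's worth, here is a rough sketch of that flow assuming Alamofire 5; the ResumableDownload wrapper is illustrative, and the relevant calls are cancel(byProducingResumeData:) and download(resumingWith:):

    import Alamofire

    // A sketch assuming Alamofire 5. The blob handed back on cancellation is
    // metadata only, so holding it in memory is cheap; the partial file
    // itself stays on disk the whole time.
    final class ResumableDownload {
        private var request: DownloadRequest?
        private var resumeData: Data?
        private let destination = DownloadRequest.suggestedDownloadDestination()

        func start(_ url: URL) {
            request = AF.download(url, to: destination).response { response in
                // fileURL is only populated once the download has finished.
                print("saved to:", response.fileURL?.path ?? "cancelled")
            }
        }

        func pause() {
            // Ask for the resume blob instead of discarding the transfer.
            request?.cancel(byProducingResumeData: { [weak self] data in
                self?.resumeData = data
            })
        }

        func resume() {
            guard let data = resumeData else { return }
            request = AF.download(resumingWith: data, to: destination).response { response in
                print("saved to:", response.fileURL?.path ?? "cancelled")
            }
        }
    }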

Upload photos with NSURLSession in background

I want to use NSURLSession to upload photos to a cloud server (like OneDrive), and I'd like the app to be able to do the work in the background.
But NSURLSession only supports "fromFile", not "fromData", in background mode, and I don't know how to get the NSURL of the photos, so I write each photo's data into a file...
I don't think this is a good way to do it.
From the comments above, it seems like you'll be uploading several photos at a time and for that a "fromData" approach does not seem like a good idea.
Firstly, high-resolution photos are several megabytes in size. If you're loading a single 5 MB photo into memory as NSData, you're probably fine, but loading 10 photos that way means you'd need 50 MB of memory to hold them all.
Secondly, uploading large chunks of data, such as photos, is also a computationally intensive task. You should most definitely not try to upload them all at the same time. This means you're going to have to build an upload queue that only starts uploading the next photo once the last one has finished. What happens, then, if internet connectivity is lost midway through the process? And what happens if the user closes the app all of a sudden?
Thus, I'd suggest you grab all the images' data and write them to a temporary folder somewhere. Then you start the upload process, which would find a file on the temporary folder, upload it, delete it once the upload finishes successfully and start over with the next file.
This way you're minimising your app's memory footprint, making it more resilient in the face of the many adversities an app may face (no internet connection, being prematurely closed by the user and so on) and also abiding by the APIs that are provided to you.
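Here is a rough sketch of that flow with a background URLSession; the session identifier, the request setup, and the queueing logic are placeholders you'd replace with your own:

    import Foundation

    // Sketch: stage each photo on disk, then hand the file to a background
    // session. "com.example.photo-upload" and the queueing are illustrative.
    final class PhotoUploader: NSObject, URLSessionTaskDelegate {
        private lazy var session: URLSession = {
            // A background session keeps running while the app is suspended,
            // but it only accepts file-based uploads (fromFile, not fromData).
            let config = URLSessionConfiguration.background(withIdentifier: "com.example.photo-upload")
            return URLSession(configuration: config, delegate: self, delegateQueue: nil)
        }()

        func enqueue(photoData: Data, to endpoint: URL) throws {
            // 1. Stage the bytes in a temporary file so the system can stream
            //    them from disk instead of holding everything in memory.
            let fileURL = FileManager.default.temporaryDirectory
                .appendingPathComponent(UUID().uuidString + ".jpg")
            try photoData.write(to: fileURL)

            // 2. Hand the file to the background session.
            var request = URLRequest(url: endpoint)
            request.httpMethod = "POST"
            session.uploadTask(with: request, fromFile: fileURL).resume()
        }

        // 3. When one upload finishes, delete its staged file and start the
        //    next one from your queue.
        func urlSession(_ session: URLSession, task: URLSessionTask,
                        didCompleteWithError error: Error?) {
            // Clean-up and dequeue logic goes here.
        }
    }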

How to download large files on slow, bad connectivity networks?

My app downloads large 90 MB video files from a server. Many customers in rural areas complain that they cannot download them: downloads always restart from scratch when the connection breaks for too long.
Is there a library that can download large files in very low and interrupted bandwidth conditions over the course of several days if necessary? Such that it resumes an unfinished download and keeps adding bit by bit until it is complete?
It has to be very, very robust.
NSURLSession supports this. When a download fails, you can obtain an object that can be used to resume the download later; read the docs for more info.
You can also make use of a background session if you want to perform a long download in the background. See the docs and this question: NSURLSession background download - resume over network failure
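Concretely, the recovery path looks roughly like this (a sketch; the session identifier is a placeholder, and in a real app you'd wait for connectivity to return, e.g. via NWPathMonitor, before retrying):

    import Foundation

    // Sketch of resuming after a network failure. When a transfer dies
    // mid-flight, the error's userInfo can carry a resume blob.
    final class Downloader: NSObject, URLSessionDownloadDelegate {
        private var pendingResumeData: Data?
        private lazy var session: URLSession = {
            let config = URLSessionConfiguration.background(withIdentifier: "com.example.video")
            return URLSession(configuration: config, delegate: self, delegateQueue: nil)
        }()

        func urlSession(_ session: URLSession, task: URLSessionTask,
                        didCompleteWithError error: Error?) {
            guard let error = error as NSError? else { return } // nil means success
            if let data = error.userInfo[NSURLSessionDownloadTaskResumeData] as? Data {
                // Keep the blob and retry once the connection is back with:
                // session.downloadTask(withResumeData: data).resume()
                pendingResumeData = data
            }
        }

        func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                        didFinishDownloadingTo location: URL) {
            // Move `location` somewhere permanent before this method returns.
        }
    }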

Download a large file (1+ GB) from iOS with resume capability

Here is the point: I've got an app that runs with a set of images in very low resolution. Nothing special here.
When the app starts, I'd like it to kick off a sync process that downloads the full-resolution photo set (the photos are downloaded off a remote server).
As the download is typically large (>1 GB), I need a way to resume after a potential stop (whatever the reason), so that the device eventually holds 100% of the photo set without having to start over from scratch.
How would you do that?
Your question doesn't give details about the data, so there are two possible cases:
You are downloading all the images as a single file (like a .zip) or in a single request: in this case you can use NSURLSession, which has pause and resume support built in.
You are downloading the images one by one: in this case you need to keep track of the images with some id or index, so that the download can restart from a particular file and proceed sequentially. For pausing and resuming each individual file you can again use NSURLSession; a sketch of this case follows below.
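For the second case, here is a sketch of the bookkeeping; the UserDefaults key, the file count, and the URL layout are made-up assumptions for illustration:

    import Foundation

    // Sketch of case 2: fetch the photos one by one, persisting the index of
    // the next file so a relaunch continues where it left off.
    final class PhotoSetDownloader {
        private let baseURL = URL(string: "https://example.com/photos/")!  // assumed layout
        private let total = 500  // size of the photo set, assumed known
        private let session = URLSession(configuration: .default)

        private var nextIndex: Int {
            get { UserDefaults.standard.integer(forKey: "photoSet.nextIndex") }
            set { UserDefaults.standard.set(newValue, forKey: "photoSet.nextIndex") }
        }

        func downloadNext() {
            guard nextIndex < total else { return }  // whole set is on device
            let url = baseURL.appendingPathComponent("photo-\(nextIndex).jpg")
            session.downloadTask(with: url) { [weak self] location, _, error in
                guard let self = self, let location = location, error == nil else { return }
                // Move the temp file somewhere permanent, then advance the cursor.
                let docs = FileManager.default.urls(for: .documentDirectory,
                                                    in: .userDomainMask)[0]
                let dest = docs.appendingPathComponent("photo-\(self.nextIndex).jpg")
                try? FileManager.default.moveItem(at: location, to: dest)
                self.nextIndex += 1
                self.downloadNext()  // sequential: one file at a time
            }.resume()
        }
    }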

Large File Uploads

Do large file uploads block an application's request/response cycle? I have an app that allows users to upload multiple large files (images in particular). These files are stored on a remote host. I cannot use async background jobs to upload these images, as they have to be immediately accessible to the user once the upload finishes. How best should I handle such large uploads? Does it affect concurrency? This is my first time dealing with uploads at this scale. What should I be wary of, other than the huge bills of course? Any input from developers who have built apps that handle large file uploads would be greatly appreciated.
Why can't you use an async upload and just handle the event that signifies it's done? That's generally how async operations work: you kick them off, store a pointer somewhere, and then either handle the "complete" event or periodically iterate through the pointers for the uploads you've started and check each one to see if it's complete.
It's an old question, but still: I was worried about the same problem with large file uploads, thinking that processes get blocked while the file is being uploaded. It turned out, if I got it right, that nginx (and probably other servers as well) buffers the content of the file while it's being sent, so no Rails processes get blocked; a process is only tied up once the upload has finished and Rails is processing it, for example resizing images.
