My app downloads large 90 MB video files from a server. Many customers in rural areas complain that they cannot download them: the download always restarts from scratch when the connection breaks for too long.
Is there a library that can download large files under very low-bandwidth, frequently interrupted conditions, over the course of several days if necessary? One that resumes an unfinished download and keeps adding to it bit by bit until it is complete?
It has to be very, very robust.
NSURLSession supports this. When a download fails, you can obtain a resume-data object that can be used to resume the download later. Read the docs for more info.
You can also use a background session if you want to perform a long download in the background. See the docs and this question: NSURLSession background download - resume over network failure
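A minimal sketch of the resume mechanism, assuming `session` is your NSURLSession; `savedResumeData` is a placeholder for wherever you persist the resume data (in a real app, a file on disk):

```objc
#import <Foundation/Foundation.h>

// Placeholder: in a real app, persist this to disk instead of a global.
static NSData *savedResumeData;

// Pause by cancelling; the task hands back resume data if the server
// supports range requests.
void pauseDownload(NSURLSessionDownloadTask *downloadTask) {
    [downloadTask cancelByProducingResumeData:^(NSData *resumeData) {
        savedResumeData = resumeData; // may be nil if the download cannot resume
    }];
}

// Later, when connectivity returns, continue where you left off.
void resumeDownload(NSURLSession *session) {
    if (savedResumeData) {
        [[session downloadTaskWithResumeData:savedResumeData] resume];
    }
}
```

When a transfer fails on its own rather than being cancelled, the resume data is delivered in the error's userInfo under the NSURLSessionDownloadTaskResumeData key.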
I want to use NSURLSession to upload photos to a cloud server (like OneDrive), and I want the app to be able to do this work in the background.
But NSURLSession only supports "fromFile", not "fromData", in background mode, and I don't know how to get an NSURL for the photos, so I write each photo's data into a file...
I think this is not a good way.
From the comments above, it seems like you'll be uploading several photos at a time and for that a "fromData" approach does not seem like a good idea.
Firstly, high-resolution photos are several megabytes in size. If you're loading a single 5 MB photo into memory as NSData, you're probably fine. But loading 10 photos that way would mean you need 50 MB of memory to hold them all.
Secondly, uploading large chunks of data, such as photos, is also a resource-intensive task. You should most definitely not try to upload them all at the same time. This means you're going to have to build an upload queue that only starts uploading the next photo once the last one has finished. What happens then if internet connectivity is lost midway through the process? And what happens if the user closes the app all of a sudden?
Thus, I'd suggest you grab all the images' data and write them to a temporary folder somewhere. Then you start the upload process, which finds a file in the temporary folder, uploads it, deletes it once the upload finishes successfully, and starts over with the next file.
This way you're minimising your app's memory footprint, making it more resilient in the face of the many adversities an app may face (no internet connection, being prematurely closed by the user and so on) and also abiding by the APIs that are provided to you.
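A minimal sketch of that approach; the class name, session identifier and endpoint URL are placeholders, not a definitive implementation:

```objc
#import <Foundation/Foundation.h>

// Sketch: queue one photo for upload by writing its data to a temp file
// and handing the file to a background NSURLSession.
@interface PhotoUploader : NSObject <NSURLSessionTaskDelegate>
@end

@implementation PhotoUploader {
    NSURLSession *_session;
}

- (instancetype)init {
    if ((self = [super init])) {
        NSURLSessionConfiguration *config = [NSURLSessionConfiguration
            backgroundSessionConfigurationWithIdentifier:@"com.example.photo-upload"];
        _session = [NSURLSession sessionWithConfiguration:config
                                                 delegate:self
                                            delegateQueue:nil];
    }
    return self;
}

- (void)enqueuePhotoData:(NSData *)photoData {
    // 1. Write the in-memory data to a temporary file so the background
    //    session can stream it from disk.
    NSURL *fileURL = [[NSURL fileURLWithPath:NSTemporaryDirectory() isDirectory:YES]
        URLByAppendingPathComponent:[[NSUUID UUID] UUIDString]];
    [photoData writeToURL:fileURL atomically:YES];

    // 2. Background sessions only accept uploadTaskWithRequest:fromFile:.
    NSMutableURLRequest *request = [NSMutableURLRequest
        requestWithURL:[NSURL URLWithString:@"https://example.com/upload"]];
    request.HTTPMethod = @"PUT";
    [[_session uploadTaskWithRequest:request fromFile:fileURL] resume];
}

// 3. Delete the temp file when the upload finishes, then start the next one.
- (void)URLSession:(NSURLSession *)session task:(NSURLSessionTask *)task
        didCompleteWithError:(NSError *)error {
    // (Look up the file for this task, remove it on success, dequeue the next.)
}
@end
```

Because the session is a background session and the payload lives on disk, an upload in flight survives the app being suspended or terminated by the user.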
I am currently using NSURLConnection to download and upload data in my app.
Files are loaded directly to a cache.
When a user scrolls through a collection view I start downloading the relevant images (similar to the Pinterest home screen).
I already have a system in place that downloads 'last requested' first, but of course finishes the downloads that are currently running first.
Since I'm going to integrate downloads of larger files, like videos or images larger than thumbnails for detail views, I was wondering whether I could limit the download speed of a single connection to prioritise later-requested 'large' files.
I've so far looked into ASIHTTPRequest, but it only allows throttling the entire connection.
I would like to avoid implementing everything on my own using CFStreams. Any ideas?
AFNetworking does not offer this feature, since it builds on NSURLConnection, and most other frameworks I've looked at don't offer it either.
I have an iOS app in which I need to download multiple audio files before a player can start. (All the files need to be downloaded first because they all play simultaneously as a multi-track song.)
I know about the advantages of downloading asynchronously from the main thread (not blocking the UI, etc.) but I'm wondering if there's any advantage to downloading each of the files asynchronously from each other, vs. all on the same background thread. Which approach would download all the files fastest, if there's a difference?
It's really a matter of network bandwidth. Most likely, if you tried to download 10 files at the same time, it would take as long as (and possibly longer than) downloading the same 10 files one at a time.
A user's Internet connection only allows so much data per second. Assuming that is maxed out for each download, downloading more than one file at a time means that the max throughput has to be split between the files.
Your best option is to setup a concurrent operation queue. Queue each download as a separate operation. Then experiment by setting up the operation queue to support anywhere from 1 to n concurrent operations. Do the tests multiple times at different times and track how long it takes to complete all of the downloads. See which results in the best overall average. Keep in mind the results could be different for a user on a slow 2G cellular connection vs. someone on a super fast home Wi-Fi connection.
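A sketch of that experiment; `audioFileURLs` is a placeholder for your list of file URLs, and the function blocks while waiting, so run it off the main thread:

```objc
#import <Foundation/Foundation.h>

// Time how long N downloads take at a given concurrency level.
// Call with maxConcurrent = 1, 2, ... n and compare the results.
void runDownloadExperiment(NSArray<NSURL *> *audioFileURLs, NSInteger maxConcurrent) {
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    queue.maxConcurrentOperationCount = maxConcurrent;

    NSDate *start = [NSDate date];
    for (NSURL *url in audioFileURLs) {
        [queue addOperationWithBlock:^{
            // A synchronous fetch keeps the experiment simple.
            NSData *data = [NSData dataWithContentsOfURL:url];
            (void)data; // ... write `data` to disk here ...
        }];
    }
    [queue waitUntilAllOperationsAreFinished]; // blocks; don't call on the main thread
    NSLog(@"%ld concurrent: all downloads took %.1fs",
          (long)maxConcurrent, -[start timeIntervalSinceNow]);
}
```

Setting maxConcurrentOperationCount to 1 gives you the strictly sequential baseline to compare against.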
Here is the situation: I've got an app that ships with a set of images in very low resolution. Nothing special here.
When the app starts, I'd like it to kick off a sync process that downloads the full-resolution photo set (the photos are downloaded from a remote server).
As the download is typically large (>1 GB), I need a way to resume after a potential stop (whatever the reason), so that the device eventually holds 100% of the photo set without having to start over from the beginning.
How would you do that?
Your question does not give details about the data. There are two possible cases:
You are downloading all the images as a single file (like a .zip) or in a single request: in this case you can use NSURLSession, which has pause and resume built in.
You are downloading the images one by one: in this case you need to keep track of the images with some ID or index, so that you can restart the sequential download from a particular file. For pausing and resuming each individual file you can again use NSURLSession; see the sketch below.
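For the second case, a minimal sketch with hypothetical class and property names: persist the index of the next photo, advance it only after a successful download, and an interrupted sync picks up where it stopped:

```objc
#import <Foundation/Foundation.h>

// Sketch: download photos sequentially, remembering how far we got.
@interface PhotoSetDownloader : NSObject
@property (nonatomic, strong) NSURLSession *session;     // e.g. [NSURLSession sharedSession]
@property (nonatomic, copy) NSArray<NSURL *> *photoURLs; // full-resolution photo URLs
@end

@implementation PhotoSetDownloader

- (void)downloadNextPhoto {
    NSInteger next = [[NSUserDefaults standardUserDefaults] integerForKey:@"nextPhotoIndex"];
    if (next >= (NSInteger)self.photoURLs.count) return; // 100% done

    [[self.session downloadTaskWithURL:self.photoURLs[next]
                     completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
        if (error) return; // index not advanced; retry this file later
        // ... move `location` to a permanent path before it is cleaned up ...
        [[NSUserDefaults standardUserDefaults] setInteger:next + 1 forKey:@"nextPhotoIndex"];
        [self downloadNextPhoto];
    }] resume];
}

@end
```

Per-file resume data (as in the first answer above) can be layered on top of this for the file currently in flight.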
Scenario:
As a user I am able to take (an unlimited amount of) photos and videos, which are stored in the app's documents folder. Each of these media files gets a record in a SQLite database with additional information (for example a caption). All of this is possible completely offline.
Back online I get a dialog with a list of all the videos and photos I took and a button which starts an upload process.
Each file is uploaded one after the other, together with its metadata, by making a multipart POST request to the server. The server's response is stored together with the metadata in the SQLite database (so there is no fire-and-forget).
Reliable solutions?
If I am reading and understanding this chart correctly, the simplest solution would be to wrap each of these uploads in a Task. Side effect: after 10 minutes every task would be cancelled, which becomes a problem with a slow connection or very large files (for example a very long video).
The recommended way would be to use NSUrlSession/Background transfer service.
Which leads me to my question:
Is it possible to wrap multipart POSTs in NSURLSessionDataTasks and would this be reliable, even if the task is running longer than 10 minutes or the user is suspending the app?
As I am a Xamarin/C# guy I would really appreciate some sample snippets for a working multipart upload, even if it's in Objective-C ;-).
Almost and... yes.
Background Transfer service works with NSUrlSessionDownloadTasks and NSUrlSessionUploadTasks only. Not NSUrlSessionDataTasks, as described here.
Other than this "basic" limitation, it's safe to use background transfer service with upload tasks.
The 10-minute free pass in the background no longer applies as of iOS 7 (basically, it's still there, but in a different form); however, with NSURLSession and the background transfer service you do not need it.
I've a blog post here for background transfer service, based on download tasks.
An important thing to note: starting a task basically means that it will actually start at some point and actually finish at some other point. This depends on whether the device is on cellular or Wi-Fi, and on other factors that are (probably) known only to iOS (and Apple).
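Since a sample was requested: a sketch in Objective-C of a multipart POST shaped so it works with the background transfer service, i.e. the whole multipart body is written to a file first and then handed to an upload task. The URL, field names, paths and session identifier are placeholders:

```objc
#import <Foundation/Foundation.h>

void enqueueMultipartUpload(NSString *videoPath, id<NSURLSessionDelegate> delegate) {
    NSString *boundary = [[NSUUID UUID] UUIDString];

    // Assemble the multipart body. For very large files, append to the body
    // file with NSFileHandle instead of building it all in memory.
    NSMutableData *body = [NSMutableData data];
    void (^append)(NSString *) = ^(NSString *s) {
        [body appendData:[s dataUsingEncoding:NSUTF8StringEncoding]];
    };
    append([NSString stringWithFormat:@"--%@\r\n", boundary]);
    append(@"Content-Disposition: form-data; name=\"caption\"\r\n\r\n");
    append(@"My caption\r\n");
    append([NSString stringWithFormat:@"--%@\r\n", boundary]);
    append(@"Content-Disposition: form-data; name=\"file\"; filename=\"video.mp4\"\r\n");
    append(@"Content-Type: video/mp4\r\n\r\n");
    [body appendData:[NSData dataWithContentsOfFile:videoPath]];
    append([NSString stringWithFormat:@"\r\n--%@--\r\n", boundary]);

    // Write the body to disk so the background session can stream it from a file.
    NSString *bodyPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.body"];
    [body writeToFile:bodyPath atomically:YES];

    NSMutableURLRequest *request = [NSMutableURLRequest
        requestWithURL:[NSURL URLWithString:@"https://example.com/media"]];
    request.HTTPMethod = @"POST";
    [request setValue:[NSString stringWithFormat:@"multipart/form-data; boundary=%@", boundary]
        forHTTPHeaderField:@"Content-Type"];

    NSURLSessionConfiguration *config = [NSURLSessionConfiguration
        backgroundSessionConfigurationWithIdentifier:@"com.example.media-upload"];
    NSURLSession *session = [NSURLSession sessionWithConfiguration:config
                                                          delegate:delegate
                                                     delegateQueue:nil];
    [[session uploadTaskWithRequest:request fromFile:[NSURL fileURLWithPath:bodyPath]] resume];
}
```

The same shape translates directly to Xamarin's NSUrlSession bindings; the key point is that the body, including the multipart framing, lives in a file, which is what makes it an upload task rather than a data task.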