I have an iOS app in which I need to download multiple audio files before a player can start. (All the files need to be downloaded first because they all play simultaneously as a multi-track song.)
I know about the advantages of downloading asynchronously from the main thread (not blocking the UI, etc.) but I'm wondering if there's any advantage to downloading each of the files asynchronously from each other, vs. all on the same background thread. Which approach would download all the files fastest, if there's a difference?
It's really a matter of network bandwidth. Most likely, if you try to download 10 files at the same time, it will take as long as (and possibly longer than) downloading the same 10 files one at a time.
A user's Internet connection only allows so much data per second. Assuming that is maxed out for each download, downloading more than one file at a time means that the max throughput has to be split between the files.
Your best option is to set up a concurrent operation queue. Queue each download as a separate operation. Then experiment by setting up the operation queue to support anywhere from 1 to n concurrent operations. Do the tests multiple times at different times of day and track how long it takes to complete all of the downloads. See which setting results in the best overall average. Keep in mind the results could be different for a user on a slow 2G cellular connection vs. someone on a super fast home Wi-Fi connection.
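By way of illustration, here's a minimal Swift sketch of that experiment. The URLs are placeholders, and the semaphore keeps each operation simple; a production version would more likely use an asynchronous Operation subclass.

```swift
import Foundation

// Hypothetical file list; replace with your own URLs.
let fileURLs: [URL] = [
    URL(string: "https://example.com/track1.m4a")!,
    URL(string: "https://example.com/track2.m4a")!
]

let queue = OperationQueue()
queue.maxConcurrentOperationCount = 4   // experiment with values from 1 to n

let start = Date()
for url in fileURLs {
    queue.addOperation {
        // Block this operation's thread until its download finishes
        // (a production version would use an asynchronous Operation subclass).
        let semaphore = DispatchSemaphore(value: 0)
        URLSession.shared.downloadTask(with: url) { location, _, _ in
            if let location = location {
                // Move the temporary file somewhere permanent before it is deleted.
                let destination = FileManager.default.temporaryDirectory
                    .appendingPathComponent(url.lastPathComponent)
                try? FileManager.default.moveItem(at: location, to: destination)
            }
            semaphore.signal()
        }.resume()
        semaphore.wait()
    }
}

// Don't call this from the main thread; it blocks until every download is done.
queue.waitUntilAllOperationsAreFinished()
print("All downloads finished in \(Date().timeIntervalSince(start)) seconds")
```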
Related
Currently, what I want to achieve is to download files from an array, downloading only one file at a time, and have the downloads continue even when the app goes into the background.
I'm using Rob's code as stated here, but he's using URLSessionConfiguration.default, whereas I want to use URLSessionConfiguration.background(withIdentifier: "uniqueID") instead.
It did work on the first try, but after the app went into the background everything became chaotic: the operations started downloading more than one file at a time, and no longer in order.
Is there any solution to this, or what should I use instead to achieve what I want? In Android we have a Service to handle this easily.
The whole idea of wrapping requests in operations is only applicable if the app is active/running. It's great for things like constraining the degree of concurrency for foreground requests, managing dependencies, etc.
For background session that continues to proceed after the app has been suspended, though, none of that is relevant. You create your request, hand it to the background session to manage, and monitor the delegate methods called for your background session. No operations needed/desired. Remember, these requests will be handled by the background session daemon even if your app is suspended (or if it terminated in the course of its normal lifecycle, though not if you force quit it). So the whole idea of operations, operation queues, etc., just doesn’t make sense if the background URLSession daemon is handling the requests and your app isn’t active.
See https://stackoverflow.com/a/44140059/1271826 for example of background session.
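As a rough sketch of that shape (the session identifier, class name, and file handling below are placeholders, not taken from the linked answer):

```swift
import Foundation

// A minimal sketch of a background session; identifiers and names are illustrative.
final class BackgroundDownloader: NSObject, URLSessionDownloadDelegate {
    static let shared = BackgroundDownloader()

    lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "com.example.downloads")
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    func enqueue(_ urls: [URL]) {
        // Hand everything to the background daemon; no operations involved.
        for url in urls {
            session.downloadTask(with: url).resume()
        }
    }

    // Called (possibly after the app was relaunched) when a download finishes.
    func urlSession(_ session: URLSession,
                    downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Move the file out of its temporary location before returning.
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(location.lastPathComponent)
        try? FileManager.default.moveItem(at: location, to: destination)
    }

    // Call the completion handler you saved in
    // application(_:handleEventsForBackgroundURLSession:completionHandler:).
    func urlSessionDidFinishEvents(forBackgroundURLSession session: URLSession) {
        // ...
    }
}
```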
By the way, true background sessions are really useful when downloading very large resources that might take a very long time. But they introduce all sorts of complexities (e.g., you often want to debug and diagnose when not connected to the Xcode debugger, which changes your app lifecycle, so you have to resort to mechanisms like unified logging; you need to figure out how to restore the UI if the app was terminated between the time the requests were initiated and when they finished; etc.).
Because of this complexity, you might want to consider whether this is absolutely needed. Sometimes, if you only need less than 30 seconds to complete some requests, it’s easier to just ask the OS to keep your app running in the background for a little bit after the user leaves the app and just use standard URLSession. For more information, see Extending Your App's Background Execution Time. It’s a much easier solution, bypassing many background URLSession hassles. But it only works if you only need 30 seconds or less. For larger requests that might exceed this small window, a true background URLSession is needed.
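A rough sketch of that simpler approach, with an illustrative task name and URL:

```swift
import UIKit

// Ask the OS for a little extra background execution time to finish short requests.
func performShortRequestsBeforeSuspension() {
    var backgroundTaskID: UIBackgroundTaskIdentifier = .invalid
    backgroundTaskID = UIApplication.shared.beginBackgroundTask(withName: "Finish requests") {
        // Expiration handler: clean up if we run out of time.
        UIApplication.shared.endBackgroundTask(backgroundTaskID)
        backgroundTaskID = .invalid
    }

    let task = URLSession.shared.dataTask(with: URL(string: "https://example.com/api")!) { _, _, _ in
        // End the background task as soon as the work is done.
        UIApplication.shared.endBackgroundTask(backgroundTaskID)
        backgroundTaskID = .invalid
    }
    task.resume()
}
```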
Below, you asked:
There are some downsides with [downloading multiple files in parallel], as I understand it.
No, it's always better to allow downloads to progress asynchronously and in parallel. It's much faster and more efficient. The only time you want to do requests consecutively, one after another, is when you need to parse the response of one request in order to prepare the next. But that is not the case here.
The exception here is with the default, foreground URLSession. In that case you have to worry about later requests timing out while waiting for earlier requests. In that scenario you might bump up the timeout interval. Or we might wrap our requests in an Operation subclass, allowing us not only to constrain how many concurrent requests we allow, but also not to start subsequent requests until earlier ones finish. But even in that case, we don't usually do it serially, but rather use a maxConcurrentOperationCount of 4 or something like that.
But for background sessions, requests don’t time out just because the background daemon hasn’t gotten around to them yet. Just add your requests to the background URLSession and let the OS handle this for you. You definitely don’t want to download images one at a time, with the background daemon relaunching your app in the background when one download is done so you can initiate the next one. That would be very inefficient (both in terms of the user’s battery as well as speed).
You need to loop over an array of files and add them to the session to download, but they will download asynchronously, so it's hard to keep track, especially since there are a lot of files.
Sure, you can't do a naive "add to the end of the array" if the requests are running in parallel, because you're not guaranteed the order in which they will complete. But it's not hard to capture these responses as they come in. Just use a dictionary, for example, perhaps keyed by the URL of the original request. Then you can easily look up in that dictionary to find the response associated with a particular request URL.
It's incredibly simple. And now we can perform requests in parallel, which is much faster and more efficient.
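For example, a minimal sketch of collecting results in a dictionary keyed by request URL (the URL list and queue label are placeholders):

```swift
import Foundation

// Collect out-of-order responses keyed by the original request URL.
let fileURLs = [URL(string: "https://example.com/file1.jpg")!,
                URL(string: "https://example.com/file2.jpg")!]

var downloadedFiles = [URL: URL]()                      // request URL -> local file URL
let syncQueue = DispatchQueue(label: "com.example.results")
let group = DispatchGroup()

for url in fileURLs {
    group.enter()
    URLSession.shared.downloadTask(with: url) { location, _, _ in
        defer { group.leave() }
        guard let location = location else { return }
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(url.lastPathComponent)
        try? FileManager.default.moveItem(at: location, to: destination)
        // Serialize access to the shared dictionary.
        syncQueue.sync { downloadedFiles[url] = destination }
    }.resume()
}

group.notify(queue: .main) {
    // All requests are done; look up any result by its original request URL.
    print("Downloaded \(downloadedFiles.count) files")
}
```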
You go on to say:
[Downloading in parallel] could lead to high battery consumption with a lot of requests at the same time. That's why I tried to make it download each file one at a time.
No, you never need to perform downloads one at a time for the sake of power. If anything, downloading one at a time is slower, and will take more power.
Unrelated, if you’re downloading 800+ files, you might want to allow the user to not perform these requests when the user is in “low data mode”. In iOS 13, for example, you might set allowsExpensiveNetworkAccess and allowsConstrainedNetworkAccess.
Regardless (and especially if you are supporting older iOS versions), you might also want to consider the appropriate settings isDiscretionary and allowsCellularAccess.
Bottom line, you want to make sure that you are respectful of a user’s limited cellular data plan or if they’re on some expensive service (e.g. connecting on an airplane’s expensive data plan or tethered via some local hotspot).
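A sketch of what those settings might look like (the values chosen here are illustrative, not recommendations):

```swift
import Foundation

// Be a good network citizen; the right values depend on your app.
let config = URLSessionConfiguration.background(withIdentifier: "com.example.downloads")

// iOS 13+: respect Low Data Mode and avoid "expensive" interfaces (cellular, hotspots).
if #available(iOS 13.0, *) {
    config.allowsExpensiveNetworkAccess = false
    config.allowsConstrainedNetworkAccess = false
}

// Older options: let the system schedule the transfers at a good time,
// and optionally keep them off cellular entirely.
config.isDiscretionary = true
config.allowsCellularAccess = false
```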
For more information on these considerations, see WWDC 2019 Advances in Networking, Part 1.
My app is downloading large 90 MB video files from a server. Many customers in rural areas complain about inability to download them. Downloads always restart from scratch when the connection breaks for too long.
Is there a library that can download large files in very low and interrupted bandwidth conditions over the course of several days if necessary? Such that it resumes an unfinished download and keeps adding bit by bit until it is complete?
It has to be very, very robust.
NSURLSession supports this. When a download fails, you can obtain an object that can be used to resume the download later. Read the documentation for more info.
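As a simplified sketch of that resume-data flow (error handling trimmed, names illustrative):

```swift
import Foundation

// A minimal resumable downloader; not production-ready.
final class ResumableDownloader: NSObject, URLSessionDownloadDelegate {
    private lazy var session = URLSession(configuration: .default, delegate: self, delegateQueue: nil)
    private var resumeData: Data?
    private var task: URLSessionDownloadTask?

    func start(_ url: URL) {
        task = session.downloadTask(with: url)
        task?.resume()
    }

    func pause() {
        // Ask the task for resume data instead of throwing the partial download away.
        task?.cancel(byProducingResumeData: { [weak self] data in
            self?.resumeData = data
        })
    }

    func resume() {
        guard let data = resumeData else { return }
        task = session.downloadTask(withResumeData: data)
        task?.resume()
    }

    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        // If the transfer failed (e.g. the connection dropped), the error may carry
        // resume data that lets a later attempt pick up where it left off.
        if let error = error as NSError?,
           let data = error.userInfo[NSURLSessionDownloadTaskResumeData] as? Data {
            resumeData = data
        }
    }

    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Move the finished file somewhere permanent here.
    }
}
```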
You can also make use of a background session if you want to perform a long download in the background. See the doc and this question: NSURLSession background download - resume over network failure
I am using Alamofire as my networking library for my Swift app. Is there a way to keep a "priority queue" of network requests with Alamofire? I believe I saw this feature in a library in the past but I can no longer find it or find other posts about this.
Let's say I open a page in my application and it starts to make a few requests. First it gets some JSON, which is fast and no problem.
From that JSON, it pulls out some information and then starts downloading images. These images have the potential to be quite large and take many seconds (~30 seconds or more sometimes). But the tricky part is that the user has the option to move on to the next page before the image(s) finish downloading.
If the user moves on to the next page before the image downloading is done, is it possible to move it on to a lower priority queue? So that when the images on the next page start loading they will go faster? I would even be open to pausing the old one entirely until the new requests are finished if that is even possible.
Keep in mind I am open to many suggestions. I have a lot of freedom with my implementation. So if this is a different library, or different mechanism in iOS that is fine. Even if I continue to use Alamofire for JSON and do all my image downloading and management with something else that would be alright too.
Also, probably irrelevant, but I will add it here: I'm using https://github.com/rs/SDWebImage for caching my images once they're fully downloaded, which is why I don't want to cancel the request completely. I need it to finish, and then it won't happen again.
TL;DR I want a fast queue and a slow queue with the ability to move things from the fast queue to the slow queue before they are finished.
Have you considered managing an NSOperationQueue? This tutorial might be helpful. In his example, he pauses the downloads as they scroll off the page, but I believe you could adjust the queuePriority property of the NSOperation objects instead.
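A rough sketch of that idea, where the queue, page-naming scheme, and download body are illustrative only; note that queuePriority only affects operations that haven't started executing yet:

```swift
import Foundation

let imageQueue = OperationQueue()
imageQueue.maxConcurrentOperationCount = 4

func startDownloads(for urls: [URL], page: Int) {
    for url in urls {
        let op = BlockOperation {
            // Placeholder for the actual image download (e.g. a synchronous fetch
            // or a semaphore-wrapped URLSession task).
            _ = try? Data(contentsOf: url)
        }
        op.name = "page-\(page)"          // tag the operation with its page
        op.queuePriority = .normal
        imageQueue.addOperation(op)
    }
}

// When the user moves on, demote anything still queued for the old page
// so that new-page requests are scheduled first.
func demoteDownloads(forPage page: Int) {
    for op in imageQueue.operations where op.name == "page-\(page)" {
        op.queuePriority = .veryLow
    }
}
```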
I have an iOS app which synchronizes a certain number of assets at startup. I'm using AFNetworking and set up an NSOperationQueue to handle all of the downloads. I was wondering how many simultaneous downloads make sense. Is there a limit where network performance will drop if I have too many at the same time? At the moment I'm doing a maximum of 5 downloads at a time.
This depends on several factors:
What is the network speed and latency?
What is the data size of the requests and responses?
How long does processing a request take on the server?
How long does processing a response take on the client?
How many parallel requests can the server fulfill efficiently?
How many users will make requests at the same time?
What is the minimal speed and memory size of the target device?
For small and medium sized applications, the limiting factor is usually the device's network latency, but that might not be the case in your situation. In the end, you'll have to test and figure out the most efficient compromise. 5 is a good number to start with.
You might want to set the number of concurrent downloads based on the available network connection (WLAN or 3G or even slower...).
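This answer predates it, but one modern way to do that is with the Network framework's NWPathMonitor; the concurrency values below are arbitrary examples, not recommendations:

```swift
import Foundation
import Network

// Adjust the download queue's concurrency based on the current interface type.
let monitor = NWPathMonitor()
let downloadQueue = OperationQueue()

monitor.pathUpdateHandler = { path in
    if path.usesInterfaceType(.wifi) {
        downloadQueue.maxConcurrentOperationCount = 4
    } else if path.usesInterfaceType(.cellular) {
        downloadQueue.maxConcurrentOperationCount = 2
    } else {
        downloadQueue.maxConcurrentOperationCount = 1
    }
}
monitor.start(queue: DispatchQueue(label: "com.example.network-monitor"))
```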
The beauty of using NSOperationQueues is that they are closely tied into the underlying OS (iOS or OSX). The queue decides how many operations to run based on many factors, including free memory, load on the system, etc.
You should not try to second guess the system and throttle yourself. Queue as many operations as you have and let the OS deal with it. I have an iPhone app that adds hundreds of operations in the queue when it has to fetch images of varying sizes etc. Works great, UI is not blocked, etc.
EDIT: well, it seems that when doing NSURLConnections and similar network connections, NSOperationQueue is NOT really keyed in to network usage. I asked on the Apple internal forums this summer, and in the end was told by Quinn "The Eskimo" (Apple network guru) to use a limit of something like 4. So this post is correct in the sense of pure processing power - NSOperationQueue will do the right thing - but when it comes to network ops you need to set a limit.
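In code, that limit is just a one-line cap on the queue (the value is the one suggested above):

```swift
import Foundation

// Cap how many network operations run at once, per the advice above.
let networkQueue = OperationQueue()
networkQueue.maxConcurrentOperationCount = 4
// Enqueue each network request as its own operation; only 4 run concurrently.
```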
Depends on your hardware mostly I would say. Best way to address this is to test it with multiple cases with multiple trials. Try to diversify the hardware you test on as much as possible (remember do not use the simulator to test this!).
There actually is a constant the SDK provides that varies depending on various constraints. I would recommend you look into using it.
Regarding this question, I've done some tests on an iPad 2 running iOS 6.0. I created a little app that performs an HTTP GET request to a web server. This web server will provide data for 60 seconds (this is to get a meaningful result; I will change this to 10 minutes later in my tests).
For one HTTP GET request it works very well. Then I tried to perform several HTTP requests at the same time to see how many files I can download, and how fast, over the iPad's Wi-Fi connection.
I made two versions: one using NSOperations and one using NSThread and synchronous HTTP GET requests. In short, I always get a timeout for my 6th request. (The TCP SYN never reaches my HTTP server.)
Extra info:
NSThread implementation:
Simply make a for loop and create a thread for each request. Each thread performs a synchronous HTTP request.
There I observe that my 6th request times out after 20 seconds. If I set the timeout to 80 seconds, I clearly see that after the end of my first HTTP request (after 60 seconds) my 6th request is launched...
NSOperation implementation:
Create a queue and set maxConcurrentOperationCount to 12. Add 12 HTTP-request operations to the queue. Here as well I notice that the 6th request gets a -1001 error code (meaning: timeout), and I see no TCP SYN for the 6th request.
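For what it's worth, the behavior described (no TCP SYN for the 6th request) looks consistent with a per-host connection limit. In the modern URLSession API that limit is an explicit, adjustable setting; the sketch below is illustrative only:

```swift
import Foundation

// Raise the per-host connection cap and use a generous timeout for a slow test server.
let config = URLSessionConfiguration.default
config.httpMaximumConnectionsPerHost = 12
config.timeoutIntervalForRequest = 80

let session = URLSession(configuration: config)
for i in 1...12 {
    let url = URL(string: "https://example.com/slow-endpoint?id=\(i)")!
    session.dataTask(with: url) { _, _, error in
        print("request \(i) finished, error: \(String(describing: error))")
    }.resume()
}
```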
I have created a project on GitHub so I can learn how to optimize networking for my iOS apps. I have used blocks and GCD heavily and after watching WWDC 2012 videos and videos from past years I learned that I may be able to do more with NSOperationQueue. Specifically I can control the number of concurrent operations (network connections) as well as provide cancellation of operations. I am experimenting with allowing 1, 2, 4, 8 and 16 concurrent operations and I am seeing interesting results that I did not totally expect. I am measuring the results but I wonder if there is more that I should be measuring.
You can find sample project here:
https://github.com/brennanMKE/OptimizedNetworking
Since I am using the async API of NSURLConnection, there is plenty of benefit to having many concurrent connections, because the API spends a fair amount of time waiting for HTTP packets. Previously my code would start with an array of items to download and request them all sequentially, which prevents the benefits of concurrency. I have also been using notifications to cancel network connections. Now I can do that with this project through operations, and I have set them up to use a value for priority and a category so that I can prioritize and sort downloads and cancel a category of operations. I may choose to use a category for each view, and when a user leaves a view, all operations for that view will be cancelled using the category. This will free up resources for the active view.
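As a purely hypothetical sketch of what category-based cancellation might look like (the `category` field is an invented property on an imagined Operation subclass, not part of the sample project):

```swift
import Foundation

// Hypothetical operation subclass tagged with a category (e.g. the owning view).
class CategorizedOperation: Operation {
    let category: String
    init(category: String) {
        self.category = category
        super.init()
    }
    override func main() {
        // Perform the network request here, checking isCancelled periodically.
    }
}

// Cancel everything belonging to a view's category, e.g. when that view disappears.
func cancelOperations(in queue: OperationQueue, category: String) {
    for case let op as CategorizedOperation in queue.operations where op.category == category {
        op.cancel()
    }
}
```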
One concern with using more concurrent operations is CPU usage as well as I/O, but I am not aware of a way to measure these values with iOS. The equivalent of the "w" command in iOS to show CPU usage could be useful. I am less concerned about I/O but measuring it would be more comprehensive.
My main issue with how I was doing networking was a responsive UI. I found that what I have been doing has made the UI sluggish. This new approach may help a great deal, but only if I keep the number of concurrent operations down. The optimal number of operations may vary by the type of connection (3G, WiFi, etc) so checking the connection type may lead to some optimizations.
If you are interested in better ways to speed up network communications in your app, please try out this sample project, suggest other ways that I can measure performance, and offer ways to further optimize communications. (Also note that I am referencing the Apple sample project MVCNetworking as well as the ASIHTTPRequest project.)
What I may do next is to total up the amount of data downloaded and keep a log of that amount along with the total time to complete the download.
The README file should help explain the project and how it works.
If this helps: Mugunth Kumar actually checks the type of connection using the Reachability class before setting the NSOperationQueue maximum concurrent operation count in MKNetworkKit.