Limit to number of simultaneous NSURLConnection Requests in iOS 5?

In iOS 5 only, an application I'm working on seems to drop requests when I send a large number of requests asynchronously. "Large" still means fairly small; I'm testing with 30 right now.
Each request uses its own NSURLConnection and is expected to return relatively quickly (i.e. 300 ms to 2 s). We also keep one connection open for a long period of time.
It seems like every 5th connection fails to leave the device. It certainly never reaches the server, and network debugging using Charles doesn't even show the requests going out.
I'm wondering if anyone knows of a limit to the number of simultaneous open NSURLConnection objects and active requests?
It's worth noting that we do not see this issue in iOS 4. It also seems that if we kill our long-lived connection, then we don't drop every 5th request any more.

NSOperationQueue is the best way to manage this problem. In a nutshell, you wrap your NSURLConnection in an NSBlockOperation, then add it to the queue. The queue allows you to set various properties such as the maximum number of simultaneous connections, and also gives you an easy way to cancel queued operations.
There is a good intro to this design pattern in a WWDC video (2012) called "Building Concurrent User Interfaces on iOS".
In iOS 5 you can use the following call to start an asynchronous NSURLConnection, with its completion handler dispatched onto an NSOperationQueue:
+ (void)sendAsynchronousRequest:(NSURLRequest *)request queue:(NSOperationQueue *)queue completionHandler:(void (^)(NSURLResponse*, NSData*, NSError*))handler
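As a reference point, here is a minimal, hedged sketch of that call (the URL and queue names are placeholders). Note that the queue argument is where the completion handler runs; limiting the number of simultaneous connections still relies on wrapping the requests in operations as described above.

    // Start an asynchronous request; the completion handler is dispatched onto handlerQueue.
    NSURL *url = [NSURL URLWithString:@"https://example.com/resource"];   // placeholder URL
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    NSOperationQueue *handlerQueue = [[NSOperationQueue alloc] init];

    [NSURLConnection sendAsynchronousRequest:request
                                       queue:handlerQueue
                           completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
        if (error) {
            NSLog(@"Request failed: %@", error);
        } else {
            NSLog(@"Received %lu bytes", (unsigned long)data.length);
        }
    }];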

Related

Async NSURLConnection triggering other Async NSURLConnection: what is the best way of doing this?

This is an open question aimed at understanding the best practice, or the most common solution, for a problem that I think might be common.
Let's say I have a list of URLs to download; the list is itself hosted on a server, so I start an NSURLConnection that downloads it. The code in connectionDidFinishLoading uses the list of URLs to instantiate one new NSURLConnection, asynchronously, for each URL; these in turn trigger even more NSURLConnections, and so on, until there are no more URLs. See it as a tree of connections.
What is the best way to detect when all connections have finished?
I'm aiming the question at iOS 7, but comments about other versions are welcome.
A couple of thoughts:
In terms of triggering the subsequent downloads after you retrieve the list from the server, just put the logic to perform those subsequent downloads inside the completion handler block (or completion delegate method) of the first request.
In terms of downloading a bunch of files, if targeting iOS 7 and later, you might consider using NSURLSession instead of NSURLConnection.
First, download tasks (rather than data tasks) let you download files with a nice, modest memory footprint.
Second, you can do the downloads using a background NSURLSessionConfiguration, which will let the downloads continue even if the user leaves the app. See the Downloading Content in the Background section of the App Programming Guide for iOS. There are a lot of i's that need dotting and t's that need crossing if you do this, but it's a great feature to consider implementing.
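On iOS 7 that might look roughly like the sketch below (the session identifier and URL are placeholders; on iOS 8 and later the configuration call was renamed backgroundSessionConfigurationWithIdentifier:). Background sessions require delegate-based callbacks, so the delegate here is assumed to implement NSURLSessionDownloadDelegate, in particular URLSession:downloadTask:didFinishDownloadingToURL:.

    // A background session keeps downloads going even after the user leaves the app.
    NSURLSessionConfiguration *config =
        [NSURLSessionConfiguration backgroundSessionConfiguration:@"com.example.assetDownloads"]; // placeholder identifier
    NSURLSession *session = [NSURLSession sessionWithConfiguration:config
                                                          delegate:self      // assumed to implement NSURLSessionDownloadDelegate
                                                     delegateQueue:nil];
    NSURLSessionDownloadTask *task =
        [session downloadTaskWithURL:[NSURL URLWithString:@"https://example.com/assets/1.zip"]]; // placeholder URL
    [task resume];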
See WWDC 2013 What's New in Foundation Networking for an introduction to NSURLSession, or see the relevant chapter of the URL Loading System Programming Guide.
In terms of keeping track of whether you're done, as Wain suggests, you can keep track of the number of requests issued and the number of requests completed/failed; in your "task completion" logic, compare these two numbers and initiate the "all done" logic when the number of completions matches the number of requests. There are a bunch of ways of doing this, somewhat dependent upon the details of your implementation, but hopefully this illustrates the basic idea.
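A minimal sketch of that bookkeeping, assuming an NSURLSession with data tasks and a counter bumped on the main queue (the URL list and the "all done" logic are placeholders):

    NSArray *urls = @[ @"https://example.com/a.json", @"https://example.com/b.json" ]; // placeholder list
    NSURLSession *session = [NSURLSession sharedSession];
    NSUInteger totalCount = urls.count;
    __block NSUInteger finishedCount = 0;   // completed and failed requests both count as finished

    for (NSString *urlString in urls) {
        NSURLSessionDataTask *task =
            [session dataTaskWithURL:[NSURL URLWithString:urlString]
                   completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            // Bump the counter on the main queue so there is no race between handlers.
            dispatch_async(dispatch_get_main_queue(), ^{
                finishedCount++;
                if (finishedCount == totalCount) {
                    // All requests have completed or failed; run the "all done" logic here.
                }
            });
        }];
        [task resume];
    }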
Instead of using GCD you should consider using NSOperationQueue. You should also limit the number of concurrent operations, certainly on mobile devices, to perhaps 4 so you don't flood the network with requests.
Now, the number of operations on the queue is the remaining count. You can add a block to the end of each operation to check the queue count and execute any completion logic.
As Rob says in his answer you might want to consider NSURLSession rather than doing this yourself. It has a number of advantages.
Other options are building your own download manager class or using a ready-made third-party framework like AFNetworking. I've only worked with AFNetworking a little bit, but from what I've seen it's elegant, powerful, and easy to use.
Our company wrote an async download manager class based on NSURLConnection for a project that predates both AFNetworking and NSURLSession. It's not that hard, but it isn't as flexible as either NSURLSession or AFNetworking.

Fastest way to send several HTTP Post request for iOS?

I need to send around 20 HTTP POST requests in my iOS application. Right now I am using NSURLConnection and sending the 20 requests one by one, which of course takes a long time. Each connection starts after the previous one finishes, taking around 7 seconds to complete all the requests. Is it possible to send these 20 requests simultaneously and receive the JSON results much faster?
You can use NSOperation and NSOperationQueue to prepare all of the requests and push them onto the queue at the same time. Then you can set the concurrent execution limit to determine how many run at the same time. Don't run all 20 at the same time, though, as you may flood the network and prevent any of the connections from completing properly. Try running 5 concurrently and see how it goes.
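A hedged sketch of that pattern. It uses the synchronous convenience API inside block operations so that maxConcurrentOperationCount actually limits the number of in-flight requests; the URLs and request body are placeholders.

    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    queue.maxConcurrentOperationCount = 5;   // start with 5 concurrent requests

    NSArray *postURLs = @[ [NSURL URLWithString:@"https://example.com/api/one"],
                           [NSURL URLWithString:@"https://example.com/api/two"] ];   // placeholder list of the 20 endpoints
    for (NSURL *url in postURLs) {
        [queue addOperationWithBlock:^{
            NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
            request.HTTPMethod = @"POST";
            request.HTTPBody = [@"{\"key\":\"value\"}" dataUsingEncoding:NSUTF8StringEncoding]; // placeholder body

            NSURLResponse *response = nil;
            NSError *error = nil;
            // A synchronous call is fine here because it runs on a queue worker thread,
            // never on the main thread; the queue limits how many run at once.
            NSData *data = [NSURLConnection sendSynchronousRequest:request
                                                 returningResponse:&response
                                                             error:&error];
            // Parse the JSON in `data` or inspect `error` here.
        }];
    }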

NSURLConnection and multiple asynchronous requests - is it messing with the data being transmitted?

I have an NSArray of links. I want to parse through them with an online article extractor API (Clear Read), and with the result given back for each article (some HTML) I throw it into an NSString.
My problem arises from the fact that, say, my array has 100 URLs in it; I loop through the array, sending each item to the API and getting back some results in JSON. This fires something like 100 NSURLConnection calls at once, asynchronously.
I wasn't sure if that would be a problem, but when I give it 100 URLs (real strings, none are nil) the data that comes back often has either empty values for the JSON keys (when it shouldn't), or the data coming back is nil. There are also a bunch of duplicates.
Should I be handling multiple asynchronous connections better than I am now? If so, how?
A couple of thoughts:
If you're doing concurrent asynchronous requests and are using asynchronous NSURLConnection, then you'll want to define your own class for this download operation to make sure that every connection keeps track of its own properties. That way, everything can be encapsulated within this class where the resulting download objects can keep track of what's downloaded, what's been parsed, etc. If you're not using asynchronous NSURLConnection (e.g. you're just using dataWithContentsOfURL), it's even easier, though you lose some of the progress updates that NSURLConnection provides and/or streaming opportunities.
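A rough sketch of what such a per-download class might look like (the class name, properties, and completion-block shape are assumptions, not a prescribed API; it also assumes the connection is started on a thread with a running run loop, such as the main thread):

    // Each instance owns one NSURLConnection and accumulates its own response data,
    // so concurrent downloads never share mutable state.
    @interface ArticleDownload : NSObject <NSURLConnectionDataDelegate>
    @property (nonatomic, strong) NSMutableData *responseData;
    @property (nonatomic, copy) void (^completion)(NSData *data, NSError *error);
    - (void)startWithURL:(NSURL *)url;
    @end

    @implementation ArticleDownload
    - (void)startWithURL:(NSURL *)url {
        self.responseData = [NSMutableData data];
        NSURLRequest *request = [NSURLRequest requestWithURL:url];
        (void)[[NSURLConnection alloc] initWithRequest:request delegate:self startImmediately:YES];
    }
    - (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
        [self.responseData appendData:data];   // only this instance's buffer is touched
    }
    - (void)connectionDidFinishLoading:(NSURLConnection *)connection {
        if (self.completion) self.completion(self.responseData, nil);
    }
    - (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error {
        if (self.completion) self.completion(nil, error);
    }
    @end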
For best performance, you should do concurrent requests. Having said that, you should not have more than four or five concurrent requests going to any particular server. This is an iOS-imposed constraint, and especially if you have a slow network connection, you risk having connections time out otherwise.
If you're doing preliminary testing on the simulator, you may want to make sure you try out the Network Link Conditioner. It's part of the "Hardware IO Tools for Xcode" package, available under Downloads for Apple Developers. There are issues (such as the aforementioned timeout problems if you have too many concurrent requests going to a particular server) that only manifest themselves on slow connections.
Having said that, you also want to make sure to test your solution on a device with real-world network speeds. It's easy to run massively parallel tasks successfully on the simulator that are too greedy for the device. Limiting the number of concurrent sessions to five will diminish this resource problem, but it should be part of your testing strategy.
I agree with JRG-Developer, that you should look into established frameworks, such as AFNetworking. Make sure to set the maxConcurrentOperationCount for the queue of the AFHTTPClient, though, if queueing 100 plus operations.
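With an AFNetworking 1.x era AFHTTPClient, that limit might be set roughly like this (the base URL, path, and limit are illustrative):

    AFHTTPClient *client = [[AFHTTPClient alloc] initWithBaseURL:
                               [NSURL URLWithString:@"https://api.example.com/"]];   // placeholder base URL
    // Cap the number of simultaneous request operations so 100-plus queued requests
    // don't all hit the server at once.
    client.operationQueue.maxConcurrentOperationCount = 5;

    [client getPath:@"articles/123"   // placeholder endpoint
         parameters:nil
            success:^(AFHTTPRequestOperation *operation, id responseObject) {
                // handle the response
            }
            failure:^(AFHTTPRequestOperation *operation, NSError *error) {
                // handle the error
            }];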
I don't know how much data your 100 requests entail, but be forewarned that the app approval process has been known to reject apps that make extraordinary network requests on cellular networks. What constitutes excessive cellular network activity is not explicitly stated in the App Review Guidelines, though Avoiding iPhone App Rejection From Apple claims you should ensure that you don't exceed 4.5 MB in 5 minutes. You can use Reachability to determine what type of network you are on and perhaps warn the user if they're on cellular (if the amount of data approaches this threshold).
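A small hedged sketch of that Reachability check, using Apple's Reachability sample class (the warning logic is only illustrative):

    #import "Reachability.h"   // Apple's Reachability sample class

    Reachability *reachability = [Reachability reachabilityForInternetConnection];
    if ([reachability currentReachabilityStatus] == ReachableViaWWAN) {
        // On cellular: consider warning the user, or deferring the bulk of the
        // 100 downloads until the device is on Wi-Fi.
    }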
Have you considered using a third party framework - such as AFNetworking - and limiting the number of asynchronous calls happening at once? Perhaps this might help / solve your problem.
In particular, you might consider creating a networking manager class that creates and manages AFHTTPClient(s), which in turn manages AFHTTPRequestOperations, for each endpoint (base URL) you hit.

How many simultaneous downloads make sense on iOS

I have an iOS app which synchronizes a certain number of assets at startup. I'm using AFNetworking and set up an NSOperationQueue to handle all of the downloads. I was wondering how many simultaneous downloads make sense. Is there a limit where network performance will drop if I have too many at the same time? At the moment I'm doing at most 5 downloads at a time.
This depends on several factors:
What is the network speed and latency?
What is the data size of the requests and responses?
How long does processing a request take on the server?
How long does processing a response take on the client?
How many parallel requests can the server fulfill efficiently?
How many users will make requests at the same time?
What is the minimal speed and memory size of the target device?
For small and medium sized applications, the limiting factor is usually the device's network latency, but that might not be the case in your situation. In the end, you'll have to test and figure out the most efficient compromise. 5 is a good number to start with.
You might want to set the number of concurrent downloads by the available network connection (WLAN or 3G or even slower...).
The beauty of using NSOperationQueues is that they are closely tied into the underlying OS (iOS or OS X). The queue decides how many operations to run based on many factors, including free memory, load on the system, and so on.
You should not try to second guess the system and throttle yourself. Queue as many operations as you have and let the OS deal with it. I have an iPhone app that adds hundreds of operations in the queue when it has to fetch images of varying sizes etc. Works great, UI is not blocked, etc.
EDIT: well, it seems that when doing NSURLConnections and similar network connections, NSOperationQueue is NOT really keyed in to network usage. I asked on the Apple internal forums this summer, and in the end was told by Quinn "The Eskimo" (Apple network guru) to use a limit of something like 4. So this post is correct in the sense of pure processing power - NSOperationQueue will do the right thing - but when it comes to network ops you need to set a limit.
Depends mostly on your hardware, I would say. The best way to address this is to test it with multiple cases and multiple trials. Try to diversify the hardware you test on as much as possible (and remember: do not use the simulator to test this!).
There actually is a constant the SDK provides that varies depending on various constraints. I would recommend you look into using it.
Regarding this question, I've done some tests on an iPad 2 running iOS 6.0. I created a little app that performs an HTTP GET request to a web server. This web server provides data for 60 seconds (this is to get a meaningful result; I will change it to 10 minutes later in my tests).
For one HTTP GET request it works very well. Then I tried to perform several HTTP requests at the same time to see how many, and how fast, I can download over the iPad's Wi-Fi connection.
I made 2 versions: 1 version using NSOperations and 1 version using NSThreads and synchronous HTTP GET requests. In short, I always get a timeout for my 6th request (the TCP SYN never reaches my HTTP server).
Extra info:
NSThread implementation:
Simply make a for loop and create a thread for each request; each thread performs a synchronous HTTP request.
There I observe that my 6th request times out after 20 seconds. If I set the timeout to 80 seconds, I clearly see that after the end of my first HTTP request (after 60 seconds) my 6th request is launched...
NSOperation implementation:
Create a queue and set maxConcurrentOperationCount to 12. Add 12 HTTP request operations to the queue. Here as well, the 6th request gets a -1001 error code (meaning: timeout), and I see no TCP SYN for the 6th request.
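A sketch of roughly what that NSOperation test looks like (the server URL and timeout are placeholders):

    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    queue.maxConcurrentOperationCount = 12;

    for (int i = 0; i < 12; i++) {
        [queue addOperationWithBlock:^{
            NSMutableURLRequest *request =
                [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://test-server.local/slow"]]; // placeholder
            request.timeoutInterval = 20.0;
            NSError *error = nil;
            [NSURLConnection sendSynchronousRequest:request returningResponse:NULL error:&error];
            // In the test described above, the 6th request fails here with
            // NSURLErrorTimedOut (-1001) because its connection never leaves the device.
            if (error) NSLog(@"Request %d failed: %@", i, error);
        }];
    }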

NSURLConnection getting limited to a Single Connection at a time?

OK, let's rephrase this whole question, shall we?
Is there any way to tell if iOS is holding onto an NSURLConnection after it has finished and returned its data?
I've got 2 NSURLConnections that I'm instantiating and using to call into a server. The first one initiates the connection with the server and then goes into a COMET-style long-polling wait while another user interacts with the request. The second one goes to the server and triggers a cancel mechanism, which safely ends the first request and causes both to return successfully with a "Cancelled by you" message.
In the happy path case the Cancel button will never be clicked. But it's possible to click it and exit the current action.
This whole scenario works GREAT once. And then never works again (until the app is reset).
It's as though the first time through, one of the connections is never released, and from then on we are limited to only a single connection because one of them is locked.
BTW, I've tried NSURLConnection, AFNetworking, MKNetworkKit, and ASIHTTPRequest; no luck whatsoever with any of these frameworks. NSURLConnection should do what I want. It's just... not letting go of one of my connections.
I suspect the cancellation request in Step 2 is leaving the HTTP connection open.
I don't know exactly how the NS* classes work with respect to the HTTP/1.1 recommendation of at most two simultaneous connections, but let's assume they're enforcing at most two connections. Let's suppose the triggering code in Instance A (steps 1 and 3 of your example) cleans up after itself, but the cancellation code in Instance B (steps 2 and 4) leaves the connection open. That might explain what you are observing.
If I were you, I'd compare the code that runs in step 1 against the code that runs in step 2. I bet there's a difference between them in terms of the way they clean up after themselves.
If I'm not wrong,
iOS/Mac holds on to an NSURLConnection for as long as the "Keep-Alive" header dictates.
But as an iOS developer you shouldn't be worried. Any reason why you would like to know that?
So, unfortunately, with no real solution to this issue found in all my testing, I've had to implement simple polling to work around it.
I've also had to implement iOS only APIs on the server.
What this comes down to is an API to send up a command and put it into a queue on the server, then using an NSTimer on the client to check the status of the queued item at a regular interval.
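A minimal sketch of that client-side polling (the interval, endpoint, and the pollTimer property are assumptions for illustration):

    - (void)startPolling {
        // Poll the server every 2 seconds for the status of the queued command.
        self.pollTimer = [NSTimer scheduledTimerWithTimeInterval:2.0
                                                          target:self
                                                        selector:@selector(checkQueuedCommandStatus)
                                                        userInfo:nil
                                                         repeats:YES];
    }

    - (void)checkQueuedCommandStatus {
        NSURL *url = [NSURL URLWithString:@"https://api.example.com/commands/123/status"]; // placeholder endpoint
        [NSURLConnection sendAsynchronousRequest:[NSURLRequest requestWithURL:url]
                                           queue:[NSOperationQueue mainQueue]
                               completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
            // Inspect the returned status; once the command is done, stop polling:
            // [self.pollTimer invalidate];
        }];
    }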
Until I can find out how to make multiple connections on iOS with long-polling this is the only working solution. Once I have a decent amount of points I'll gladly bounty them away for a solution to this :(
