Create a copy of an NSOperation in the failure block of the same operation - afnetworking

Is there a way to start an identical NSOperation when the currently executing NSOperation finishes?
I am trying to download a set of files using AFDownloadRequestOperation (a subclass of AFHTTPRequestOperation) on a queue with maxConcurrentOperationCount set to 1. If a download fails due to intermittent availability of the server, the code reaches the failure block. I would then like to add a copy of the failed operation back onto the operation queue.

Yes, you can copy it by doing:
AFHTTPRequestOperation *newOperation = [oldOperation copy];
Note these caveats from the documentation:
-copy and -copyWithZone: return a new operation with the NSURLRequest of the original. So rather than an exact copy of the operation at that particular instant, the copying mechanism returns a completely new instance, which can be useful for retrying operations.
A copy of an operation will not include the outputStream of the original.
Operation copies do not include completionBlock, as it often strongly captures a reference to self, which would otherwise have the unintuitive side effect of pointing to the original operation when copied.
The most important thing to note is that you'll need to reset your completion blocks after you make a copy.
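As a minimal sketch of what that retry could look like (the downloadQueue property and the enqueueDownloadOperation: method are illustrative, not part of AFNetworking):
// Sketch: enqueue a download and, on failure, enqueue a copy of it.
// downloadQueue is assumed to be an NSOperationQueue with maxConcurrentOperationCount = 1.
- (void)enqueueDownloadOperation:(AFDownloadRequestOperation *)operation
{
    __weak typeof(self) weakSelf = self;
    [operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *op, id responseObject) {
        // handle the downloaded file
    } failure:^(AFHTTPRequestOperation *op, NSError *error) {
        // -copy gives a fresh operation built from the original NSURLRequest,
        // but without the completion blocks, so they are re-attached here.
        AFDownloadRequestOperation *retryOperation = [(AFDownloadRequestOperation *)op copy];
        [weakSelf enqueueDownloadOperation:retryOperation];
    }];
    [self.downloadQueue addOperation:operation];
}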

Related

Understanding NSBlockOperation

I'm getting into NSBlockOperation and I have some questions.
Notably, the documentation for addExecutionBlock says:
Discussion
The specified block should not make any assumptions about its execution environment.
Calling this method while the receiver is executing or has already finished causes an NSInvalidArgumentException exception to be thrown.
What kind of situation will throw NSInvalidArgumentException? What does "while the receiver is executing" really mean? What can cause this?
You can't use addExecutionBlock: to add an execution block while the operation is running or has already completed. That's all it means.
A block operation object can have zero or more execution blocks associated with it. When the block operation is started, all of its associated execution blocks are submitted for concurrent execution. The warning is that you can't add more execution blocks to the operation after this point.
You can create more block operation objects and add execution blocks to those. Each block operation is started separately from others, so the rule about adding more execution blocks is evaluated separately.
Typically, you would create a block operation, add whatever execution blocks to it that you want, and then queue the operation onto an operation queue. Once the operation has been queued, it might start at any time (subject to readiness, which is subject to dependencies). So, it's best to not attempt to add execution blocks once it's been queued.
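A small sketch of that typical flow (the block contents are just placeholders):
NSBlockOperation *blockOperation = [NSBlockOperation blockOperationWithBlock:^{
    NSLog(@"first block");
}];
// Fine: the operation has not been queued or started yet.
[blockOperation addExecutionBlock:^{
    NSLog(@"second block");
}];

NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[queue addOperation:blockOperation];

// From here on the operation may already be executing or finished, so calling
// -addExecutionBlock: again risks an NSInvalidArgumentException.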

objective-c, possible to queue async NSURLRequests?

I realize this question sounds contradictory. I have several Async requests going out in an application. The situation is that the first async request is an authentication request, and the rest will use an access token returned by the successful authentication request.
The two obvious solutions would be:
Run them all synchronously and risk blocking the UI (bad choice).
Run them async and put requests 2 through N in the completion handler of the first one (not practical).
The trouble is that the subsequent requests may be handled anywhere in the project, at anytime. The failure case would be if the 2nd request was called immediately after the 1st authentication request was issued, and before the access token was returned.
My question thus is, is there any way to queue up Async requests, or somehow say not to issue them until the first request returns successfully?
EDIT:
Why (2) is not practical: The first is an authentication request, happening when the app loads. The 2nd+ may occur right away, in which case it is practical, but they may also occur in a completely separate class or any other part of a large application. I can't realistically put the entire application in the completion handler. Other API requests may be made from other classes, and at any time, even 1-2 days later after many other things have occurred.
SOLUTION:
// pseudocode using a semaphore lock on the authentication call to block all other calls until it completes

// at start of auth
_semaphore = dispatch_semaphore_create(0);

// at start of every other API call
if (_accessToken == nil && ![_apiCall isEqualToString:@"auth"]) {
    dispatch_semaphore_wait(_semaphore, DISPATCH_TIME_FOREVER);
}

// at end of auth: store the token before signaling so waiting callers see it
_accessToken = ...;
dispatch_semaphore_signal([[SFApi Instance] semaphore]);
This sounds like a case where you'd want to use NSOperation dependencies.
From the Apple docs:
Operation Dependencies
Dependencies are a convenient way to execute operations in a specific order. You can add and remove dependencies for an operation using the addDependency: and removeDependency: methods. By default, an operation object that has dependencies is not considered ready until all of its dependent operation objects have finished executing. Once the last dependent operation finishes, however, the operation object becomes ready and able to execute.
Note that in order for this to work, you must subclass NSOperation "properly" with respect to KVO compliance:
The NSOperation class is key-value coding (KVC) and key-value observing (KVO) compliant for several of its properties. As needed, you can observe these properties to control other parts of your application.
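A minimal sketch of the idea, using plain block operations for brevity (a real async auth request would need a properly KVO-compliant concurrent NSOperation subclass, as noted above):
NSOperationQueue *queue = [[NSOperationQueue alloc] init];

NSOperation *authOperation = [NSBlockOperation blockOperationWithBlock:^{
    // fetch and store the access token
}];
NSOperation *apiOperation = [NSBlockOperation blockOperationWithBlock:^{
    // uses the access token
}];

// apiOperation will not be considered ready until authOperation has finished.
[apiOperation addDependency:authOperation];

[queue addOperation:authOperation];
[queue addOperation:apiOperation];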
You can't really have it both ways; there's no built-in serialization for the NSURLConnection stuff. However, you are probably already funneling all of your API requests through some common class anyway (presumably you're not making raw network calls willy-nilly all over the app).
You'll need to build the infrastructure inside that class that prevents the execution of the later requests until the first request has completed. This suggests some sort of serial dispatch queue that all requests (including the initial auth step) are funneled through. You could do this via dependent NSOperations, as is suggested elsewhere, but it doesn't need to be that explicit. Wrapping the requests in a common set of entry points will allow you to do this any way you want behind the scenes.
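For illustration, a rough sketch of such a funnel behind a hypothetical wrapper class (the APIClient name, the method names, and the queue label are not from the question):
@interface APIClient : NSObject
- (void)enqueueRequest:(NSURLRequest *)request
            completion:(void (^)(NSData *data, NSError *error))completion;
@end

@implementation APIClient {
    dispatch_queue_t _requestQueue;
}

- (instancetype)init
{
    if ((self = [super init])) {
        // A serial queue: requests run strictly one after another.
        _requestQueue = dispatch_queue_create("com.example.api.requests", DISPATCH_QUEUE_SERIAL);
    }
    return self;
}

- (void)enqueueRequest:(NSURLRequest *)request
            completion:(void (^)(NSData *data, NSError *error))completion
{
    dispatch_async(_requestQueue, ^{
        // The block does not return until the request completes, so the auth
        // request submitted first finishes before any later request starts.
        NSURLResponse *response = nil;
        NSError *error = nil;
        NSData *data = [NSURLConnection sendSynchronousRequest:request
                                             returningResponse:&response
                                                         error:&error];
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion) completion(data, error);
        });
    });
}
@end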
In cases like this I always find it easiest to write the code synchronously and get it running on the UI thread first, correctly, just for debugging. Then, move the operations to separate threads and make sure you handle concurrency.
In this case the perfect mechanism for concurrency is a semaphore: the authentication operation signals the semaphore when it is done, and all the other operations block on it. Once authentication is done, the floodgates are open.
The relevant functions are dispatch_semaphore_create() and dispatch_semaphore_wait() from the Grand Central Dispatch documentation: https://developer.apple.com/library/ios/documentation/Performance/Reference/GCD_libdispatch_Ref/Reference/reference.html#//apple_ref/doc/uid/TP40008079-CH2-SW2
Another excellent solution is to create a queue with a barrier:
A dispatch barrier allows you to create a synchronization point within a concurrent dispatch queue. When it encounters a barrier, a concurrent queue delays the execution of the barrier block (or any further blocks) until all blocks submitted before the barrier finish executing. At that point, the barrier block executes by itself. Upon completion, the queue resumes its normal execution behavior.
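A rough sketch of the barrier idea (the queue and the methods it dispatches are illustrative):
dispatch_queue_t apiQueue = dispatch_queue_create("com.example.api", DISPATCH_QUEUE_CONCURRENT);

// Submit the auth request as a barrier: blocks submitted after it
// will not run until the barrier block has finished.
dispatch_barrier_async(apiQueue, ^{
    [self authenticateAndStoreTokenSynchronously];   // hypothetical blocking auth call
});

// Later requests queue up behind the barrier, then run concurrently.
dispatch_async(apiQueue, ^{
    [self fetchPlayerInfo];   // hypothetical API call
});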
Looks like you got it running with a semaphore, nicely done!
Use blocks... 2 ways that I do it:
First, a block inside of a block...
[myCommKit getPlayerInfoWithCallback:^(ReturnCode returnCode, NSDictionary *playerInfo) {
    if (playerInfo) {
        // this won't run until the first call has finished
        [myCommKit adjustSomething:thingToAdjust withCallback:^(ReturnCode returnCode, NSDictionary *successCode) {
            if (successCode) {
                // this won't run until both the first and the second call have finished
            }
        }];
    }
}];
// don't be confused: anything down here will run immediately!
Second way is a method inside of a block
[myCommKit getPlayerInfoWithCallback:^(ReturnCode returnCode, NSDictionary *playerInfo) {
    if (playerInfo) {
        [self doNextThingAlsoUsingBlocks];
    }
}];
Either way, any time I do async communication with my server I use blocks. You have to think differently when writing code that communicates with a server: force things to happen in the order you want, and wait for the success/fail result before doing the next thing. Getting used to blocks is the right way to think about it. It could be 15 seconds between when you start the block and when the callback fires and executes the code inside, and it might never come back at all if the user is offline or there's a server outage.
Bonus way.. I've also sometimes done things using stages:
switch (serverCommunicationStage) {
    case FIRST_STAGE:
    {
        serverCommunicationStage = FIRST_STAGE_WAITING;
        // either have a block in here or call a method that takes a block
        [myCommKit getPlayerInfoWithCallback:^(ReturnCode returnCode, NSDictionary *playerInfo) {
            // in the callback of this async call
            serverCommunicationStage = SECOND_STAGE;
        }];
        break;
    }
    case FIRST_STAGE_WAITING:
    {
        // this just waits for the first step to complete
        break;
    }
    case SECOND_STAGE:
    {
        // either have a block in here or call a method that has a block
        break;
    }
}
Then in your draw loop or somewhere keep calling this method. Or set up a timer to call it every 2 seconds or whatever makes sense for your application. Just make sure to manage the stages properly. You don't want to accidentally keep calling the request over and over. So make sure to set the stage to waiting before you enter the block for the server call.
I know this might seem like an older school method. But it works fine.
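If you go the timer route mentioned above, a minimal sketch (the serverTimer property and the selector name are assumptions, not from the original answer):
// Drive the stage machine every 2 seconds; -advanceServerCommunication is
// assumed to contain the switch statement above.
self.serverTimer = [NSTimer scheduledTimerWithTimeInterval:2.0
                                                    target:self
                                                  selector:@selector(advanceServerCommunication)
                                                  userInfo:nil
                                                   repeats:YES];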

Is [NSOperationQueue mainQueue] guaranteed to be serial?

That is, if we queue the same thing several times there will be no concurrency.
The one we queued first will be executed first.
I mean there is only one main thread right?
I have found a nice answer here:
NSOperationQueue and concurrent vs non-concurrent
So to make all added operations serial, you can always set:
[[NSOperationQueue mainQueue] setMaxConcurrentOperationCount:1];
And the answer is... YES and NO
When you create a new NSOperation to add to your queue, you can use
- (void)setQueuePriority:(NSOperationQueuePriority)priority
According to the documentation, the queue will use this priority, along with other factors such as inter-operation dependencies, to decide which operation will be executed next.
As long as your operations have the same priority and no inter-operation dependencies, they should be executed in the same order you added them, possibly with other, system-related operations inserted between them.
From documentation:
The NSOperationQueue class regulates the execution of a set of NSOperation objects. After being added to a queue, an operation remains in that queue until it is explicitly canceled or finishes executing its task. Operations within the queue (but not yet executing) are themselves organized according to priority levels and inter-operation object dependencies and are executed accordingly. An application may create multiple operation queues and submit operations to any of them.
Inter-operation dependencies provide an absolute execution order for operations, even if those operations are located in different operation queues. An operation object is not considered ready to execute until all of its dependent operations have finished executing. For operations that are ready to execute, the operation queue always executes the one with the highest priority relative to the other ready operations. For details on how to set priority levels and dependencies, see NSOperation Class Reference.
About threads:
Although you typically execute operations by adding them to an operation queue, doing so is not required. It is also possible to execute an operation object manually by calling its start method, but doing so does not guarantee that the operation runs concurrently with the rest of your code. The isConcurrent method of the NSOperation class tells you whether an operation runs synchronously or asynchronously with respect to the thread in which its start method was called. By default, this method returns NO, which means the operation runs synchronously in the calling thread.
When you submit a nonconcurrent operation to an operation queue, the queue itself creates a thread on which to run your operation. Thus, adding a nonconcurrent operation to an operation queue still results in the asynchronous execution of your operation object code.
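To illustrate the ordering behaviour described above, a small sketch (the logged strings are just placeholders):
// Equal priority, no dependencies: these run one at a time, in order.
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
    NSLog(@"first");
}];
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
    NSLog(@"second");   // logged after "first"
}];

// A higher-priority operation may be chosen ahead of other ready operations.
NSOperation *important = [NSBlockOperation blockOperationWithBlock:^{
    NSLog(@"urgent");
}];
important.queuePriority = NSOperationQueuePriorityHigh;
[[NSOperationQueue mainQueue] addOperation:important];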
So, if I understand correctly, there will be no concurrency.

Clarifications needed for concurrent operations, NSOperationQueue and async APIs

This is a two part question. Hope someone could reply with a complete answer.
NSOperations are powerful objects. They can be of two different types: non-concurrent or concurrent.
The first type runs synchronously. You can take advantage of non-concurrent operations by adding them to an NSOperationQueue. The queue creates a thread (or threads) for you, with the result that the operation runs in a concurrent manner. The only caveat regards the lifecycle of such an operation: when its main method finishes, it is removed from the queue. This can be a problem when you deal with async APIs.
Now, what about concurrent operations? From Apple doc
If you want to implement a concurrent operation—that is, one that runs asynchronously with respect to the calling thread—you must write additional code to start the operation asynchronously. For example, you might spawn a separate thread, call an asynchronous system function, or do anything else to ensure that the start method starts the task and returns immediately and, in all likelihood, before the task is finished.
This is almost clear to me: they run asynchronously, but you must take the appropriate actions to ensure that they do.
What is not clear to me is the following. The docs say:
Note: In OS X v10.6, operation queues ignore the value returned by isConcurrent and always call the start method of your operation from a separate thread.
What does this really mean? What happens if I add a concurrent operation to an NSOperationQueue?
Then, in this post Concurrent Operations, concurrent operations are used to download some HTTP content by means of NSURLConnection (in its async form). Operations are concurrent and included in a specific queue.
UrlDownloaderOperation * operation = [UrlDownloaderOperation urlDownloaderWithUrlString:url];
[_queue addOperation:operation];
Since NSURLConnection requires a run loop, the author shunts the start method onto the main thread (so I suppose adding the operation to the queue has spawned a different one). In this manner, the main run loop can invoke the delegate methods implemented in the operation.
- (void)start
{
    if (![NSThread isMainThread]) {
        [self performSelectorOnMainThread:@selector(start) withObject:nil waitUntilDone:NO];
        return;
    }

    [self willChangeValueForKey:@"isExecuting"];
    _isExecuting = YES;
    [self didChangeValueForKey:@"isExecuting"];

    NSURLRequest *request = [NSURLRequest requestWithURL:_url];
    _connection = [[NSURLConnection alloc] initWithRequest:request
                                                  delegate:self];
    if (_connection == nil)
        [self finish];
}

- (BOOL)isConcurrent
{
    return YES;
}

// delegate methods here...
My question is the following. Is this thread safe? The run loop listens for sources but invoked methods are called in a background thread. Am I wrong?
Edit
I've completed some tests on my own based on the code provided by Dave Dribin (see 1). I've noticed, as you wrote, that callbacks of NSURLConnection are called in the main thread.
OK, but now I'm still very confused. I'll try to explain my doubts.
Why include, within a concurrent operation, an async pattern whose callbacks are called on the main thread? Shunting the start method to the main thread means the callbacks execute on the main thread too, so what is the point of the queues and operations? Where do I take advantage of the threading mechanisms provided by GCD?
Hope this is clear.
This is kind of a long answer, but the short version is that what you're doing is totally fine and thread safe since you've forced the important part of the operation to run on the main thread.
Your first question was, "What happens if I add a concurrent operation in a NSOperationQueue?" As of iOS 4, NSOperationQueue uses GCD behind the scenes. When your operation reaches the top of the queue, it gets submitted to GCD, which manages a pool of private threads that grows and shrinks dynamically as needed. GCD assigns one of these threads to run the start method of your operation, and guarantees this thread will never be the main thread.
When the start method finishes in a concurrent operation, nothing special happens (which is the point). The queue will allow your operation to run forever until you set isFinished to YES and do the proper KVO willChange/didChange calls, regardless of the calling thread. Typically you'd make a method called finish to do that, which it looks like you have.
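For reference, a typical -finish might look roughly like this (a sketch, not the code from the question; it assumes _isExecuting and _isFinished ivars back the overridden getters):
- (void)finish
{
    [self willChangeValueForKey:@"isExecuting"];
    [self willChangeValueForKey:@"isFinished"];
    _isExecuting = NO;
    _isFinished = YES;
    [self didChangeValueForKey:@"isExecuting"];
    [self didChangeValueForKey:@"isFinished"];
}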
All this is fine and well, but there are some caveats involved if you need to observe or manipulate the thread on which your operation is running. The important thing to remember is this: don't mess with threads managed by GCD. You can't guarantee they'll live past the current frame of execution, and you definitely can't guarantee that subsequent delegate calls (i.e., from NSURLConnection) will occur on the same thread. In fact, they probably won't.
In your code sample, you've shunted start off to the main thread so you don't need to worry much about background threads (GCD or otherwise). When you create an NSURLConnection it gets scheduled on the current run loop, and all of its delegate methods will get called on that run loop's thread, meaning that starting the connection on the main thread guarantees its delegate callbacks also happen on the main thread. In this sense it's "thread safe" because almost nothing is actually happening on a background thread besides the start of the operation itself, which may actually be an advantage because GCD can immediately reclaim the thread and use it for something else.
Let's imagine what would happen if you didn't force start to run on the main thread and just used the thread given to you by GCD. A run loop can potentially hang forever if its thread disappears, such as when it gets reclaimed by GCD into its private pool. There are some techniques floating around for keeping a thread alive (such as adding an empty NSPort), but they don't apply to threads created by GCD, only to threads you create yourself and can guarantee the lifetime of.
The danger here is that under light load you actually can get away with running a run loop on a GCD thread and think everything is fine. Once you start running many parallel operations, especially if you need to cancel them midflight, you'll start to see operations that never complete and never deallocate, leaking memory. If you wanted to be completely safe, you'd need to create your own dedicated NSThread and keep the run loop going forever.
In the real world, it's much easier to do what you're doing and just run the connection on the main thread. Managing the connection consumes very little CPU and in most cases won't interfere with your UI, so there's very little to gain by running the connection completely in the background. The main thread's run loop is always running and you don't need to mess with it.
It is possible, however, to run an NSURLConnection entirely in the background using the dedicated-thread method described above. For an example, check out JXHTTP, in particular the classes JXOperation and JXURLConnectionOperation.
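A minimal sketch of that dedicated-thread technique (class structure and method names are illustrative):
// One long-lived thread whose run loop is kept alive by an NSMachPort;
// every connection is scheduled on it instead of on a GCD thread.
+ (NSThread *)networkThread
{
    static NSThread *thread = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        thread = [[NSThread alloc] initWithTarget:self
                                         selector:@selector(networkThreadMain)
                                           object:nil];
        [thread start];
    });
    return thread;
}

+ (void)networkThreadMain
{
    @autoreleasepool {
        // The port keeps the run loop from exiting when it has no other sources.
        [[NSRunLoop currentRunLoop] addPort:[NSMachPort port] forMode:NSDefaultRunLoopMode];
        [[NSRunLoop currentRunLoop] run];
    }
}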

In GCD, is there a way to tell if the current queue is concurrent or not?

In GCD, is there a way to tell if the current queue is concurrent or not?
I'm currently attempting to perform a delayed save on some managed object contexts, but I need to make sure that the queue the code is currently executing on is thread-safe (i.e., a serial queue).
If you actually have to determine whether the queue passed in to you is serial or concurrent, you've almost certainly designed things incorrectly. Typically, an API will hide an internal queue as an implementation detail (in your case, your shared object contexts) and then enqueue operations against that internal queue in order to achieve thread safety. When your API takes a block and a queue as parameters, however, the assumption is that the passed-in block can be safely scheduled (async) against the passed-in queue (when, say, an operation is complete) and that the rest of the code is factored appropriately.
Yes, assuming you're doing the work in an NSOperation subclass:
[myOperation isConcurrent] //or self, if you're actually in the NSOperation
If you need to ensure some operations are always executed synchronously, you can create a specific operation queue and set its maximum concurrent operations to 1.
NSOperationQueue * synchronousQueue = [[NSOperationQueue alloc] init];
[synchronousQueue setMaxConcurrentOperationCount:1];
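Used like this, the queue guarantees the operations run one at a time (the block contents are placeholders):
[synchronousQueue addOperationWithBlock:^{
    // first save
}];
[synchronousQueue addOperationWithBlock:^{
    // runs only after the previous block has finished
}];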
GCD takes some planning ahead. The only other way I can think of is to observe the value isExecuting (or similar) on your NSOperation objects. Check out this reference on that. This solution would be more involved, so I hope the other one works for you.

Resources