Waiting for two NSOperations to finish without blocking the UI thread - ios

I just read long introduction to NSOperationQueues and NSOperation here.
My question is the following. I need to run two operations at the same time. When both of those tasks have finished, I need to make further calculations based on the results from the two finished operations. If one of the operations fails, then the whole operation should also fail. These two operations do not have dependencies and are completely independent from each other, so we can run them in parallel.
How do I wait for these 2 operations to finish and then continue with the calculations? I don't want to block the UI thread. Should I make another NSOperation whose main method creates the two NSOperations, adds them to some local (to this operation) queue, and waits with the waitUntilAllOperationsAreFinished method, then continues with the calculations?
What I don't like about this approach is that I need to create a local queue every time I create a new operation. Can I design it so that I reuse one queue but wait only for the two local operations? I imagine that waitUntilAllOperationsAreFinished waits until all tasks are done, so it will block when a lot of tasks are being performed in parallel. Any design advice? Is creating an NSOperationQueue expensive? Is there any better way to do this on iOS without using NSOperation & NSOperationQueue? I'm targeting iOS 9+ devices.

In Swift 4, you can do it this way:
```swift
let group = DispatchGroup()

// Enter the group before each task starts, so that `notify` below
// cannot fire while the group is still (momentarily) empty.
group.enter()
taskA(onCompletion: { _ in
    // leave the group when task A finishes
    group.leave()
})

group.enter()
taskB(onCompletion: { _ in
    group.leave()
})

group.notify(queue: DispatchQueue.main) {
    // do something on task A & B completion
}
```
And there is an excellent tutorial on GCD from raywenderlich.com.
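Since the question specifically asks about NSOperation, the same no-blocking pattern can also be built with one reusable OperationQueue and a dependent "combine" operation. This is a sketch: the print statements stand in for your real task A, task B, and combining work.

```swift
import Foundation

let queue = OperationQueue()  // one shared queue, reused for every batch

let opA = BlockOperation { print("task A") }  // placeholder for real work
let opB = BlockOperation { print("task B") }  // placeholder for real work

// The combining step runs only after BOTH opA and opB have finished,
// and nothing ever blocks waiting for them.
let combine = BlockOperation { print("combine results of A and B") }
combine.addDependency(opA)
combine.addDependency(opB)

queue.addOperations([opA, opB, combine], waitUntilFinished: false)
```

Because the dependencies encode the ordering, the queue itself can stay concurrent and be reused for unrelated work without affecting this batch.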

Related

Swift equivalent of Ruby’s Concurrent::Event?

The popular Concurrent-Ruby library has a Concurrent::Event class that I find wonderful. It very neatly encapsulates the idea of, “Some threads need to wait for another thread to finish something before proceeding.”
It only takes three lines of code to use:
- one to create the object,
- one to call .wait to start waiting, and
- one to call .set when the thing is ready.
All the locks and booleans you’d need to use to create this out of other concurrency primitives are taken care of for you.
To quote some of the documentation, along with a sample usage:
Old school kernel-style event reminiscent of Win32 programming in C++. When an Event is created it is in the unset state. Threads can choose to #wait on the event, blocking until released by another thread. When one thread wants to alert all blocking threads it calls the #set method which will then wake up all listeners. Once an Event has been set it remains set. New threads calling #wait will return immediately.
```ruby
require 'concurrent-ruby'

event = Concurrent::Event.new

t1 = Thread.new do
  puts "t1 is waiting"
  event.wait
  puts "event occurred"
end

t2 = Thread.new do
  puts "t2 calling set"
  event.set
end

[t1, t2].each(&:join)
```
which prints output like the following:
```
t1 is waiting
t2 calling set
event occurred
```
(Several different orders are possible because it is multithreaded, but 't2 calling set' always comes out before 'event occurred'.)
Is there something like this in Swift on iOS?
I think the closest thing to that is the new async/await syntax in Swift 5.5. There's no equivalent of event.set, but await waits for something asynchronous to finish. A particularly nice expression of concurrency is async let, which proceeds concurrently but then lets you pause to gather up all the results of the async let calls:
```swift
async let result1 = // do something asynchronous
async let result2 = // do something else asynchronous at the same time
// ... and keep going ...

// now let's gather up the results
return await (result1, result2)
```
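A complete, runnable version of that fragment might look like the following sketch, where fetchA and fetchB are hypothetical stand-ins for your own async work:

```swift
import Dispatch

// Stand-ins for real asynchronous work (hypothetical names).
func fetchA() async -> Int { 1 }
func fetchB() async -> String { "done" }

func runBoth() async -> (Int, String) {
    // Both child tasks start immediately and run concurrently.
    async let a = fetchA()
    async let b = fetchB()
    // Suspends here (without blocking a thread) until both have finished.
    return await (a, b)
}
```

If either child task throws, `try await` propagates the error and the remaining child is cancelled, which matches the "if one fails, the whole thing fails" behaviour people usually want.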
You can achieve the result in your example using a Grand Central Dispatch DispatchSemaphore. This is a traditional counting semaphore: each call to signal increments the semaphore, and each call to wait decrements it; if the resulting value is less than zero, the caller blocks until a matching signal arrives.
```swift
let semaphore = DispatchSemaphore(value: 0)
let q1 = DispatchQueue(label: "q1", target: .global(qos: .utility))
let q2 = DispatchQueue(label: "q2", target: .global(qos: .utility))

q1.async {
    print("q1 is waiting")
    semaphore.wait()
    print("event occurred")
}

q2.async {
    print("q2 calling signal")
    semaphore.signal()
}
```
Output:
```
q1 is waiting
q2 calling signal
event occurred
```
But this object won't work if you have multiple threads that want to wait: since each call to wait decrements the semaphore, the other tasks would remain blocked.
For that you could use a DispatchGroup. You call enter before you start a task in the group and leave when it is done. You can use wait to block until the group is empty, and like your Ruby object, wait will not block if the group is already empty and multiple threads can wait on the same group.
```swift
let group = DispatchGroup()
let q1 = DispatchQueue(label: "q1", target: .global(qos: .utility))
let q2 = DispatchQueue(label: "q2", target: .global(qos: .utility))

// Enter before anyone can wait, so the group is not momentarily empty.
group.enter()

q1.async {
    print("q1 is waiting")
    group.wait()
    print("event occurred")
}

q2.async {
    print("q2 calling leave")
    group.leave()
}
```
Output:
```
q1 is waiting
q2 calling leave
event occurred
```
You generally want to avoid blocking threads on iOS if possible as there is a risk of deadlocks and if you block the main thread your whole app will become non responsive. It is more common to use notify to schedule code to execute when the group becomes empty.
I understand that your code is simply a contrived example, but depending on what you actually want to do and your minimum supported iOS requirements, there may be better alternatives.
- DispatchGroup, to execute code when several asynchronous tasks are complete, using notify rather than wait
- Combine, to process asynchronous events in a pipeline (iOS 13+)
- Async/await (iOS 15+)
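With async/await available, the Ruby Event in the question can also be approximated directly. Here is a minimal one-shot event built on CheckedContinuation; this is a sketch (the OneShotEvent name is mine), assuming Swift 5.5+ concurrency support:

```swift
// A minimal one-shot event in the spirit of Concurrent::Event.
actor OneShotEvent {
    private var isSet = false
    private var waiters: [CheckedContinuation<Void, Never>] = []

    // Suspends the caller until set() is called; returns at once if already set.
    func wait() async {
        if isSet { return }
        await withCheckedContinuation { waiters.append($0) }
    }

    // Wakes every current waiter; later wait() calls return immediately.
    func set() {
        isSet = true
        for w in waiters { w.resume() }
        waiters.removeAll()
    }
}
```

Like the Ruby class, any number of tasks can await the same event, and because it suspends rather than blocks, no thread is tied up while waiting.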

running asynchronous tasks in a synchronous order

I have a list of files I need to upload to my server.
I want to upload each file only if the file before it uploaded successfully.
I'm looking for an elegant way to implement this, for example using a coroutine-like feature.
So, is there a feature like coroutines in Swift? Is there any other elegant way to implement this?
Thanks
You could use an OperationQueue. Create one like this:
```swift
lazy var queue: OperationQueue = {
    let queue = OperationQueue()
    queue.maxConcurrentOperationCount = 1
    return queue
}()
```
and then add an operation to it like this:
```swift
self.queue.addOperation {
    // The code you want run in the background
}
```
Having set maxConcurrentOperationCount to 1, it operates as a serial queue, not running the next task until the current one has finished.
That's the most basic functionality, but there are all kinds of more advanced things you can do, so check out the documentation for OperationQueue.
Why not create a list of files, and in the completion handler of one upload just check whether the list is non-empty and kick off the next one? Rather than a complicated coroutine setup, I think it can be kept simpler by just maintaining an upload list/queue.
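That completion-handler chain could be sketched like this; the upload(file:completion:) function is a hypothetical placeholder (stubbed here so the snippet is self-contained) that you would replace with your real networking call:

```swift
import Dispatch

// Hypothetical upload API, stubbed for illustration.
func upload(file: String, completion: @escaping (Bool) -> Void) {
    DispatchQueue.global().async { completion(true) }
}

// Uploads the files one at a time, starting each only after the
// previous one succeeded; stops at the first failure.
func uploadAll(_ files: [String], completion: @escaping (Bool) -> Void) {
    guard let next = files.first else { return completion(true) }
    upload(file: next) { success in
        guard success else { return completion(false) }
        uploadAll(Array(files.dropFirst()), completion: completion)
    }
}
```

Each upload's completion handler triggers the next, so the files go out strictly one after another without blocking any thread in between.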
You can create a serial queue for that, using Grand Central Dispatch. Compared to the OperationQueue suggested in one of the answers given here, it has the advantage of saving quite an amount of overhead. Please see this post for details on that.
This creates the serial queue:
```swift
let serialQueue = DispatchQueue(label: "someQueueIdentifier", qos: .background)
```
Choose the quality of service parameter according to your needs.
Then, you could have a loop, for example, from which you place your background jobs into this queue:
```swift
while <condition> {
    serialQueue.async {
        // do background job
    }
}
```
A serial queue will run one job at a time and the whole queue will be executed asynchronously in the background.

Why do we need synchronous operations in iOS

We all know that asynchronous tasks are necessary for concurrency, but I wanted to know why we need synchronous tasks, when we can achieve the same with normal use of a function.
Thanks & regards
Rohit
When you call something synchronously, it means that the thread that initiated the operation will wait for the task to finish before continuing. Asynchronous means that it will not wait for the task to finish.
Synchronous calls stop your current action and return when the call has returned; with asynchronous calls you can continue.
Synchronous code is the opposite of asynchronous code, and is therefore just ordinary code. In the end, if asynchronous execution were totally out of scope, you would not emphasize the word synchronous at all.
It helps to synchronise threads, as the name suggests.
Consider a typical usage of GCD async and sync (pseudo-code):
```
async background_thread {
    // 1: call webservice or other long task that would block the main thread
    sync main_thread {
        // 2: update UI with results from 1
    }
    // 3: do something else that relies on 2
}
```
Now if 2 were in an async block and you needed to do something at 3 that relies on the updates at 2 having happened, then you are not guaranteed to (and most likely won't) get the behaviour you are expecting. Instead, you use a sync to make sure the task has completed before continuing execution on the background thread.
If you are asking now, "why not just take out the sync/async around 2 so it executes in order anyway?", the problem is that the UI must not be updated on a background thread, otherwise the behaviour is undefined (which usually means the UI lags a lot). So in essence the background thread waits at 2's sync until the main thread gets round to executing that block, then continues with the rest of the execution on the background thread.
If you were dealing with a task that doesn't require the main thread (or some other specific thread) to execute properly, then yes, you may as well take out the sync at 2.
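The ordering guarantee that sync provides can be demonstrated with real queues. In this sketch a custom serial queue stands in for DispatchQueue.main so the snippet runs anywhere; in an app you would sync to the main queue for the UI update:

```swift
import Dispatch

let uiQueue = DispatchQueue(label: "fake-ui-queue")  // stands in for DispatchQueue.main
var log: [Int] = []
let done = DispatchSemaphore(value: 0)

DispatchQueue.global().async {
    log.append(1)                   // 1: long-running background work
    uiQueue.sync { log.append(2) }  // 2: "UI" update; sync blocks until it has run
    log.append(3)                   // 3: runs only after 2 has completed
    done.signal()
}
done.wait()
```

Step 3 can rely on step 2 having happened precisely because the background thread parked at the sync call until the other queue executed the block.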
This is just one example of how a sync is useful, there are others if you are doing advanced threading in your app.
Hope this helps
Typically it's because you want to do an operation on a specific different thread but you need the result of that operation. You cannot do the operation asynchronously because your code will proceed before the operation on the other thread completes.
Apple has a very nice example:
```swift
func asset() -> AVAsset? {
    var theAsset: AVAsset!
    self.assetQueue.sync {
        theAsset = self.getAssetInternal().copy() as! AVAsset
    }
    return theAsset
}
```
Any thread might call the asset method; but to avoid problems with shared data, we require that only functions that are executed from a particular queue (self.assetQueue) may touch an AVAsset, so when we call getAssetInternal we do it on self.assetQueue. But we also need the result returned by our call to getAssetInternal; hence the call to sync rather than async.

NSOperationQueue - running synchronously

I need to make some API calls and I want to ensure that they come back in the order that they went out. Is this the proper flow to have that happen?
1. Create an NSOperationQueue, set max concurrent operations to 1
2. Create the URL string for the API
3. Create an NSOperation block, call the method that calls the API, passing the URL string
4. Add the NSOperation to the NSOperationQueue
This is where I get confused. Setting the max concurrent operations to 1 essentially makes the NSOperationQueue into a serial queue: only 1 operation gets run at a time. However, each operation is going to make an NSURLSession call, which is async. How can I ensure that the next operation doesn't run until I have finished with the first? (By "finish" I mean storing the returned JSON in an NSArray, adding each additional returned JSON result to that array.)
The proper way to ensure that NSOperations run in order is to add dependencies. Dependencies are powerful as they allow ordering of different operations on different queues. You can make an API call or data processing on a background queue; when complete, a dependent operation can update the UI on the main thread.
```swift
let operation1 = NSBlockOperation {
    print("Run First - API Call")
}
let operation2 = NSBlockOperation {
    print("Run Second - Update UI")
}
operation2.addDependency(operation1)

let backgroundQueue = NSOperationQueue()
backgroundQueue.addOperation(operation1)
NSOperationQueue.mainQueue().addOperation(operation2)

// operation1 will finish before operation2 is called,
// regardless of what queue they're in
```
See Apple Docs on addDependency in NSOperation here: https://developer.apple.com/library/mac/documentation/Cocoa/Reference/NSOperation_class/index.html#//apple_ref/occ/cl/NSOperation
Also, be careful with maxConcurrentOperationCount = 1: all it does is ensure that only 1 operation will run at a time. This does NOT ensure the order of the queue; an operation with a higher priority will likely run first.
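That priority caveat can be seen directly. In this sketch the queue is suspended while both operations are added, so both are ready when it resumes, and the high-priority one is expected to be picked first even though it was added second:

```swift
import Foundation

let queue = OperationQueue()
queue.maxConcurrentOperationCount = 1
queue.isSuspended = true            // hold the queue while both operations are added

var order: [String] = []
let low = BlockOperation { order.append("low") }
low.queuePriority = .low
let high = BlockOperation { order.append("high") }
high.queuePriority = .high

queue.addOperation(low)             // added first...
queue.addOperation(high)            // ...but `high` should still run first
queue.isSuspended = false
queue.waitUntilAllOperationsAreFinished()
```

If you need a guaranteed order rather than a likely one, use dependencies as shown above instead of relying on priorities.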

iOS dispatch_get_global_queue nested inside dispatch_get_main_queue

I've inherited a codebase that's using the following structure for threading:
```swift
dispatch_async(dispatch_get_main_queue(), { () -> Void in
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), { () -> Void in
        // Several AFNetworking Server calls...
    })
})
```
I'm not very experienced with threading, so I'm trying to figure out what the possible intention behind this structure is. Why grab the main queue only to access another queue immediately? Is this a common practice? For a little more context, this code is executed in a UIApplicationDidBecomeActiveNotification notification, making several necessary service calls.
Is this structure safe? Essentially my goal is to make the service calls without blocking the UI. Any help or input is appreciated.
So I think this is an interesting few lines that somebody decided to write, so let's break down what's happening here (I may be breaking things down too much, sorry in advance, it just helps my own train of thought)
dispatch_async(dispatch_get_main_queue(), dispatch_block_t block)
This will put the block as a task on the main queue (which the code is already running on), then immediately continue executing the rest of the method. (If he had wanted to wait for the block task to finish before continuing, he'd have made a dispatch_sync call instead.)
The main queue is serial, so it will perform these tasks exactly in this order:
1. finish the current method (the end of the run loop for the current task)
2. execute any other tasks that were already added asynchronously to the main queue before your block task was dispatched into it
3. execute the block task
Now block just dispatches another task to the high priority global queue.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), block2)
DISPATCH_QUEUE_PRIORITY_HIGH identifies a concurrent queue, so if you were to dispatch multiple tasks to it, it could potentially run them in parallel, depending on several system factors.
Your old co-worker wanted to make sure the networking calls in block2 were done ASAP
Because block calls dispatch_async (which returns immediately), the block task finishes, allowing the main queue to execute the next task in the queue.
The net result so far is that block2 is queued into the high priority global queue. After it executes, and your network calls complete, callback methods will be called and yadayada
...So what is the order of what's happening?
```swift
dispatch_async(dispatch_get_main_queue(), { () -> Void in
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), { () -> Void in
        // Several AFNetworking Server calls...
    })
})
// moreCode
```
1) moreCode executes
2) block executes (adds block2 with network calls onto global queue)
3/4) Next task in main queue executes
4/3) Network task in global queue executes
The order of which would happen first may vary between 3 and 4, but that's concurrency for you :)
So unless the old coworker wanted moreCode to execute first, before adding the network calls to a global queue, you can go ahead and remove that initial dispatch_async onto the main queue.
Since it looks like they wanted the network calls done ASAP, there is probably no reason to delay the addition of those networking tasks to a global queue.
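In other words, the nested dispatch can likely be collapsed to a single dispatch straight onto a global queue. Here is that replacement in modern Swift GCD syntax (the semaphore is added here only so the sketch's completion is observable; .userInitiated roughly corresponds to the old DISPATCH_QUEUE_PRIORITY_HIGH):

```swift
import Dispatch

let done = DispatchSemaphore(value: 0)

DispatchQueue.global(qos: .userInitiated).async {
    // several AFNetworking server calls would go here...
    done.signal()
}
```

This dispatches the work off the main thread immediately, which is all the original two-level structure was effectively achieving.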
Open to any input ^^. My experience involves reading all of the documentation on GCD today, then deciding to look at some GCD tagged questions
