Weak DispatchGroup in closures and other GCD questions

Swift closures strongly capture reference types.
DispatchGroup is a reference type.
My questions have to do with the following code:
func getUsername(onDone: @escaping (_ possUsername: String?) -> ())
{
    //Post request for username that calls onDone(retrievedUsername)...
}

func getBirthdate(using username: String?, onDone: @escaping (_ possBday: String?) -> ())
{
    //Post request for birthdate that calls onDone(retrievedBday)...
}

func asyncTasksInOrder(onDone: @escaping (_ resultBDay: String?) -> ())
{
    let thread = DispatchQueue(label: "my thread", qos: .userInteractive, attributes: [],
                               autoreleaseFrequency: .workItem, target: nil)
    thread.async { [weak self, onDone] in
        guard let self = self else {
            onDone(nil)
            return
        }

        let dg = DispatchGroup() //This is a reference type
        var retrievedUsername: String?
        var retrievedBday: String?

        //Get username async first
        dg.enter()
        self.getUsername(onDone: { [weak dg] (possUsername) in
            retrievedUsername = possUsername
            dg?.leave() //DG is weak here
        })
        dg.wait()

        //Now that we've waited for the username, get bday async now
        dg.enter()
        self.getBirthdate(using: retrievedUsername, onDone: { [weak dg] (possBday) in
            retrievedBday = possBday
            dg?.leave() //DG is also weak here
        })
        dg.wait()

        //We've waited for everything, so now call the return callback
        onDone(retrievedBday)
    }
}
So the two closures inside of asyncTasksInOrder(onDone:) each capture dg, my DispatchGroup.
1) Is it even necessary to capture my dispatch group in the capture list like this?
2) If I don't capture it weakly, how would I even know I've got a retain cycle?
3) What if the dispatch group evaporates during one of the callback executions? Would it even evaporate, since it's waiting?
4) Is it unnecessarily expensive to instantiate a DispatchQueue like this often (disregarding the .userInteractive)? I'm asking this particular question because spinning up threads in Android is extremely expensive (so expensive that JetBrains has dedicated a lot of resources to Kotlin coroutines).
5) How does dg.notify(...) play into all of this? Why even have a notify method when dg.wait() does the same thing?
I feel like my understanding of GCD is not bulletproof, so I'm asking to build some confidence. Please critique as well if there's anything to critique. The help is truly appreciated.

1) No, you don't need to capture the dispatch group weakly; it is captured (strongly) implicitly and released as soon as the closure finishes. You don't even need to capture self weakly in async, because GCD closures don't cause retain cycles: the queue releases the closure once it has run.
2) There is no retain cycle: self doesn't hold a reference to the dispatched closure, so nothing retains anything in a cycle.
3) The group can't go away while you're waiting on it; the enclosing closure holds a strong reference to it until it returns. That said, you are really misusing DispatchGroup here to force an asynchronous task to become synchronous.
4) No, GCD is pretty lightweight. Creating a DispatchQueue doesn't necessarily spin up a new thread; queues draw on a shared thread pool.
5) The main purpose of DispatchGroup is to be notified when a collection of asynchronous tasks – for example, ones launched in a loop – have all completed, regardless of order. Unlike wait(), notify(queue:execute:) doesn't block the current thread; it just schedules a closure once the group is balanced.
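To make the notify/wait difference concrete, here is a minimal hedged sketch (fetchItem(_:completion:) is a hypothetical async call, not from the question):

let group = DispatchGroup()

for id in 0..<3 {
    group.enter()
    fetchItem(id) { _ in            // hypothetical async call
        group.leave()
    }
}

// notify is non-blocking: the closure runs on the main queue once every enter() is balanced by a leave()
group.notify(queue: .main) {
    print("all items fetched")
}

// wait() would instead block the current thread until the group is balanced – never do that on the main thread.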
A better solution is to nest the asynchronous tasks. With only two tasks the pyramid of doom is manageable.
func asyncTasksInOrder(onDone: @escaping (String?) -> Void)
{
    let thread = DispatchQueue(label: "my thread", qos: .userInteractive, autoreleaseFrequency: .workItem)
    thread.async {
        //Get username async first
        self.getUsername { [weak self] possUsername in
            guard let self = self else { onDone(nil); return }
            //Now get bday async
            self.getBirthdate(using: possUsername) { possBday in
                //Now call the return callback
                onDone(possBday)
            }
        }
    }
}

Related

CompletionHandler Behavior on Main vs Background thread

Does the thread a method is called from matter when it comes to completion handlers? When running on the main thread, I have no problems: the (data) completion handler is called, and so the function's own completionHandler inside it is called as well; from a background thread, it isn't.
The following is my code:
static func getInfo(user: User, completion: (([String:String]) -> Void)? = nil) {
    print(user.apiUid)
    var enrolledCourses: [String:String] = [:]
    NetworkingManager.staticGeneralRequest(withEndpoint: someURL, oauthToken: user.oauth_token, oauthTokenSecret: user.oauth_token_secret) { (data) in
        let decodedData: [String:String] = data.getData()
        completion?(decodedData)
    }
}
//NOTE: According to the compiler,
//the use of an optional completionHandler means that it's `@escaping` by default.
Is this an issue pertaining to threading, and what are some best practices to ensure that code works well across threads?
Your closure was deallocated before you got the response. Mark the completion handler @escaping to keep it in scope, e.g. static func getInfo(user: User, completion: @escaping ([String:String]) -> Void) (an optional closure parameter is already implicitly escaping, so you'd drop the optional and the default value to add the annotation).
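Separate from the escaping question, if the completion handler touches UI, a common practice is to hop back to the main queue before calling it. A hedged sketch reusing the question's NetworkingManager call:

NetworkingManager.staticGeneralRequest(withEndpoint: someURL,
                                       oauthToken: user.oauth_token,
                                       oauthTokenSecret: user.oauth_token_secret) { data in
    let decodedData: [String:String] = data.getData()
    DispatchQueue.main.async {
        completion?(decodedData)   // deliver the result on the main thread for any UI work
    }
}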

Completion handlers and Operation queues

I am trying the following approach:
let operationQueue = OperationQueue()
operationQueue.maxConcurrentOperationCount = 10

func registerUser(completionHandler: @escaping (Result<Data, Error>) -> Void) -> String {
    self.registerClient() { (result) in
        switch result {
        case .success(let data):
            self.downloadUserProfile(data.profiles)
        case .failure(let error):
            return self.handleError(error)
        }
    }
}

func downloadUserProfile(urls: [String]) {
    for url in urls {
        queue.addOperation {
            self.client.downloadTask(with: url)
        }
    }
}
Is there any way I can get notified when all the operations have completed, so that I can call the success handler there?
I checked the Apple developer documentation, which suggests using
queue.addBarrierBlock {
<#code#>
}
but this is available only from iOS 13.0
Pre iOS 13, we’d use dependencies. Declare a completion operation, and then when you create operations for your network requests, you’d define those operations to be dependencies for your completion operation.
let completionOperation = BlockOperation { ... }
let networkOperation1 = ...
completionOperation.addDependency(networkOperation1)
queue.addOperation(networkOperation1)
let networkOperation2 = ...
completionOperation.addDependency(networkOperation2)
queue.addOperation(networkOperation2)
OperationQueue.main.addOperation(completionOperation)
That having been said, you should be very careful with your operation implementation. Do I correctly infer that downloadTask(with:) returns immediately after the download task has been initiated and doesn’t wait for the request to finish? In that case, neither dependencies nor barriers will work the way you want.
When wrapping network requests in an operation, you’d want to make sure to use an asynchronous Operation subclass (e.g. https://stackoverflow.com/a/32322851/1271826).
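The linked answer's base class isn't reproduced here, but here is a hedged sketch of what such a wrapper might look like, assuming an AsynchronousOperation base class like the one in that link and assuming the question's client offers a completion-based downloadTask(with:completion:) (both are assumptions, not confirmed by the question):

class DownloadOperation: AsynchronousOperation {
    private let client: NetworkClient   // hypothetical client type standing in for the question's self.client
    private let url: String

    init(client: NetworkClient, url: String) {
        self.client = client
        self.url = url
        super.init()
    }

    override func main() {
        // Kick off the asynchronous download; the operation stays "executing" until finish() is called.
        client.downloadTask(with: url) { _ in
            self.finish()   // finish() is assumed to be provided by the AsynchronousOperation base class
        }
    }
}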
The pre-iOS 13 way is to observe the operationCount property of the operation queue
var observation: NSKeyValueObservation?

...

observation = operationQueue.observe(\.operationCount, options: [.new]) { observed, change in
    if change.newValue == 0 {
        print("operations finished")
    }
}

How to suspend dispatch queue inside for loop?

I have a play/pause button. When I press play, I want to speak the text blocks asynchronously inside a for loop. I used a dispatch group to wait for each async call inside the loop, but I cannot get pause to work.
startStopButton.rx.tap.bind {
    if self.isPaused {
        self.isPaused = false
        dispatchGroup.suspend()
        dispatchQueue.suspend()
    } else {
        self.isPaused = true
        self.dispatchQueue.async {
            for i in 0..<self.textBlocks.count {
                self.dispatchGroup.enter()
                self.startTalking(string: self.textBlocks[i]) { isFinished in
                    self.dispatchGroup.leave()
                }
                self.dispatchGroup.wait()
            }
        }
    }
}.disposed(by: disposeBag)
I also tried it with an OperationQueue, but it still doesn't work; it just keeps talking.
startStopButton.rx.tap.bind {
    if self.isPaused {
        self.isPaused = false
        self.talkingQueue.isSuspended = true
        self.talkingQueue.cancelAllOperations()
    } else {
        self.isPaused = true
        self.talkingQueue.addOperation {
            for i in 0..<self.textBlocks.count {
                self.dispatchGroup.enter()
                self.startTalking(string: self.textBlocks[i]) { isFinished in
                    self.dispatchGroup.leave()
                }
                self.dispatchGroup.wait()
            }
        }
    }
}.disposed(by: disposeBag)
Is there any advice?
A few observations:
Pausing a group doesn’t do anything. You suspend queues, not groups.
Suspending a queue stops new items from starting on that queue, but it does not suspend anything already running on that queue. So, if you’ve added all the textBlock calls in a single dispatched item of work, then once it’s started, it won’t suspend.
So, rather than dispatching all of these text blocks to the queue as a single task, submit them individually (presuming, of course, that your queue is serial). For example, let’s say you had a DispatchQueue:
let queue = DispatchQueue(label: "...")
And then, to queue the tasks, put the async call inside the for loop, so each text block is a separate item in your queue:
for textBlock in textBlocks {
    queue.async { [weak self] in
        guard let self = self else { return }
        let semaphore = DispatchSemaphore(value: 0)
        self.startTalking(string: textBlock) {
            semaphore.signal()
        }
        semaphore.wait()
    }
}
FYI, while dispatch groups work, a semaphore (great for coordinating a single signal with a wait) might be a more logical choice here, rather than a group (which is intended for coordinating groups of dispatched tasks).
Anyway, when you suspend that queue, it will be prevented from starting anything that is still queued (but it will finish the textBlock that is currently being spoken).
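For the play/pause button itself, a hedged sketch of how this could be wired up with that serial queue (isPaused and queue are assumed to be properties on your class, mirroring the question's setup):

startStopButton.rx.tap.bind { [weak self] in
    guard let self = self else { return }
    if self.isPaused {
        self.queue.resume()    // start pulling queued text blocks again
    } else {
        self.queue.suspend()   // nothing new starts; the block currently speaking still finishes
    }
    self.isPaused.toggle()
}.disposed(by: disposeBag)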
Or you can use an asynchronous Operation, e.g., create your queue:
let queue: OperationQueue = {
    let queue = OperationQueue()
    queue.name = "..."
    queue.maxConcurrentOperationCount = 1
    return queue
}()
Then, again, you queue up each spoken word, each respectively a separate operation on that queue:
for textBlock in textBlocks {
    queue.addOperation(TalkingOperation(string: textBlock))
}
That of course assumes you encapsulated your talking routine in an operation, e.g.:
class TalkingOperation: AsynchronousOperation {
    let string: String

    init(string: String) {
        self.string = string
    }

    override func main() {
        startTalking(string: string) {
            self.finish()
        }
    }

    func startTalking(string: String, completion: @escaping () -> Void) { ... }
}
I prefer this approach because
we’re not blocking any threads;
the logic for talking is nicely encapsulated in that TalkingOperation, in the spirit of the single responsibility principle; and
you can easily suspend the queue or cancel all the operations.
By the way, this is a subclass of an AsynchronousOperation, which abstracts the complexity of asynchronous operation out of the TalkingOperation class. There are many ways to do this, but here’s one random implementation. FWIW, the idea is that you define an AsynchronousOperation subclass that does all the KVO necessary for asynchronous operations outlined in the documentation, and then you can enjoy the benefits of operation queues without making each of your asynchronous operation subclasses too complicated.
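The linked implementation isn’t reproduced above, so here is a minimal hedged sketch of the kind of base class being described (one of many possible variants; production versions typically add more safeguards):

import Foundation

class AsynchronousOperation: Operation {
    private let lock = NSLock()
    private var _executing = false
    private var _finished = false

    override var isAsynchronous: Bool { return true }

    override var isExecuting: Bool {
        lock.lock(); defer { lock.unlock() }
        return _executing
    }

    override var isFinished: Bool {
        lock.lock(); defer { lock.unlock() }
        return _finished
    }

    override func start() {
        // Asynchronous operations must not call super.start()
        if isCancelled {
            finish()
            return
        }
        willChangeValue(forKey: "isExecuting")
        lock.lock(); _executing = true; lock.unlock()
        didChangeValue(forKey: "isExecuting")
        main()
    }

    /// Subclasses call this when their asynchronous work completes.
    func finish() {
        willChangeValue(forKey: "isExecuting")
        willChangeValue(forKey: "isFinished")
        lock.lock()
        _executing = false
        _finished = true
        lock.unlock()
        didChangeValue(forKey: "isExecuting")
        didChangeValue(forKey: "isFinished")
    }
}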
For what it’s worth, if you don’t need suspend, but would be happy just canceling, the other approach is to dispatch the whole for loop as a single work item or operation, but check whether the item has been canceled inside the for loop:
So, define a few properties:
let queue = DispatchQueue(label: "...")
var item: DispatchWorkItem?
Then you can start the task:
item = DispatchWorkItem { [weak self] in
    guard let textBlocks = self?.textBlocks else { return }
    for textBlock in textBlocks where self?.item?.isCancelled == false {
        let semaphore = DispatchSemaphore(value: 0)
        self?.startTalking(string: textBlock) {
            semaphore.signal()
        }
        semaphore.wait()
    }
    self?.item = nil
}
queue.async(execute: item!)
And then, when you want to stop it, just call item?.cancel(). You can do this same pattern with a non-asynchronous Operation, too.
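A hedged sketch of that same cancellation pattern with a plain (synchronous) Operation, checking isCancelled between text blocks (the Talker protocol is a stand-in for whatever object exposes startTalking):

import Foundation

protocol Talker {
    func startTalking(string: String, completion: @escaping () -> Void)
}

class TalkingLoopOperation: Operation {
    private let textBlocks: [String]
    private let talker: Talker

    init(textBlocks: [String], talker: Talker) {
        self.textBlocks = textBlocks
        self.talker = talker
        super.init()
    }

    override func main() {
        for textBlock in textBlocks where !isCancelled {
            let semaphore = DispatchSemaphore(value: 0)
            talker.startTalking(string: textBlock) {
                semaphore.signal()
            }
            semaphore.wait()   // block this operation’s thread until the current text block finishes
        }
    }
}

// Usage: add it to a serial OperationQueue and call cancelAllOperations() to stop after the current block.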

How to achieve DispatchGroup functionality using OperationQueue? [duplicate]

I have an Operation subclass and an OperationQueue with maxConcurrentOperationCount = 1.
This performs my operations sequentially, in the order I add them, which is good, but now I need to wait until all operations have finished before running another process.
I was trying to use a dispatch group's notify, but as this runs in a for loop, the notify fires as soon as the operations have been added to the queue. How do I wait for all operations to leave the queue before running another process?
for (index, _) in self.packArray.enumerated() {
    myGroup.enter()
    let myArrayOperation = ArrayOperation(collection: self.outerCollectionView, id: self.packArray[index].id, count: index)
    myArrayOperation.name = self.packArray[index].id
    downloadQueue.addOperation(myArrayOperation)
    myGroup.leave()
}
myGroup.notify(queue: .main) {
    // do stuff here
}
You can use operation dependencies to initiate some operation upon the completion of a series of other operations:
let queue = OperationQueue()

let completionOperation = BlockOperation {
    // all done
}

for object in objects {
    let operation = ...
    completionOperation.addDependency(operation)
    queue.addOperation(operation)
}

OperationQueue.main.addOperation(completionOperation) // or, if you don't need it on the main queue, just `queue.addOperation(completionOperation)`
Or, in iOS 13 and later, you can use barriers:
let queue = OperationQueue()

for object in objects {
    queue.addOperation(...)
}

queue.addBarrierBlock {
    DispatchQueue.main.async {
        // all done
    }
}
A suitable solution is KVO
First, before the loop, add the observer (assuming queue is the OperationQueue instance):
queue.addObserver(self, forKeyPath: "operations", options: .new, context: nil)
Then implement
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if object as? OperationQueue == queue && keyPath == "operations" {
        if queue.operations.isEmpty {
            // Do something here when your queue has completed
            self.queue.removeObserver(self, forKeyPath: "operations")
        }
    } else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}
Edit:
In Swift 4 it's much easier
Declare a property:
var observation : NSKeyValueObservation?
and create the observer
observation = queue.observe(\.operationCount, options: [.new]) { [unowned self] (queue, change) in
    if change.newValue! == 0 {
        // Do something here when your queue has completed
        self.observation = nil
    }
}
Since iOS 13 and macOS 10.15, operationCount is deprecated. The replacement is to observe progress.completedUnitCount.
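If you're not using Combine, a hedged sketch of the same idea with a plain KVO observation (this assumes you set the queue's progress.totalUnitCount yourself before enqueueing, since OperationQueue doesn't know the count in advance; numberOfOperations is a hypothetical value):

queue.progress.totalUnitCount = Int64(numberOfOperations)   // set by you before adding the operations

observation = queue.observe(\.progress.completedUnitCount, options: [.new]) { [weak self] queue, change in
    if change.newValue == queue.progress.totalUnitCount {
        print("queue finished")
        self?.observation = nil
    }
}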
Another modern way is to use the KVO publisher of Combine
var cancellable: AnyCancellable?

cancellable = queue.publisher(for: \.progress.completedUnitCount)
    .filter { $0 == queue.progress.totalUnitCount }
    .sink { _ in
        print("queue finished")
        self.cancellable = nil
    }
I use the following solution:
private let queue = OperationQueue()

private func addOperations(_ operations: [Operation], completionHandler: @escaping () -> ()) {
    DispatchQueue.global().async { [unowned self] in
        self.queue.addOperations(operations, waitUntilFinished: true)
        DispatchQueue.main.async(execute: completionHandler)
    }
}
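A quick hedged usage sketch of that helper (the BlockOperation bodies are placeholders):

let op1 = BlockOperation { /* first task */ }
let op2 = BlockOperation { /* second task */ }

addOperations([op1, op2]) {
    print("both operations finished")   // called on the main queue
}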
Set the maximum number of concurrent operations to 1
operationQueue.maxConcurrentOperationCount = 1
then each operation will be executed in order (as if each was dependent on the previous one) and your completion operation will execute at the end.
Code at the end of the queue
refer to this link
NSOperation and NSOperationQueue are great and useful Foundation framework tools for asynchronous tasks. One thing puzzled me though: How can I run code after all my queue operations finish? The simple answer is: use dependencies between operations in the queue (unique feature of NSOperation). It's just 5 lines of code solution.
NSOperation dependency trick
With Swift it is easy to implement, like this:
extension Array where Element: NSOperation {
    /// Execute block after all operations from the array.
    func onFinish(block: () -> Void) {
        let doneOperation = NSBlockOperation(block: block)
        self.forEach { [unowned doneOperation] in doneOperation.addDependency($0) }
        NSOperationQueue().addOperation(doneOperation)
    }
}
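A hedged usage sketch, reusing the question's ArrayOperation and downloadQueue; note that onFinish must be called before the operations are added to the queue so the dependencies are in place:

let operations = packArray.enumerated().map { index, pack in
    ArrayOperation(collection: outerCollectionView, id: pack.id, count: index)
}

operations.onFinish {
    // runs only after every operation in the array has finished
}

operations.forEach { downloadQueue.addOperation($0) }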
My solution is similar to that of https://stackoverflow.com/a/42496559/452115, but I don't add the completionOperation to the main OperationQueue; I add it to the queue itself. This works for me:
var a = [Int](repeating: 0, count: 10)

let queue = OperationQueue()
let completionOperation = BlockOperation {
    print(a)
}

queue.maxConcurrentOperationCount = 2

for i in 0...9 {
    let operation = BlockOperation {
        a[i] = 1
    }
    completionOperation.addDependency(operation)
    queue.addOperation(operation)
}

queue.addOperation(completionOperation)

print("Done 🎉")

How can we implement concurrency thread using protocol in swift? [closed]

I was asked this question in an interview for an iOS developer role.
// Please design a read-write task queue where you can tag the reader task with a label,
// where tasks with the same label should be executed sequentially, and
// tasks with different labels could be executed concurrently. However, the writer
// gets exclusive access: no concurrent task should run at the
// same time as the writer task.
// For example:

protocol ConcurrentQueueWithSerialization {
    // Submits a labeled task.
    // All tasks with the same label will be serialized.
    // Tasks with different labels will run concurrently.
    // Use this method to submit a "read" operation from a particular reader.
    func async(with label: String, task: @escaping () -> Void)

    // Submits a task that will run concurrently with all other tasks regardless of their labels.
    func async(task: @escaping () -> Void)

    // Submits a labeled and delayed task.
    func asyncAfter(deadline: DispatchTime, with label: String, task: @escaping () -> Void)

    // Submits an unlabeled and delayed task.
    func asyncAfter(deadline: DispatchTime, task: @escaping () -> Void)

    // Submits a barrier task. Only one barrier task is allowed to run at a time.
    // Works as a critical section for the queue.
    // Use this method to submit a writer task.
    func asyncBarrier(task: @escaping () -> Void)
}

class MyDispatchQueue: ConcurrentQueueWithSerialization {
    //TODO: write your implementation
}
The interviewer asked me to implement the above protocol in the MyDispatchQueue class. I tried but could not come up with a solution. Please help; the help is truly appreciated.
Previously I suggested using target queues, but even better, create a primary concurrent queue, and then create serial queues for the named queues, and then dispatch everything through that primary concurrent queue. Unlike the target queue approach, this will honor the scheduling of tasks dispatched to the named queues with those dispatched to the unnamed queue.
With that implementation, here's an example (an Instruments "Points of Interest" profile) where I added tasks to queues named "fred" and "ginger", plus one added to an unnamed queue; I then added a barrier task, and then two more tasks to each of the aforementioned queues.
As you can see, it respects the serial nature of the named queues, the unnamed queue is concurrent, and all these queues are concurrent with respect to each other, but the barrier is a barrier across all the queues.
class MyDispatchQueue: ConcurrentQueueWithSerialization {
    private var namedQueues = [String: DispatchQueue]()
    private var queue = DispatchQueue(label: Bundle.main.bundleIdentifier! + ".target", attributes: .concurrent)
    private let lock = NSLock()

    private func queue(with label: String) -> DispatchQueue {
        lock.lock()
        defer { lock.unlock() }
        if let queue = namedQueues[label] { return queue }
        let queue = DispatchQueue(label: Bundle.main.bundleIdentifier! + "." + label)
        namedQueues[label] = queue
        return queue
    }

    func async(with label: String, task: @escaping () -> Void) {
        queue.async {
            self.queue(with: label).sync(execute: task)
        }
    }

    func async(task: @escaping () -> Void) {
        queue.async(execute: task)
    }

    func asyncAfter(deadline: DispatchTime, with label: String, task: @escaping () -> Void) {
        queue.asyncAfter(deadline: deadline) {
            self.queue(with: label).sync(execute: task)
        }
    }

    func asyncAfter(deadline: DispatchTime, task: @escaping () -> Void) {
        queue.asyncAfter(deadline: deadline, execute: task)
    }

    func asyncBarrier(task: @escaping () -> Void) {
        queue.async(flags: .barrier, execute: task)
    }
}
Note, I also synchronize access to the namedQueues dictionary, to ensure the thread safety of this class.
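A hedged usage sketch mirroring the scenario described above (the task bodies are placeholders):

let myQueue = MyDispatchQueue()

myQueue.async(with: "fred")   { /* read task, serialized with other "fred" tasks */ }
myQueue.async(with: "ginger") { /* read task, concurrent with the "fred" tasks */ }
myQueue.async                 { /* unlabeled task, concurrent with everything else */ }
myQueue.asyncBarrier          { /* write task, nothing else runs while this runs */ }
myQueue.async(with: "fred")   { /* starts after the barrier, serialized after earlier "fred" work */ }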
