Completion handlers and Operation queues - iOS

I am trying the following approach:
let operationQueue = OperationQueue()
operationQueue.maxConcurrentOperationCount = 10

func registerUser(completionHandler: @escaping (Result<Data, Error>) -> Void) {
    self.registerClient() { (result) in
        switch result {
        case .success(let data):
            self.downloadUserProfile(urls: data.profiles)
        case .failure(let error):
            return self.handleError(error)
        }
    }
}

func downloadUserProfile(urls: [String]) {
    for url in urls {
        operationQueue.addOperation {
            self.client.downloadTask(with: url)
        }
    }
}
I'd like to know if there is any way I can get notified when all operations have completed, so that I can call the success handler at that point.
I checked the Apple developer documentation, which suggests using
queue.addBarrierBlock {
    <#code#>
}
but this is only available from iOS 13.0.

Pre-iOS 13, we’d use dependencies. Declare a completion operation, and then when you create operations for your network requests, define those operations to be dependencies of your completion operation.
let completionOperation = BlockOperation { ... }
let networkOperation1 = ...
completionOperation.addDependency(networkOperation1)
queue.addOperation(networkOperation1)
let networkOperation2 = ...
completionOperation.addDependency(networkOperation2)
queue.addOperation(networkOperation2)
OperationQueue.main.addOperation(completionOperation)
That having been said, you should be very careful with your operation implementation. Do I correctly infer that downloadTask(with:) returns immediately after the download task has been initiated and doesn’t wait for the request to finish? In that case, neither dependencies nor barriers will work the way you want.
When wrapping network requests in an operation, you’d want to make sure to use an asynchronous Operation subclass (e.g. https://stackoverflow.com/a/32322851/1271826).
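For reference, a minimal sketch of such an asynchronous Operation subclass might look like this (the class name and the simplified state handling are illustrative assumptions, not code from the question; a production version, like the one in the linked answer, would also synchronize the state changes):

class AsynchronousOperation: Operation {
    override var isAsynchronous: Bool { return true }

    private var _executing = false
    private var _finished = false

    override var isExecuting: Bool { return _executing }
    override var isFinished: Bool { return _finished }

    override func start() {
        guard !isCancelled else { finish(); return }
        willChangeValue(forKey: "isExecuting")
        _executing = true
        didChangeValue(forKey: "isExecuting")
        main()
    }

    override func main() {
        // Subclasses kick off their asynchronous work here (e.g. start a URLSession
        // download task) and must call finish() from that work's completion handler.
    }

    // Call this when the asynchronous work completes, so the queue (and any
    // dependencies or barriers) sees the operation as finished.
    func finish() {
        willChangeValue(forKey: "isExecuting")
        willChangeValue(forKey: "isFinished")
        _executing = false
        _finished = true
        didChangeValue(forKey: "isExecuting")
        didChangeValue(forKey: "isFinished")
    }
}

With such a base class, the queue keeps an operation “alive” until finish() is called, so dependencies and barriers fire only after the underlying network request has actually completed.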

The pre-iOS 13 way is to observe the operationCount property of the operation queue:
var observation: NSKeyValueObservation?

...

observation = operationQueue.observe(\.operationCount, options: [.new]) { observed, change in
    if change.newValue == 0 {
        print("operations finished")
    }
}

Related

Creating a Combine publisher like RxSwift's Observable.create for an Alamofire request

I use the following piece of code to generate a cold RxSwift Observable:
func doRequest<T: Mappable>(request: URLRequestConvertible) -> Observable<T> {
    let observable = Observable<T>.create { [weak self] observer in
        guard let self = self else { return Disposables.create() }
        self.session.request(request).validate().responseObject { (response: AFDataResponse<T>) in
            switch response.result {
            case .success(let obj):
                observer.onNext(obj)
                observer.onCompleted()
            case .failure(let error):
                let theError = error as Error
                observer.onError(theError)
            }
        }
        return Disposables.create()
    }
    return observable
}
where Mappable is an ObjectMapper-based type, and self.session is an Alamofire Session object.
I can't find an equivalent to Observable.create {...} in Apple's Combine framework. The only thing I found is URLSession.shared.dataTaskPublisher(for:), which creates a publisher using Apple's URLSession class.
How can I convert the above observable into an Alamofire-backed Combine publisher?
EDIT:
Using the solution provided by Rob, I ended up with the following:
private let apiQueue = DispatchQueue(label: "API", qos: .default, attributes: .concurrent)

func doRequest<T>(request: URLRequestConvertible) -> AnyPublisher<T, AFError> where T: Mappable {
    Deferred { [weak self] () -> Future<T, AFError> in
        guard let self = self else {
            return Future<T, AFError> { promise in
                promise(.failure(.explicitlyCancelled))
            }
        }
        return Future { promise in
            self.session
                .request(request)
                .validate()
                .responseObject { (response: AFDataResponse<T>) in
                    promise(response.result)
                }
        }
    }
    .handleEvents(receiveCompletion: { completion in
        if case .failure(let error) = completion {
            // handle the error
        }
    })
    .receive(on: self.apiQueue)
    .eraseToAnyPublisher()
}
EDIT 2: I had to remove the private queue since it's not needed; Alamofire does the parsing and decoding on its own, so remove the queue and its usages (.receive(on: self.apiQueue)).
You can use Future to connect responseObject's callback to a Combine Publisher. I don't have Alamofire handy for testing, but I think the following should work:
func doRequest<T: Mappable>(request: URLRequestConvertible) -> AnyPublisher<T, AFError> {
    return Future { promise in
        self.session
            .request(request)
            .validate()
            .responseObject { (response: AFDataResponse<T>) in
                promise(response.result)
            }
    }.eraseToAnyPublisher()
}
Note that this is somewhat simpler than the RxSwift version because promise takes a Result directly, so we don't have to switch over response.result.
A Future is sort of a “lukewarm” publisher. It is like a hot observable because it executes its body immediately and only once, so it starts the Alamofire request immediately. It is also like a cold observable, because every subscriber eventually receives a value or an error (assuming you eventually call promise). The Future only executes its body once, but it caches the Result you pass to promise.
You can create a truly cold publisher by wrapping the Future in a Deferred:
func doRequest<T: Mappable>(request: URLRequestConvertible) -> AnyPublisher<T, AFError> {
    return Deferred {
        Future { promise in
            self.session
                .request(request)
                .validate()
                .responseObject { (response: AFDataResponse<T>) in
                    promise(response.result)
                }
        }
    }.eraseToAnyPublisher()
}
Deferred calls its body to create a new inner Publisher every time you subscribe to it. So each time you subscribe, you'll create a new Future that will immediately start a new Alamofire request. This is useful if you want to use the retry operator, as in this question.
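For example, a subscriber that retries twice will trigger up to three separate Alamofire requests, because each re-subscription goes through Deferred again. (The profileRequest and UserProfile names below are illustrative placeholders, not from the question.)

var cancellables = Set<AnyCancellable>()

doRequest(request: profileRequest)       // hypothetical URLRequestConvertible
    .retry(2)                            // on failure, re-subscribes, creating a fresh Future and a fresh request
    .sink(receiveCompletion: { completion in
        print("finished:", completion)
    }, receiveValue: { (profile: UserProfile) in   // hypothetical Mappable type; fixes T
        print("received:", profile)
    })
    .store(in: &cancellables)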

Downloading Images in order of url's - iOS Swift

I have 10 URLs in an array, and when 4 of them have downloaded I need to display them. I'm using semaphores and groups to implement this, but it looks like I'm hitting a deadlock. Not sure how to proceed. Please advise.
Simulating the same in a playground:
PlaygroundPage.current.needsIndefiniteExecution = true

let group = DispatchGroup()
let queue = DispatchQueue.global(qos: .userInteractive)
let semaphore = DispatchSemaphore(value: 4)
var nums: [Int] = []

for i in 1...10 {
    group.enter()
    semaphore.wait()
    queue.async(group: group) {
        print("Downloading image \(i)")
        // Simulate a network wait
        Thread.sleep(forTimeInterval: 3)
        nums.append(i)
        print("Hola image \(i)")
        if nums.count == 4 {
            print("4 downloaded")
            semaphore.signal()
            group.leave()
        }
    }
    if nums.count == 4 {
        break
    }
}

group.notify(queue: DispatchQueue.main) {
    print(nums)
}
I get this in the output console:
> Downloading image 1
> Downloading image 2
> Downloading image 3
> Downloading image 4
Semaphores(41269,0x70000ade5000) malloc: *** error for object 0x1077d4750: pointer being freed was not allocated
Semaphores(41269,0x70000ade5000) malloc: *** set a breakpoint in malloc_error_break to debug
I'm expecting it to print [1,2,3,4] in order.
I know I'm accessing a shared resource from asynchronous code, but I'm not sure how I can fix this. Please advise.
Also, how can I use this with semaphores if I want to download 4, 4, and 2 tasks at a time, so that it displays [1,2,3,4,5,6,7,8,9,10] in my output?
Your title says “Downloading Images in order of url’s”, but your code snippet is not attempting to do that. It appears to be attempting to use semaphores to constrain the download to four images at a time, but it won’t guarantee that they’ll be in order.
It is commendable that this code snippet isn’t attempting to download them in order, sequentially, one after another, because that would impose a huge performance penalty. It is also good that this code snippet is constraining the degree of concurrency to something reasonable, thereby avoiding exhausting worker threads or causing some of the later requests to time out. So, the idea of using a semaphore to allow concurrent image downloads, but constrain them to four at a time, is a fine approach; we only need to sort the results at the end if you want them in order.
But before we get to that, let’s tackle a bunch of problems in the supplied code snippet:
You are calling group.enter() and semaphore.wait() for every iteration (which is correct), but calling group.leave() and semaphore.signal() only when nums.count is 4 (which is not correct). You want to leave and signal on every iteration.
Obviously, that break call is not needed, either.
So, to fix this “do four at a time” process, one can simplify this code:
let group = DispatchGroup()
let queue = DispatchQueue.global(qos: .userInteractive)
let semaphore = DispatchSemaphore(value: 4)
var nums: [Int] = []

for i in 1...10 {
    group.enter()
    semaphore.wait()
    queue.async {                       // NB: the `group` parameter is not needed
        print("Downloading image \(i)")
        // Simulate a network wait
        Thread.sleep(forTimeInterval: 3)
        nums.append(i)
        print("Hola image \(i)")
        semaphore.signal()
        group.leave()
    }
}

group.notify(queue: .main) {
    print(nums)
}
That will download four images at a time and will call your group.notify closure when they’re all done.
While the above fixes the semaphore and group logic, there is yet another problem lurking in the above code snippet. It is updating that nums array from multiple background threads, but Array is not thread-safe. So you should synchronize those updates to that array. An easy way to achieve this is to dispatch that update back to the main thread. (Any serial queue would have been fine, but the main thread works fine for this purpose.)
Also, since one should never call wait on the main queue, I’d suggest that you explicitly dispatch this entire for loop to a background thread:
DispatchQueue.global(qos: .utility).async {
    let group = DispatchGroup()
    let queue = DispatchQueue.global(qos: .userInteractive)
    let semaphore = DispatchSemaphore(value: 4)
    var nums: [Int] = []

    for i in 1...10 {
        group.enter()
        semaphore.wait()
        queue.async {
            print("Downloading image \(i)")
            // Simulate a network wait
            Thread.sleep(forTimeInterval: 3)
            DispatchQueue.main.async {
                nums.append(i)
                print("Hola image \(i)")
            }
            semaphore.signal()
            group.leave()
        }
    }

    group.notify(queue: .main) {
        print(nums)
    }
}
That is now the correct “do four at a time and let me know when it’s done.”
OK, now that we’re downloading all of the images properly, let’s figure out how to sort the results. Frankly, I think it’s easier to follow what’s going on if we imagine that we have some image download method, like so, that downloads a particular image:
func download(_ url: URL, completion: @escaping (Result<UIImage, Error>) -> Void) { ... }
Then the routine to (a) download the images, no more than four at a time; and (b) return the results back in order, might look like:
func downloadAllImages(_ urls: [URL], completion: @escaping ([UIImage]) -> Void) {
    DispatchQueue.global(qos: .utility).async {
        let group = DispatchGroup()
        let semaphore = DispatchSemaphore(value: 4)
        var imageDictionary: [URL: UIImage] = [:]

        // download the images
        for url in urls {
            group.enter()
            semaphore.wait()
            self.download(url) { result in
                defer {
                    semaphore.signal()
                    group.leave()
                }
                switch result {
                case .failure(let error):
                    print(error)
                case .success(let image):
                    DispatchQueue.main.async {
                        imageDictionary[url] = image
                    }
                }
            }
        }

        // now sort the results
        group.notify(queue: .main) {
            completion(urls.compactMap { imageDictionary[$0] })
        }
    }
}
And you’d call it like so:
downloadAllImages(urls) { images in
    self.images = images
    self.updateUI() // do whatever you want to trigger the update of the UI
}
FWIW, the “download single image” routine might look like:
enum DownloadError: Error {
    case notImage
    case invalidStatusCode(URLResponse)
}

func download(_ url: URL, completion: @escaping (Result<UIImage, Error>) -> Void) {
    URLSession.shared.dataTask(with: url) { data, response, error in
        guard let data = data, let response = response as? HTTPURLResponse, error == nil else {
            completion(.failure(error!))
            return
        }
        guard 200..<300 ~= response.statusCode else {
            completion(.failure(DownloadError.invalidStatusCode(response)))
            return
        }
        guard let image = UIImage(data: data) else {
            completion(.failure(DownloadError.notImage))
            return
        }
        completion(.success(image))
    }.resume() // don't forget to resume the task
}
And this is using the Swift 5 Result enumeration. If you’re using an earlier version of Swift, you can define a simple rendition of this enum yourself:
enum Result<Success, Failure> {
    case success(Success)
    case failure(Failure)
}
Finally, it’s worth noting a few other alternatives:
Wrap your network requests in an asynchronous Operation subclass and add them to an operation queue whose maxConcurrentOperationCount is set to 4. If you’re interested in this approach, I can supply some references (a rough sketch follows this list).
Use an image downloading library like Kingfisher.
Instead of manually downloading all of the images, use a UIImageView extension (such as the one provided by Kingfisher), abandon the “download all images” process entirely, and move to a pattern where you simply instruct your image views to retrieve the images asynchronously, either just-in-time or with prefetching.
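For the first bullet, a rough sketch of that operation-queue alternative might look like this. It assumes an asynchronous Operation base class such as the AsynchronousOperation sketched earlier in this document (or the one from the linked Stack Overflow answer); ImageDownloadOperation is an illustrative name, and the function is a drop-in alternative to the downloadAllImages above:

class ImageDownloadOperation: AsynchronousOperation {
    let url: URL
    private(set) var result: Result<UIImage, Error>?

    init(url: URL) {
        self.url = url
        super.init()
    }

    override func main() {
        URLSession.shared.dataTask(with: url) { data, _, error in
            defer { self.finish() }                  // marks the operation as finished
            if let data = data, let image = UIImage(data: data) {
                self.result = .success(image)
            } else {
                self.result = .failure(error ?? DownloadError.notImage)
            }
        }.resume()
    }
}

func downloadAllImages(_ urls: [URL], completion: @escaping ([UIImage]) -> Void) {
    let queue = OperationQueue()
    queue.maxConcurrentOperationCount = 4            // no more than four at a time

    let operations = urls.map { ImageDownloadOperation(url: $0) }
    let completionOperation = BlockOperation {
        // `operations` is in URL order, so the images come back in order
        completion(operations.compactMap { try? $0.result?.get() })
    }
    operations.forEach { completionOperation.addDependency($0) }

    queue.addOperations(operations, waitUntilFinished: false)
    OperationQueue.main.addOperation(completionOperation)
}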

How to achieve DispatchGroup functionality using OperationQueue? [duplicate]

I have an Operation subclass and an OperationQueue with maxConcurrentOperationCount = 1.
This performs my operations in the sequential order that I add them, which is good, but now I need to wait until all operations have finished before running another process.
I was trying to use a notification group, but since this runs in a for loop, the notification group fires as soon as the operations have been added to the queue. How do I wait for all operations to leave the queue before running another process?
for (index, _) in self.packArray.enumerated() {
    myGroup.enter()
    let myArrayOperation = ArrayOperation(collection: self.outerCollectionView, id: self.packArray[index].id, count: index)
    myArrayOperation.name = self.packArray[index].id
    downloadQueue.addOperation(myArrayOperation)
    myGroup.leave()
}
myGroup.notify(queue: .main) {
    // do stuff here
}
You can use operation dependencies to initiate some operation upon the completion of a series of other operations:
let queue = OperationQueue()

let completionOperation = BlockOperation {
    // all done
}

for object in objects {
    let operation = ...
    completionOperation.addDependency(operation)
    queue.addOperation(operation)
}

OperationQueue.main.addOperation(completionOperation) // or, if you don't need it on main queue, just `queue.addOperation(completionOperation)`
Or, in iOS 13 and later, you can use barriers:
let queue = OperationQueue()

for object in objects {
    queue.addOperation(...)
}

queue.addBarrierBlock {
    DispatchQueue.main.async {
        // all done
    }
}
A suitable solution is KVO
First, before the loop, add the observer (assuming queue is the OperationQueue instance):
queue.addObserver(self, forKeyPath: "operations", options: .new, context: nil)
Then implement
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    if object as? OperationQueue == queue && keyPath == "operations" {
        if queue.operations.isEmpty {
            // Do something here when your queue has completed
            self.queue.removeObserver(self, forKeyPath: "operations")
        }
    } else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}
Edit:
In Swift 4 it's much easier
Declare a property:
var observation: NSKeyValueObservation?
and create the observer
observation = queue.observe(\.operationCount, options: [.new]) { [unowned self] (queue, change) in
    if change.newValue! == 0 {
        // Do something here when your queue has completed
        self.observation = nil
    }
}
Since iOS 13 and macOS 10.15, operationCount is deprecated. The replacement is to observe progress.completedUnitCount.
Another modern way is to use the KVO publisher of Combine
var cancellable: AnyCancellable?

cancellable = queue.publisher(for: \.progress.completedUnitCount)
    .filter { $0 == queue.progress.totalUnitCount }
    .sink { _ in
        print("queue finished")
        self.cancellable = nil
    }
I use the following solution:
private let queue = OperationQueue()

private func addOperations(_ operations: [Operation], completionHandler: @escaping () -> ()) {
    DispatchQueue.global().async { [unowned self] in
        self.queue.addOperations(operations, waitUntilFinished: true)
        DispatchQueue.main.async(execute: completionHandler)
    }
}
Set the maximum number of concurrent operations to 1
operationQueue.maxConcurrentOperationCount = 1
then each operation will be executed in order (as if each was dependent on the previous one) and your completion operation will execute at the end.
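A minimal sketch of this approach (the block contents are placeholders):

let operationQueue = OperationQueue()
operationQueue.maxConcurrentOperationCount = 1   // serial: operations run one at a time, in the order added

operationQueue.addOperation { /* task 1 */ }
operationQueue.addOperation { /* task 2 */ }
operationQueue.addOperation {
    // runs last, only after task 1 and task 2 have finished
    print("all done")
}

As with the dependency approach, this only behaves as expected if each operation does its work synchronously (or is an asynchronous Operation subclass that doesn't finish until its work is done).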
Code at the end of the queue
refer to this link
NSOperation and NSOperationQueue are great and useful Foundation framework tools for asynchronous tasks. One thing puzzled me though: how can I run code after all my queue operations finish? The simple answer is: use dependencies between operations in the queue (a unique feature of NSOperation). It's just a five-line solution.
NSOperation dependency trick
With Swift it is easy to implement, like this:
extension Array where Element: NSOperation {
    /// Execute block after all operations from the array.
    func onFinish(block: () -> Void) {
        let doneOperation = NSBlockOperation(block: block)
        self.forEach { [unowned doneOperation] in doneOperation.addDependency($0) }
        NSOperationQueue().addOperation(doneOperation)
    }
}
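Hypothetical usage (myOperations is assumed to be the [NSOperation] array you are about to add to your own queue; call onFinish before adding them, so the dependencies are in place first):

let myOperations: [NSOperation] = ...   // your download/work operations
myOperations.onFinish {
    print("all operations completed")
}
queue.addOperations(myOperations, waitUntilFinished: false)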
My solution is similar to that of https://stackoverflow.com/a/42496559/452115, but I don't add the completionOperation to the main OperationQueue, but to the queue itself. This works for me:
var a = [Int](repeating: 0, count: 10)
let queue = OperationQueue()

let completionOperation = BlockOperation {
    print(a)
}

queue.maxConcurrentOperationCount = 2

for i in 0...9 {
    let operation = BlockOperation {
        a[i] = 1
    }
    completionOperation.addDependency(operation)
    queue.addOperation(operation)
}

queue.addOperation(completionOperation)

print("Done 🎉")

Realm notification token on background thread

I was trying to fetch realm data on the background thread and add a notification block (iOS, Swift).
Basic example:
func initNotificationToken() {
    DispatchQueue.global(qos: .background).async {
        let realm = try! Realm()
        results = self.getRealmResults()
        notificationToken = results.addNotificationBlock { [weak self] (changes: RealmCollectionChange) in
            switch changes {
            case .initial:
                self?.initializeDataSource()
                break
            case .update(_, let deletions, let insertions, let modifications):
                self?.updateDataSource(deletions: deletions, insertions: insertions, modifications: modifications)
                break
            case .error(let error):
                fatalError("\(error)")
                break
            }
        }
    }
}
func initializeDataSource() {
    // process the result set data
    DispatchQueue.main.async(execute: { () -> Void in
        // update UI
    })
}

func updateDataSource(deletions: [Int], insertions: [Int], modifications: [Int]) {
    // process the changes in the result set data
    DispatchQueue.main.async(execute: { () -> Void in
        // update UI
    })
}
When doing this I get
'Can only add notification blocks from within runloops'
I have to do some more extensive processing with the returned data and would like to only go back to the main thread when updating the UI after the processing is done.
Another way would probably be to re-fetch the data after any update on the background thread and then do the processing, but that feels like avoidable overhead.
Any suggestions on the best practice to solve this?
To add a notification on a background thread you have to manually run a run loop on that thread and add the notification from within a callout from that run loop:
class Stuff {
    var token: NotificationToken? = nil
    var notificationRunLoop: CFRunLoop? = nil

    func initNotificationToken() {
        DispatchQueue.global(qos: .background).async {
            // Capture a reference to the runloop so that we can stop running it later
            notificationRunLoop = CFRunLoopGetCurrent()
            CFRunLoopPerformBlock(notificationRunLoop, CFRunLoopMode.defaultMode.rawValue) {
                let realm = try! Realm()
                results = self.getRealmResults()
                // Add the notification from within a block executed by the
                // runloop so that Realm can verify that there is actually a
                // runloop running on the current thread
                token = results.addNotificationBlock { [weak self] (changes: RealmCollectionChange) in
                    // ...
                }
            }
            // Run the runloop on this thread until we tell it to stop
            CFRunLoopRun()
        }
    }

    deinit {
        token?.stop()
        if let runloop = notificationRunLoop {
            CFRunLoopStop(runloop)
        }
    }
}
GCD does not use a run loop on its worker threads, so anything based on dispatching blocks to the current thread's run loop (such as Realm's notifications) will never get called. To avoid having notifications silently fail to do anything, Realm tries to check for this, which unfortunately requires the awkward PerformBlock dance.

How to run long-running tasks in iOS which are dependent on each other

I want to run some tasks which are dependent on each other and so should be performed in order. Currently, it blocks my UI thread and there is also some issue with the ordering.
A couple of questions regarding this:
Tasks are not performed in the correct order. What change should be made if we want them to be performed one after the other?
Is the code optimised in terms of memory usage and resource consumption? How can it be made more optimised?
Do we need global queues inside the function calls as well, as shown in the code below?
Here are my code details. I have created some serial queues as follows:
var Q0_sendDisplayName = dispatch_queue_create("Q0_sendDisplayName", DISPATCH_QUEUE_SERIAL)
var Q1_fetchFromDevice = dispatch_queue_create("fetchFromDevice", DISPATCH_QUEUE_SERIAL)
var Q2_sendPhonesToServer = dispatch_queue_create("sendPhonesToServer", DISPATCH_QUEUE_SERIAL)
My understanding is that serial queues perform tasks in order, so I have dispatched my tasks onto serial queues. Here is my code:
dispatch_sync(Q0_sendDisplayName, {
    self.sendNameToServer(displayName) { (result) -> () in
        dispatch_sync(self.Q1_fetchFromDevice, {
            self.SyncfetchContacts({ (result) -> () in
                dispatch_sync(self.Q2_sendPhonesToServer, {
                    self.SyncSendPhoneNumbersToServer(self.syncPhonesList, completion: { (result) in
                        //.......
                        //....
The code inside these functions also runs on a global queue. I don't know if this is the correct way to code it. I have used completion handlers to signal that a method has finished executing. Here is the code of function 1:
func sendNameToServer(var displayName: String, completion: (result: Bool) -> ()) {
    Alamofire.request(.POST, "\(urlToSendDisplayName)", headers: header, parameters: ["display_name": displayName]).responseJSON {
        response in
        switch response.result {
        case .Success:
            return completion(result: true) //......
Here is the code of function 2. This function is long-running, as it reads the whole contacts book, so I have placed it inside a global queue (I don't know if that is the right way). I call the completion handler on the main queue. Here is the code:
func SyncfetchContacts(completion: (result: Bool) -> ()) {
    let contactStore = CNContactStore()
    var keys = [CNContactGivenNameKey, CNContactFamilyNameKey, CNContactEmailAddressesKey, CNContactPhoneNumbersKey, CNContactImageDataAvailableKey, CNContactThumbnailImageDataKey, CNContactImageDataKey]
    dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        do {
            try contactStore.enumerateContactsWithFetchRequest(CNContactFetchRequest(keysToFetch: keys)) { (contact, pointer) -> Void in
                if (contact.isKeyAvailable(CNContactPhoneNumbersKey)) {
                    for phoneNumber: CNLabeledValue in contact.phoneNumbers {
                        let a = phoneNumber.value as! CNPhoneNumber
                    }
                }
            }
            dispatch_async(dispatch_get_main_queue()) {
                completion(result: true)
            }
        }
        //........
Here is the code for function 3, which again uses a global queue inside (I don't know if that's right) and calls the completion handler on the main queue.
func SyncSendPhoneNumbersToServer(phones: [String], completion: (result: Bool) -> ()) {
    Alamofire.request(.POST, "\(url)", headers: header, parameters: ["display_name": displayName]).responseJSON {
        response in
        switch response.result {
        case .Success:
            dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
                // enter some large data in database in a loop
                dispatch_async(dispatch_get_main_queue()) {
                    return completion(result: true)
                }
            } //......
In SyncfetchContacts you are calling the completion handler before contactStore.enumerateContactsWithFetchRequest has finished, outside of its completion closure.
Simply move it in there:
func SyncfetchContacts(completion: (result: Bool) -> ()) {
    ...
    do {
        try contactStore.enumerateContactsWithFetchRequest(CNContactFetchRequest(keysToFetch: keys)) { (contact, pointer) -> Void in
            if (contact.isKeyAvailable(CNContactPhoneNumbersKey)) {
                for phoneNumber: CNLabeledValue in contact.phoneNumbers {
                    let a = phoneNumber.value as! CNPhoneNumber
                }
            }
            // here ...
            dispatch_async(dispatch_get_main_queue()) {
                completion(result: true)
            }
        }
        // ... not here.
        // dispatch_async(dispatch_get_main_queue()) {
        //     completion(result: true)
        // }
    }
}
