How to stop/cancel/suspend/resume tasks on GCD queue
How does one stop background queue operations? I want to stop them on some screens of our app, and on other screens they should automatically resume. So, how does one pass a queue around in iOS?
I mean, while the user is browsing the app we run background work on a dispatch_queue_t, but it never stops or resumes in the code. So how does one suspend and resume a queue?
To suspend a dispatch queue, it is simply queue.suspend() (dispatch_suspend(queue) in Objective-C). That doesn't affect any tasks currently running, but merely prevents new tasks from starting on that queue. Also, you obviously only suspend queues that you created (not global queues, not main queue).
To resume a dispatch queue, it is queue.resume() (or dispatch_resume(queue) in Objective-C). There's no concept of “auto resume”, so you'd just have to manually resume it when appropriate.
To pass a dispatch queue around, you simply pass the DispatchQueue object that you created (or the dispatch_queue_t object that you created when you called dispatch_queue_create() in Objective-C).
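As a rough sketch of the whole suspend/resume cycle in Swift (the queue name is illustrative, not from the question):

let imageQueue = DispatchQueue(label: "com.example.imageProcessing")   // a queue you created yourself

imageQueue.async {
    // some background work; it starts only while the queue is not suspended
}

imageQueue.suspend()   // blocks already running keep going; queued ones won't start
// ... later, e.g. when the relevant screen reappears ...
imageQueue.resume()    // every suspend() must be balanced by a resume()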
In terms of canceling tasks queued on dispatch queues, this is a capability that was introduced in iOS 8. One can call item.cancel() on a DispatchWorkItem (or dispatch_block_cancel(block) on a dispatch_block_t object in Objective-C). This cancels queued blocks/items that have not started, but does not stop ones that are already underway. If you want to be able to interrupt a dispatched block/item, you have to periodically examine item.isCancelled (or dispatch_block_testcancel() in Objective-C).
See https://stackoverflow.com/a/38372384/1271826 for examples on canceling dispatch work items.
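For reference, here is a minimal Swift sketch of that cooperative-cancellation pattern (the queue label and the chunk loop are illustrative, not taken from the linked answer):

let workerQueue = DispatchQueue(label: "com.example.worker")

var item: DispatchWorkItem!
item = DispatchWorkItem {
    for chunk in 0..<1_000 {
        if item.isCancelled { return }   // periodic check so a running item can bail out
        // ... process `chunk` here ...
        _ = chunk
    }
}
workerQueue.async(execute: item)

// later, e.g. when the screen disappears:
item.cancel()   // prevents it from starting, or trips isCancelled if it is already running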
If you want to cancel tasks, you might also consider using operation queues, i.e. OperationQueue (NSOperationQueue in Objective-C). Its cancelable operations have been around for a while and you're likely to find lots of examples online. It also supports constraining the degree of concurrency with maxConcurrentOperationCount (whereas with dispatch queues you can only choose between serial and concurrent, and controlling concurrency more than that requires a tiny bit of effort on your part).
If using operation queues, you suspend and resume by changing the suspended property of the queue. And to pass it around, you just pass the NSOperationQueue object you instantiated.
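A small sketch of that, assuming a hypothetical downloadQueue you keep a reference to (the property is isSuspended in Swift, suspended in Objective-C):

let downloadQueue = OperationQueue()
downloadQueue.maxConcurrentOperationCount = 4   // constrain concurrency if you like

downloadQueue.addOperation {
    // some background work
}

downloadQueue.isSuspended = true      // pending operations won't start
// ... later ...
downloadQueue.isSuspended = false     // resume
downloadQueue.cancelAllOperations()   // or cancel everything still pending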
Having said all of that, I'd suggest you expand your question to elaborate what sort of tasks are running in the background and articulate why you want to suspend them. There might be better approaches than suspending the background queue.
In your comments, you mention that you were using NSTimer, a.k.a. Timer in Swift. If you want to stop a timer, call timer.invalidate() to stop it. Create a new NSTimer when you want to start it again.
Or if the timer is really running on a background thread, GCD “dispatch source timers” do this far more gracefully. With a GCD timer, you can suspend/resume it just like you suspend/resume a queue, just using the timer object instead of the queue object.
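For example, a hedged sketch of such a GCD timer (the queue label and interval are illustrative):

let timerQueue = DispatchQueue(label: "com.example.timer")
let timer = DispatchSource.makeTimerSource(queue: timerQueue)
timer.schedule(deadline: .now(), repeating: 1.0)
timer.setEventHandler {
    // periodic background work here
}
timer.resume()    // sources are created suspended; this starts the timer

// later:
timer.suspend()   // pause it, just like a queue
timer.resume()    // and balance every suspend() with a resume()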
Plain GCD queues don't give you built-in per-task pause / cancel. If you need that functionality (and in a lot of general cases even if you don't) you should be using the higher level API - NSOperationQueue. This is built on top of GCD, but it gives you the ability to control how many things execute at the same time, suspend processing of the queue, and cancel individual / all operations.
Related
I have been using DispatchQueue.main.async for a long time to perform UI related operations.
Swift provides both DispatchQueue.main.async and DispatchQueue.main.sync, and both are performed on the main queue.
Can anyone tell me the difference between them?
When should I use each?
DispatchQueue.main.async {
    self.imageView.image = imageView
    self.lbltitle.text = ""
}

DispatchQueue.main.sync {
    self.imageView.image = imageView
    self.lbltitle.text = ""
}
Why Concurrency?
As soon as you add heavy tasks to your app, like data loading, they slow your UI down or even freeze it.
Concurrency lets you perform 2 or more tasks “simultaneously”.
The disadvantage of this approach is thread safety, which is not always easy to control: for example, when different tasks want to access the same resources, such as changing the same variable from different threads, or accessing resources that have already been locked by other threads.
There are a few abstractions we need to be aware of.
Queues.
Synchronous/Asynchronous task performance.
Priorities.
Common troubles.
Queues
A queue is either serial or concurrent, and at the same time either global or private.
With serial queues, tasks finish one by one, while with concurrent queues tasks are performed simultaneously and finish on an unpredictable schedule. The same group of tasks takes far more time on a serial queue than on a concurrent queue.
You can create your own private queues (either serial or concurrent) or use the already available global (system) queues.
The main queue is the only serial queue among the system-provided queues.
It is highly recommended not to perform heavy tasks that are unrelated to UI work on the main queue (e.g. loading data from the network); do them on other queues instead, to keep the UI unfrozen and responsive to user actions. If we let the UI be changed from other queues, the changes can happen at a different, unexpected time and speed; some UI elements can be drawn before or after they are needed, and the UI can even crash. We also need to keep in mind that, since the global queues are system queues, other tasks run by the system can share them.
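A minimal sketch of that division of labour, with illustrative names (loadData() stands in for some slow, non-UI work):

let parsingQueue = DispatchQueue(label: "com.example.parsing")   // a private serial queue you own

func loadData() -> [Int] {       // stand-in for a slow network/parsing call
    return Array(0..<1_000)
}

parsingQueue.async {             // heavy work stays off the main queue
    let items = loadData()
    DispatchQueue.main.async {   // only the UI update hops back to the main queue
        print("update UI with \(items.count) items")
    }
}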
Quality of Service / Priority
Queues also have a qos (Quality of Service) class, which sets the priority at which their tasks are performed (from highest to lowest here):
.userInteractive - used by the main queue
.userInitiated - for user-initiated tasks where the user is waiting for a response
.utility - for tasks that take some time and don't require an immediate response, e.g. working with data
.background - for tasks that aren't related to the visual part and aren't strict about completion time
There is also .default, which doesn't carry qos information.
If the qos cannot be detected, it is inferred to be between .userInitiated and .utility.
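A small sketch of picking a qos, both for a global queue and for a private queue you create (the label is illustrative):

DispatchQueue.global(qos: .utility).async {
    // longer-running work the user is not actively waiting on
}

let syncQueue = DispatchQueue(label: "com.example.sync", qos: .userInitiated)
syncQueue.async {
    // work the user is waiting on for a response
}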
Tasks can be performed synchronously or asynchronously.
Synchronous function returns control to the current queue only after the task is finished. It blocks the queue and waits until the task is finished.
Asynchronous function returns control to the current queue right after task has been sent to be performed on the different queue. It doesn't wait until the task is finished. It doesn't block the queue.
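A tiny sketch of the difference, assuming a private serial queue named worker:

let worker = DispatchQueue(label: "com.example.worker")

worker.sync {
    print("1: runs before control returns to the caller")
}
print("2: printed only after the sync block has finished")

worker.async {
    print("B: runs whenever the queue gets to it")
}
print("A: printed immediately; the caller did not wait")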
Common Troubles.
The most popular mistakes programmers make when designing concurrent apps are the following:
Race condition - caused when the app's behavior depends on the order in which parts of the code happen to execute.
Priority inversion - when higher-priority tasks wait for lower-priority tasks to finish because some resource they need is locked.
Deadlock - when several queues wait forever for resources (variables, data, etc.) that are already locked by some of those same queues.
NEVER call the sync function on the main queue.
If you call the sync function on the main queue (from the main queue itself), the queue is blocked while it waits for the dispatched task to complete, but the task can never even start because the queue it needs is already blocked. That is a deadlock.
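To make that concrete, a hedged illustration (the deadlocking line is left commented out on purpose; it assumes we are already on the main thread, e.g. in viewDidLoad):

// DispatchQueue.main.sync {          // the main queue is blocked waiting here,
//     print("this never runs")       // so the block can never start: deadlock
// }

// The safe alternative from the main thread:
DispatchQueue.main.async {
    print("this runs on a later run-loop pass")
}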
When to use sync?
When we need to wait until the task is finished, e.g. when making sure some function/method is not called a second time, or when we have a synchronization step and want to prevent it from being re-entered until it has completely finished. Here's some code for this concern: How to find out what caused error crash report on IOS device?
When you use async, the calling queue moves on without waiting until the dispatched block is executed. In contrast, sync makes the calling queue stop and wait until the work you've dispatched in the block is done. Therefore sync can lead to deadlocks: try running DispatchQueue.main.sync from the main queue and the app will freeze, because the calling queue waits until the dispatched block finishes, but the block can't even start (the queue is stopped and waiting).
When to use sync? When you need to wait for something done on a DIFFERENT queue and only then continue working on your current queue
Example of using sync:
On a serial queue you could use sync as a mutex in order to make sure that only one thread is able to perform the protected piece of code at the same time.
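A minimal sketch of that mutex idea, assuming a hypothetical counter protected by a private serial queue:

final class Counter {
    private var value = 0
    private let isolation = DispatchQueue(label: "com.example.counter")   // serial

    func increment() {
        isolation.sync { value += 1 }    // only one caller mutates at a time
    }

    var current: Int {
        isolation.sync { value }         // reads go through the same queue
    }
}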
DispatchQueue.<>.sync vs DispatchQueue.<>.async
GCD allows you to execute a task synchronously or asynchronously
A synchronous (block and wait) function returns control only once the task has completed.
An asynchronous (dispatch and proceed) function returns control immediately, dispatching the task to start on the appropriate queue but not waiting for it to complete.
sync or async methods have no effect on the queue on which they are called.
sync will block the thread from which it is called, not the queue on which it is called. It is a property of the DispatchQueue itself that decides whether the queue waits for the current task to finish (serial queue) or can start the next task before the current one finishes (concurrent queue).
So even though DispatchQueue.main.async is an async call, a heavy operation added in it can still freeze the UI, because its operations are executed serially on the main thread. If this method is called from a background thread, control returns to that thread instantaneously even while the UI appears frozen, because the async call was made on DispatchQueue.main.
I am using Grand Central Dispatch to run a process in the background. I want to know how I can suspend, resume and stop that background thread. I have tried
dispatch_suspend(background_thread);
dispatch_resume(background_thread);
but these functions don't help me; it keeps on running. Please, someone help me.
You seem to have some confusion. Direct manipulation of threads is not part of the GCD API. The GCD object you normally manipulate is a queue, not a thread. You put blocks in a queue, and GCD runs those blocks on any thread it wants.¹
Furthermore, the dispatch_suspend man page says this:
The dispatch framework always checks the suspension status before executing a block, but such changes never affect a block during execution (non-preemptive).
In other words, GCD will not suspend a queue while the queue is running a block. It will only suspend a queue while the queue is in between blocks.
I'm not aware of any public API that lets you stop a thread without cooperation from the function running on that thread (for example by setting a flag that is checked periodically on that thread).
If possible, you should break up your long-running computation so that you can work on it incrementally in a succession of blocks. Then you can suspend the queue that runs those blocks.
Footnote 1. Except the main queue. If you put a block on the main queue, GCD will only run that block on the main thread.
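A hedged Swift sketch of the "succession of blocks" idea from the paragraph above (the queue name and the chunking are illustrative): because each chunk is its own block, suspending the queue takes effect at the next chunk boundary.

let workQueue = DispatchQueue(label: "com.example.longComputation")

for start in stride(from: 0, to: 1_000_000, by: 10_000) {
    workQueue.async {
        // process items start ..< start + 10_000 here
        _ = start
    }
}

workQueue.suspend()   // the chunk currently running finishes; the remaining ones wait
// ... later ...
workQueue.resume()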
You are describing a concurrent processing model, where different processes can be suspended and resumed. This is often achieved using threads, or in some cases coroutines.
GCD uses a different model: one of partially ordered blocks, where each block executes sequentially without pre-emption, and with no direct support for suspension or resumption.
GCD semaphores do exist, and may suit your needs; however, creating general cooperating concurrent threads with them is not the goal of GCD. Otherwise look at a thread based solution using NSThread or even Posix threads.
Take a look at Apple's Migrating Away from Threads to see if your model is suited to migration to GCD, but not all models are.
I'm loving NSOperationQueue but I'm having some issues understanding some portions of it.
In the second issue of objc.io they go over NSOperationQueue and mention that it has two kinds of queues, the main queue which runs on the main thread, and the background queues. They mention you can access the main queue with [NSOperationQueue mainQueue] and then manipulate it.
You normally would not want to do this, correct? If it's running on the main thread, will it not block the main thread for other tasks? Wouldn't it not run concurrently with other tasks?
It also mentions you can add to the background queues (which I understand would be better?) by creating instances of NSOperation (subclasses potentially).
Do I save a reference to the NSOperationQueue that I create for operations in order to have for creating more operations? I assume there's no singleton for background queues like there is for mainQueue, so how do I manage adding tasks to background queues?
It also mentions you can control the amount of operations running concurrently with the maxConcurrentOperationCount property.
I know normally you set it to NSOperationQueueDefaultMaxConcurrentOperationCount, but if I set it to a specific number manually, does it correspond to the maximum number of threads that can be run at once? For example if the processor on the iPhone can run 4 threads at once and I set that property to 8, what happens?
You ask:
You normally would not want [to add operations to the mainQueue], correct? If it's running on the main thread, will it not block the main thread for other tasks? Wouldn't it not run concurrently with other tasks?
Yes, you would never want to add anything slow to the main queue. But that doesn't mean that you don't use the main queue. For some operations (e.g. UI updates) it's critical.
The typical pattern is to create an operation queue for tasks that you want to run in the background, but if you subsequently need to do something that needs to run on the main queue (e.g. UI updates, updating the model, etc.), you would go ahead and do that, for example:
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[queue addOperationWithBlock:^{
    // do some time consuming stuff in the background

    // when done, you might update the UI in the main queue
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        // update the UI here
    }];
}];
You ask:
Do I save a reference to the NSOperationQueue that I create for operations in order to have for creating more operations? I assume there's no singleton for background queues like there is for mainQueue, so how do I manage adding tasks to background queues?
Yes, if you want to add more operations to that same queue later, you want to maintain a reference to that queue. You can do this by adding it to the app delegate, some central view controller, or a singleton.
And yes, there's no built-in singleton for background queues (because you can, conceivably, have different queues for different purposes, e.g. one for network operations, one for image processing, etc.). But you can write your own singleton for each type of queue, if you want.
You also ask:
I know normally you set it to NSOperationQueueDefaultMaxConcurrentOperationCount, but if I set it to a specific number manually, does it correspond to the maximum number of threads that can be run at once? For example if the processor on the iPhone can run 4 threads at once and I set that property to 8, what happens?
One should set maxConcurrentOperationCount to whatever you think is appropriate for the type of queue. For a network operation queue, you generally wouldn't exceed 4, but for other types of queues, you might easily have more. I believe that there is a maximum of 64 worker threads (from which concurrent queues draw as they need them).
If you attempt to use more than that, the app won't start your operation until a worker thread becomes available. Apple advises, though, that one refrain from using up all of the worker threads. So use a reasonable number appropriate for your queue's function. Frankly, one of the advantages of operation queues over dispatch queues is that you can constrain the maximum number of worker threads that will be used at any given time to better manage the device's limited resources.
References
WWDC 2012 video Asynchronous Design Patterns with Blocks, GCD, and XPC is an excellent primer on some GCD patterns and touches upon the "too many threads" question.
The Building Concurrent User Interfaces on iOS video walks through some practical implications of building concurrent iOS apps.
The About Threaded Programming section of the Threading Programming Guide touches upon the relationship between cores and threads.
The Concurrency and Application Design section of the Concurrency Programming Guide is an articulate discussion of the relationships between threads and operation/dispatch queues.
Generally you don't want to use the main queue. Any operation there will run on the main thread.
When you create an operation queue, create it for a purpose. Like it will be used for all server requests. So you can control how many concurrent requests are in progress. So don't add algorithmic processing operations to this queue because they have a different purpose. Keep a reference to the queue so you can add operations in future (and pause / cancel operations).
There is no 'normal' setting for maxConcurrentOperationCount - it should be set based on the purpose.
If you set it to 8 then the queue will run up to 8 at the same time. This may not be the most efficient option. Always keep the purpose of the queue in mind.
First of all, keep in mind that you should always separate work on the main thread from work on background threads. Only operations that involve updating the UI must be performed on the main thread; the rest of the operations should be performed on background threads. E.g. if you are dealing with multiple downloads, handle all the network-based operations on a background queue and perform the UI updates on the main queue.
// e.g. for updating the UI on the main thread.
[self performSelectorOnMainThread:@selector(updateUI) withObject:nil waitUntilDone:YES];
Also, when you set the maxConcurrentOperationCount property to NSOperationQueueDefaultMaxConcurrentOperationCount, the operation queue chooses the number of concurrent operations dynamically, based on system conditions.
Useful Links:
http://mobile.tutsplus.com/tutorials/iphone/nsoperationqueue/
http://www.raywenderlich.com/19788/how-to-use-nsoperations-and-nsoperationqueues
http://www.cimgf.com/2008/02/16/cocoa-tutorial-nsoperation-and-nsoperationqueue/
When selecting which queue to run dispatch_async on, dispatch_get_global_queue is mentioned a lot. Is this one special background queue that delegates tasks to a certain thread? Is it almost a singleton?
So if I use that queue always for my dispatch_async calls, will that queue get full and have to wait for things to finish before another one can start, or can it assign other tasks to different threads?
I guess I'm a little confused because when I'm choosing the queue for an NSOperation, I can choose the queue for the main thread with [NSOperationQueue mainQueue], which seems synonymous to dispatch_get_main_queue but I was under the impression background queues for NSOperation had to be individually made instances of NSOperationQueue, yet GCD has a singleton for a background queue? (dispatch_get_global_queue)
Furthermore - silly question but wanted to make sure - if I put a task in a queue, the queue is assigned to one thread, right? If the task is big enough it won't split it up over multiple threads, will it?
When selecting which queue to run dispatch_async on, dispatch_get_global_queue is mentioned a lot. Is this one special background queue that delegates tasks to a certain thread?
A certain thread? No. dispatch_get_global_queue retrieves for you a global queue of the requested relative priority. All queues returned by dispatch_get_global_queue are concurrent, and may, at the system's discretion, dispatch work to many different threads. The mechanics of this are an implementation detail that is opaque to you as a consumer of the API.
In practice, and at the risk of oversimplifying it, there is one global queue for each priority level, and at the time of this writing, based on my experience, each of those will at any given time be dispatching work to between 0 and 64 threads.
Is it almost a singleton?
Strictly no, but you can think of them as singletons where there is one singleton per priority level.
So if I use that queue always for my dispatch_async calls, will that queue get full and have to wait for things to finish before another one can start, or can it assign other tasks to different threads?
It can get full. Practically speaking, if you are saturating one of the global concurrent queues (i.e. more than 64 background tasks of the same priority in flight at the same time), you probably have a bad design. (See this answer for more details on queue width limits)
I guess I'm a little confused because when I'm choosing the queue for an NSOperation, I can choose the queue for the main thread with [NSOperationQueue mainQueue], which seems synonymous to dispatch_get_main_queue
They are not strictly synonymous. Although NSOperationQueue uses GCD under the hood, there are some important differences. For instance, in a single pass of the main run loop, only one operation enqueued to +[NSOperationQueue mainQueue] will be executed, whereas more than one block submitted to dispatch_get_main_queue might be executed on a single run loop pass. This probably doesn't matter to you, but they are not, strictly speaking, the same thing.
but I was under the impression background queues for NSOperation had to be individually made instances of NSOperationQueue, yet GCD has a singleton for a background queue? (dispatch_get_global_queue)
In short, yes. It sounds like you're conflating GCD and NSOperationQueue. NSOperationQueue is not just a "trivial wrapper" around GCD, it's its own thing. The fact that it's implemented on top of GCD should not really matter to you. NSOperationQueue is a task queue, with an explicitly settable width, that you can create instances of "at will." You can make as many of them as you like. At some point, all instances of NSOperationQueue are, when executing NSOperations, pulling resources from the same pool of system resources as the rest of your process, including GCD, so yes, there are some interactions there, but they are opaque to you.
Furthermore - silly question but wanted to make sure - if I put a task in a queue, the queue is assigned to one thread, right? If the task is big enough it won't split it up over multiple threads, will it?
A single task can only ever be executed on a single thread. There's not some magical way that the system would have to "split" a monolithic task into subtasks. That's your job. With regard to your specific wording, the queue isn't "assigned to one thread", the task is. The next task from the queue to be executed might be executed on a completely different thread.
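To illustrate that "splitting it up is your job", here is a hedged sketch using DispatchQueue.concurrentPerform, which only parallelises the work because it has already been expressed as independent chunks (the names and the workload are illustrative):

let input = Array(0..<1_000_000)
let chunkCount = 8
let chunkSize = input.count / chunkCount

let resultQueue = DispatchQueue(label: "com.example.results")   // serializes the shared write
var total = 0

DispatchQueue.concurrentPerform(iterations: chunkCount) { chunk in
    let range = chunk * chunkSize ..< (chunk + 1) * chunkSize
    let partial = input[range].reduce(0, +)   // each chunk is an independent subtask
    resultQueue.sync { total += partial }     // protect the shared accumulator
}
print(total)   // concurrentPerform returns only after all chunks finish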
I am trying to re-schedule a queued block that will handle the update operations.
The main goal is updating UI objects (online user table...) with a minimal number of UI update requests. (The server sometimes rains down a massive amount of updates, yay!)
For simplicity, the main scenario is:
The dispatch_queue_t instance (queue that will handle given UI updating block) is a serial dispatch queue (private dispatch queue)
The operation (UI updating block) is scheduled with dispatch_after with t amount of time (Instead of updating for each data set update, collect update requests within t amount of time and perform a single UI update for them)
In case our data set is updated, check whether a scheduled event already exists. If yes, unschedule it from the dispatch_queue_t instance, then re-schedule the same block with a delay of t.
Also:
t is a small time interval that probably won't be noticed by the user (like 500 ms).
Any alternative approach is welcome.
My motive behind this:
I applied the same logic via Android's Handler (a post & removeCallbacks combination with a Runnable instance) and I hope I can achieve the same on iOS.
Edit:
As @Sven suggested, NSOperationQueue is more suitable for this scenario, since it supports cancelling individual NSOperations. I skimmed through the documents and found:
Canceling Operations
Once added to an operation queue, an operation object is effectively owned by the queue and cannot be removed. The only way to dequeue an operation is to cancel it. You can cancel a single individual operation object by calling its cancel method or you can cancel all of the operation objects in a queue by calling the cancelAllOperations method of the queue object.
You should cancel operations only when you are sure you no longer need them. Issuing a cancel command puts the operation object into the “canceled” state, which prevents it from ever being run. Because a canceled operation is still considered to be “finished”, objects that are dependent on it receive the appropriate KVO notifications to clear that dependency. Thus, it is more common to cancel all queued operations in response to some significant event, like the application quitting or the user specifically requesting the cancellation, rather than cancel operations selectively.
This can easily be done with GCD as well, no need to reach for the big hammer that is NSOperationQueue here.
Just use a non-repeating dispatch timer source directly instead of dispatch_after (which is just a convenience wrapper around such a timer source; it doesn't actually enqueue the block onto the queue until the timer goes off).
You can reschedule a pending timer source execution with dispatch_source_set_timer().
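A hedged Swift sketch of that approach: a single non-repeating DispatchSourceTimer that is re-armed on every incoming update, so a burst of updates collapses into one UI refresh roughly t (here 0.5 s) after the last one. The class and method names are illustrative, not from the answer above.

final class CoalescedUpdater {
    private let delay = 0.5   // roughly the 500 ms mentioned in the question

    private lazy var timer: DispatchSourceTimer = {
        let t = DispatchSource.makeTimerSource(queue: DispatchQueue.main)
        t.setEventHandler { [weak self] in self?.flush() }
        t.resume()            // sources are created suspended; arm it once
        return t
    }()

    // Call this on every incoming data-set update.
    func scheduleUpdate() {
        timer.schedule(deadline: .now() + delay)   // non-repeating; re-arming pushes the fire date out
    }

    private func flush() {
        // perform the single coalesced UI update here
        print("refresh online-user table")
    }
}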
You cannot remove or otherwise change an operation enqueued on a dispatch queue. Try using the higher level NSOperationQueue instead which supports cancellation.