I am reading an iOS book which says that "dispatch_sync function blocks the concurrent queue to which the block is submitted i.e it makes the queue wait". Based on that concept I created my own example, which is as follows. The following snippet is written in the viewDidLoad method:
dispatch_queue_t concQueue1 = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

void (^secondBlock)(void) = ^(void) {
    // Second block
    for (int count = 0; count < 1000; count++)
    {
        if ([NSThread currentThread] == [NSThread mainThread])
        {
            NSLog(@"2 block main Thread");
        }
        else
        {
            NSLog(@"2 block other THREAD");
        }
    }
};

void (^firstBlock)(void) = ^(void) {
    // First block
    for (int count = 0; count < 100; count++)
    {
        if ([NSThread currentThread] == [NSThread mainThread])
        {
            NSLog(@"1 block main Thread");
        }
        else
        {
            NSLog(@"1 block other THREAD");
        }
    }
    dispatch_sync(concQueue1, secondBlock);
};

dispatch_async(concQueue1, firstBlock);

// Making the main thread sleep for some time
[NSThread sleepForTimeInterval:0.1];

dispatch_async(concQueue1, ^(void) {
    // Third block
    for (int count = 0; count < 1000; count++)
    {
        if ([NSThread currentThread] == [NSThread mainThread])
        {
            NSLog(@"3 block main Thread");
        }
        else
        {
            NSLog(@"3 block other THREAD");
        }
    }
});
I am making the main thread sleep for some time so that the dispatch_sync call in the first block gets executed. The output I am getting is shown below (only a part of it):
GCDExamples[2459:554259] 2 block other THREAD
.
.
.
GCDExamples[2459:554259] 2 block other THREAD
GCDExamples[2459:554256] 3 block other THREAD
GCDExamples[2459:554256] 3 block other THREAD
GCDExamples[2459:554259] 2 block other THREAD //Point first
GCDExamples[2459:554256] 3 block other THREAD
Some points about the output: the "2 block other THREAD" and "3 block other THREAD" lines shown above are the first occurrences of those lines in the log.
MY QUESTION:
According to the concept, because of the dispatch_sync function, once the second block starts it should make the queue wait rather than allowing the third block to start. But as shown in the output above, "2 block other THREAD" follows "3 block other THREAD" at "//Point first". This shows that dispatch_sync did not make the queue wait. How is that possible? Please ask me for any other information if needed.
EDIT 1: I am putting the text of the well-known book here to explain my point. The book is "iOS 7 Programming Cookbook". The text follows:
"For any task that doesn’t involve the UI, you can use global concurrent queues in GCD. These allow either synchronous or asynchronous execution. But synchronous execution does not mean your program waits for the code to finish before continuing. It simply means that the concurrent queue will wait until your task has finished before it continues to the next block of code on the queue. When you put a block object on a concurrent queue, your own program always continues right away without waiting for the queue to execute the code. This is because concurrent queues, as their name implies, run their code on threads other than the main thread."
As the bold text says, the concurrent queue should wait UNTIL my task is finished before continuing with the next block. My block printing "2 block other THREAD" should be allowed to finish before "3 block other THREAD" starts, but that is not the case: "2 block other THREAD" is printed again, intermingled with the "3 block other THREAD" statements, when in fact all of my "2 block other THREAD" output should have been allowed to complete before "3 block other THREAD" followed. Comment if more info is required.
"dispatch_sync function blocks the concurrent queue to which the block is submitted i.e it makes the queue wait"
If that's what the book says, throw it away. That's just wrong.
Synchronous vs. asynchronous is about the caller. When some code calls dispatch_sync(), that code can not proceed until the task being dispatched has completed. The queue is not blocked or forced to wait or anything like that.
By contrast, when code calls dispatch_async(), the task is put on the queue and the caller proceeds to its next step. It does not wait for the task that was dispatched to start, let alone finish.
That's a completely separate issue from whether a queue is concurrent or serial. That distinction belongs to the queues and the tasks they run, but doesn't directly affect the caller. A serial queue will only run one task at a time. If other tasks have been queued, they wait and run in strict sequence.
A concurrent queue can allow multiple tasks to run at the same time, depending on available system resources.
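To make the distinction concrete, here is a minimal Swift sketch (my own illustration, not code from the question or the book): the sync call holds up only the thread that makes it, while the concurrent queue is free to keep running a block that was dispatched asynchronously.
import Foundation

let concurrentQueue = DispatchQueue(label: "demo.concurrent", attributes: .concurrent)

concurrentQueue.async {
    // Dispatched asynchronously: the caller below does not wait for this.
    for i in 1...3 {
        print("async block step \(i)")
        Thread.sleep(forTimeInterval: 0.2)
    }
}

concurrentQueue.sync {
    // The calling thread stops here until this block returns...
    print("sync block running")
    Thread.sleep(forTimeInterval: 0.3)
}
// ...but the async block above can keep running alongside it the whole time,
// because the queue itself was never made to wait.
print("caller resumed")

Thread.sleep(forTimeInterval: 1)   // keep a script alive long enough to see all output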
Update in response to edited question with new quote from the book:
For any task that doesn’t involve the UI, you can use global concurrent queues in GCD. These allow either synchronous or asynchronous execution. But synchronous execution does not mean your program waits for the code to finish before continuing. It simply means that the concurrent queue will wait until your task has finished before it continues to the next block of code on the queue. When you put a block object on a concurrent queue, your own program always continues right away without waiting for the queue to execute the code. This is because concurrent queues, as their name implies, run their code on threads other than the main thread.
This continues to be completely wrong. To take it part by part:
But synchronous execution does not mean your program waits for the code to finish before continuing.
That's exactly what "synchronous execution" does mean. When you submit a task synchronously, the submitting thread waits for the code to finish before continuing.
It simply means that the concurrent queue will wait until your task has finished before it continues to the next block of code on the queue.
No. The whole point of concurrent queues is that they don't wait for one task that is running before starting subsequent tasks. That's what "concurrent" means. A concurrent queue can run multiple tasks concurrently, at the same time, simultaneously.
When you put a block object on a concurrent queue, your own program always continues right away without waiting for the queue to execute the code.
No, this is wrong. It completely depends on what function you use to put that block on the queue. If it uses dispatch_sync(), it waits for the queue to execute the block. If it uses dispatch_async(), it does not wait; it continues right away. This is true whether the queue is serial or concurrent.
This is because concurrent queues, as their name implies, run their code on threads other than the main thread.
Any queue, serial or concurrent, other than the main queue, may run the blocks submitted to it on a background thread. And, if you use dispatch_sync() to submit a block to a concurrent queue from the main thread, it's very possible that the block will execute on the main thread. That's because GCD knows that the main thread isn't doing anything else, because it's blocked inside of the dispatch_sync() call, so it might as well run the block there.
In other words, the type of queue does not dictate which thread the block runs on.
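You can see that last point for yourself with a tiny sketch, assuming you run it from the main thread (e.g. in viewDidLoad):
DispatchQueue.global().sync {
    // Frequently true: GCD reuses the blocked main thread to run the block.
    print("running on the main thread? \(Thread.isMainThread)")
}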
The author of this book simply doesn't know what s/he is talking about.
First, to understand GCD you have to understand the difference between synchronous and asynchronous execution.
Synchronous = the block executes when it is submitted and blocks the caller until it has finished. This means that:
The block (of code) only executes when it is its turn in the queue.
The block (of code) holds up the caller while it runs, and on a serial queue any other blocks will wait for it.
Basically the blocks execute in FIFO order.
Asynchronous = the call returns immediately and does not block the caller. On a concurrent queue the block can execute even if something else (submitted synchronously or asynchronously) is already executing on the queue.
To understand what went wrong, we will work through the code.
secondBlock is defined.
firstBlock is defined.
firstBlock is submitted with dispatch_async() (remember: asynchronous execution).
Inside firstBlock, secondBlock is submitted with dispatch_sync() (remember: synchronous execution).
firstBlock exits.
The main thread sleeps for 0.1 seconds.
The third block is defined and submitted with dispatch_async() (remember: asynchronous execution).
The third block was submitted asynchronously to a concurrent queue, so it started executing even though secondBlock was still executing on the queue. To make blocks run strictly one after another, submit them to a serial queue.
Note
These functions operate relative to the caller: dispatch_sync() is only synchronous for the thread that calls it. Other threads (such as the main thread) are not blocked and carry on as if the call were asynchronous.
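If the strict ordering the question expected is what you actually want, submit the blocks to a serial queue. A minimal Swift sketch (my own illustration; the queue label and block names are not from the question's code):
import Foundation

let serialQueue = DispatchQueue(label: "demo.serial")   // serial by default

serialQueue.async { print("first block") }
serialQueue.async { print("second block") }   // starts only after the first has finished
serialQueue.async { print("third block") }    // always prints last

Thread.sleep(forTimeInterval: 0.1)            // let the queue drain if run as a script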
Related
I'm quite confused.
Code below will cause a deadlock for sure:
// Will execute
DispatchQueue.main.async { // Block 1
    // Will execute
    DispatchQueue.main.sync { // Block 2
        // Will not be executed
    }
    // Will not be executed
}
Because
After we dispatch_async on the main queue, it submits the first block to the main queue to execute
At some moment, the system decides to run block1
The .sync method blocks "thread / queue?" <- My question
Because the "thread / queue" is blocked, block 2 can't execute before block 1 finishes, because Main queue is a serial queue, it execute task serially, one can't execute before another finishes
My question is: Does sync block current thread it's executing on or current queue? (I understand the difference between thread & queue)
Most answers on the internet say it blocks the thread.
If it blocks the thread -> how come the sync { } block can still execute, since the thread is blocked?
If it blocks the queue -> that makes more sense? Since the queue is blocked, we can't execute one block before the other finishes
I found some discussions about this:
dispatch_sync inside dispatch_sync causes deadlock
Difference Between DispatchQueue.sync vs DispatchQueue.async
What happens if dispatch on same queue?
You asked:
My question is: Does sync block current thread it's executing on or current queue?
It blocks the current thread.
When dealing with a serial queue (such as the main queue), if that queue is running something whose thread is blocked, that prevents anything else from running on that queue until the queue is free again. A serial queue can only use one thread at a time. Thus, dispatching synchronously from any serial queue to itself will result in a deadlock.
But, sync does not technically block the queue. It blocks the current thread. Notably, when dealing with a concurrent queue (such as a global queue or a custom concurrent queue), that queue can avail itself of multiple worker threads at the same time. So just because one worker thread is blocked, it will not prevent the concurrent queue from running another dispatched item on another, unblocked, worker thread. Thus, dispatching synchronously from a concurrent queue to itself will not generally deadlock (as long as you don’t exhaust the very limited worker thread pool).
E.g.
let serialQueue = DispatchQueue(label: "serial")

serialQueue.async {
    serialQueue.sync {
        // will never get here; deadlock
        print("never get here")
    }
    // will never get here either, because of the above deadlock
}

let concurrentQueue = DispatchQueue(label: "concurrent", attributes: .concurrent)

concurrentQueue.async {
    concurrentQueue.sync {
        // will get here as long as you don't exhaust the 64 worker threads in the relevant QoS thread pool
        print("ok")
    }
    // will get here
}
You asked:
If block thread -> How come the sync { } block can still execute since the thread is blocked?
As you have pointed out in your own code snippet, the sync block does not execute (in the serial queue scenario).
I have some doubts regarding GCD.
Code snippet 1
serialQ.sync {
    print(1)
    serialQ.async {
        print(2)
    }
    serialQ.async {
        print(3)
    }
}
Code snippet 2
serialQ.async {
    print(1)
    serialQ.async {
        print(2)
    }
    serialQ.sync {
        print(3)
    }
}
I ran both of them in a playground and found that code snippet 2 deadlocks while code snippet 1 runs fine. I have read a lot about GCD and started playing around with these concepts. Can anyone please provide a detailed explanation?
PS: serialQ is a serial queue
According to my understanding,
Serial Queue - generates only one thread at a time, and once that thread is freed up then it is occupied or free to do other tasks
Serial Queue dispatched sync - blocks the caller thread from which the serial queue is dispatched and performs the tasks on that thread.
Serial Queue dispatched async - doesn't block the caller thread, in fact it runs on a different thread and keeps the caller thread running.
But for the above query I am not able to get the proper explanation.
You are calling sync inside a block already executing on the same queue. This will always cause a deadlock. sync is the equivalent of saying “execute this now and wait for it to return.” Since you are already executing on that queue, the queue never becomes available to execute the sync block. It sounds like you’re looking for recursive locks, but that’s not how queues work. It’s also quite arguably an anti-pattern in general. I discuss this more in this answer: How to implement a reentrant locking mechanism in objective-c through GCD?
EDIT: Came back to add some thoughts on your "understandings":
Serial Queue - generates only one thread at a time, and once that thread is freed up then it is occupied or free to do other tasks
A serial queue doesn't "generate" one thread. Queues and threads are different things and have different semantics. A serial queue requires one thread upon which to execute a work item, but there's not a one-to-one relationship between a serial queue and a thread. A thread is a relatively "heavy" resource and a queue is a relatively "light" resource. A single serial queue can execute work items on more than one thread over its lifetime (although never more than one thread at the same time). GCD maintains pools of threads that it uses to execute work items, but that is an implementation detail, and it's not necessary to understand how that's implemented in order to use queues properly.
Serial Queue dispatched sync - blocks the caller thread from which the serial queue is dispatched and performs the tasks on that thread.
A queue (serial or concurrent) is not "dispatched" (sync or otherwise). A work item is, well, enqueued into a queue. That work item will be subsequently executed by an arbitrary thread including, quite possibly, the calling thread. The guarantee is that only one work item enqueued to a given serial queue will be executing (on any thread) at one time.
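A small sketch of that guarantee (illustrative, not from the question): the serial queue below may hop between worker threads from one item to the next, but it never runs two items at once, so the prints always come out in order.
import Foundation

let serialQueue = DispatchQueue(label: "demo.serial")

for i in 1...3 {
    serialQueue.async {
        // Thread.current may differ between items, but items still run one at a time.
        print("item \(i) on \(Thread.current)")
    }
}

Thread.sleep(forTimeInterval: 0.5)   // let the queue drain if run as a script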
Serial Queue dispatched async - doesn't block the ~caller~ enqueueing thread, in fact it runs on a different thread and keeps the caller thread running. (minor edits for readability)
This is close, but not quite accurate. It's true that enqueueing a work item onto a serial queue with async does not block the enqueueing thread. It's not necessarily true that the work item is executed by a different thread than the enqueueing thread, although in the common case, that is usually the case.
The thing to know here is that the difference between sync and async is strictly limited to the behavior of the enqueueing thread and has no (guaranteed) bearing or impact on which thread the work item is executed. If you enqueue a work item with sync the enqueueing thread will wait (possibly forever, in the specific case you outlined here) for the work item to complete, whereas if you enqueue a work item with async the enqueueing thread will continue executing.
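A sketch that makes the enqueueing-thread difference visible (timings are approximate; the queue label is illustrative):
import Foundation

let serialQueue = DispatchQueue(label: "demo.serial")

let start = Date()
serialQueue.async {
    Thread.sleep(forTimeInterval: 0.5)   // slow work item
}
print("async returned after \(Date().timeIntervalSince(start)) s")   // ~0: the enqueueing thread did not wait

serialQueue.sync {
    // Waits for the slow async item above (serial queue), then runs;
    // only afterwards does sync return to the enqueueing thread.
}
print("sync returned after \(Date().timeIntervalSince(start)) s")    // ~0.5 s: the enqueueing thread waited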
A sync call, as you point out, blocks the current thread until the block runs. So when you make sync to the same serial queue that you’re currently on, you are blocking the queue, waiting for a block to run on the same queue that you just blocked, resulting in a deadlock.
If you really want to run something synchronously on the current queue, don’t dispatch it with sync at all and just run it directly. E.g.:
serialQ.async {
    print(1)
    serialQ.async {
        print(2)
    }
    // serialQ.sync { // don't dispatch synchronously to the current serial queue
    print(3)
    // }
}
Or dispatch asynchronously. E.g.,
serialQ.async {
    print(1)
    serialQ.async {
        print(2)
    }
    serialQ.async {
        print(3)
    }
}
Or use a concurrent queue (in which case you have to be careful to make sure you don’t have thread explosion, which could result in deadlock, too). E.g.,
let concurrentQ = DispatchQueue(label: "...", attributes: .concurrent)

concurrentQ.async {
    print(1)
    concurrentQ.async {
        print(2)
    }
    concurrentQ.sync {
        print(3)
    }
}
So I know we should never call dispatch_sync on the main queue, as it's a serial queue.
So the code below will crash in Xcode:
DispatchQueue.main.sync {
    print("hello world")
}
But I am not able to understand 100% why it is going to crash or deadlock.
Can someone explain with some drawing, because I don't have a 100% solid explanation of why it is going to crash or deadlock?
I know the main queue is a serial queue, so it executes tasks one by one. Suppose we have task A running on the main queue; if I add another task B on the main queue, then task B only starts after task A finishes.
And I also know that the main queue should only perform UI-specific tasks.
But it's just that I don't have a 100% solid explanation of why the above code will crash / deadlock.
Can someone explain with some kind of simple drawing?
I know the main queue is a serial queue, so it executes tasks one by one. Suppose we have task A running on the main queue; if I add another task B on the main queue, then task B only starts after task A finishes.
Correct
And I also know that the main queue should only perform UI-specific tasks.
Not correct. The UI must only be updated on the main queue but you can do whatever you want on the main queue. You can handle data on the main queue if it fits that particular application. If you're handling a lot of data and don't want to freeze the UI, then you would want to get off the main queue.
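For completeness, the usual shape of "get off the main queue" looks something like this sketch (the work inside is just a stand-in):
DispatchQueue.global(qos: .userInitiated).async {
    let result = (1...1_000_000).reduce(0, +)   // stand-in for heavy, non-UI work
    DispatchQueue.main.async {
        print("update the UI with \(result)")   // UI updates go back on the main queue
    }
}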
But it's just that I don't have a 100% solid explanation of why the above code will crash / deadlock.
// doing work...
DispatchQueue.main.sync { // stop the main thread and wait for the following to finish
    print("hello world") // this will never execute on the main thread because we just stopped it
}
// deadlock
Please explain to me why I am getting this crash?
Thread 1: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)
in this
DispatchQueue.main.sync {
    print("sync")
}
This is my code.
override func viewDidLoad() {
    super.viewDidLoad()
    print("Start")
    DispatchQueue.main.async {
        print("async")
    }
    DispatchQueue.main.sync {
        print("sync")
    }
    print("Finish")
}
NEVER call the sync function on the main queue
If you call the sync function on the main queue, it blocks the queue: the queue waits for the task to be completed, but the task can never finish because it cannot even start while the queue is already blocked. This is called a deadlock.
Two (or sometimes more) items — in most cases, threads — are said to be deadlocked if they all get stuck waiting for each other to complete or perform another action. The first can’t finish because it’s waiting for the second to finish. But the second can’t finish because it’s waiting for the first to finish.
You need to be careful though. Imagine if you call sync and target the current queue you’re already running on. This will result in a deadlock situation.
Use sync to keep track of your work with dispatch barriers, or when you need to wait for the operation to finish before you can use the data processed by the closure.
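As an aside, the barrier case mentioned above usually looks something like this sketch (the queue, dictionary, and function names are illustrative):
import Foundation

let isolationQueue = DispatchQueue(label: "demo.isolation", attributes: .concurrent)
var cache: [String: Int] = [:]

func readValue(for key: String) -> Int? {
    // Reads can run concurrently with each other.
    isolationQueue.sync { cache[key] }
}

func writeValue(_ value: Int, for key: String) {
    // The barrier waits for in-flight reads, then runs exclusively.
    isolationQueue.async(flags: .barrier) {
        cache[key] = value
    }
}

writeValue(42, for: "answer")
print(readValue(for: "answer") ?? -1)   // 42: the read was submitted after the barrier write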
When to use sync?
When we need to wait until the task is finished. E.g. when we are making sure that some function/method is not called a second time, such as a synchronization step that must not be started again until it has completely finished.
When you need to wait for something done on a DIFFERENT queue and only then continue working on your current queue
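A minimal sketch of that second case (the queue name is illustrative):
import Foundation

let workerQueue = DispatchQueue(label: "demo.worker")

workerQueue.async {
    print("earlier work on the worker queue")
}

// sync returns only after everything queued ahead of it has run,
// so the caller can rely on that earlier work being finished here.
workerQueue.sync {
    print("caller has caught up with the worker queue")
}
print("safe to continue on the current queue")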
Synchronous vs. Asynchronous
With GCD, you can dispatch a task either synchronously or asynchronously.
A synchronous function returns control to the caller after the task is completed.
An asynchronous function returns immediately, ordering the task to be done but not waiting for it. Thus, an asynchronous function does not block the current thread of execution from proceeding on to the next function.
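In code, the difference from the caller's point of view looks like this sketch (the queue label is arbitrary):
import Foundation

let queue = DispatchQueue(label: "demo")

queue.sync { print("sync: finishes before the next line runs") }
print("printed only after the sync block returned")

queue.async { print("async: runs whenever the queue gets to it") }
print("printed without waiting for the async block")

Thread.sleep(forTimeInterval: 0.1)   // keep a script alive long enough to see the async print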
@sankalap, DispatchQueue.main is a serial queue, which has a single thread to execute all of its operations. If we call "sync" on this queue, it blocks whatever is currently running on that thread and tries to execute the code block inside sync. This results in a "deadlock".
As per the Apple documentation, executing dispatch_sync on a queue you're currently on will crash your code:
Calling this function and targeting the current queue results in deadlock.
Because the current queue is the main queue, when you then call sync on the main queue, the system understands that the current (main) queue must wait for some code to complete on that same queue; but that code can never start, because the main queue is already waiting, so you wait forever:
Apple documentation: Calling this function and targeting the current queue results in deadlock.
This code is blocking on the call to dispatch_sync. I'm new to dispatch queues. Any reason why this would block?
NSLog(#"%#",dispatch_get_current_queue());
NSLog(#"%#",dispatch_get_main_queue());
if (dispatch_get_current_queue() == dispatch_get_main_queue())
{
block();
}
else
dispatch_sync(dispatch_get_main_queue(),block);
The logs print these queues
<OS_dispatch_queue_root: com.apple.root.low-priority[0x345bbc0]>
<OS_dispatch_queue: com.apple.main-thread[0x345b900]>
Comparing the current queue against the main queue is not a valid way to check whether you are running on the main thread.
From the GCD docs - http://developer.apple.com/library/mac/#DOCUMENTATION/Darwin/Reference/ManPages/man3/dispatch_get_main_queue.3.html
The result of dispatch_get_main_queue() may or may not equal the result of dispatch_get_current_queue() when called on the main thread. Comparing the two is not a valid way to test whether code is executing on the main thread. Foundation/AppKit programs should use [NSThread isMainThread]. POSIX programs may use pthread_main_np(3).
dispatch_get_current_queue() is deprecated and it was meant for debugging only, so you shouldn't use it in production code.
There's no need to check whether this queue is the main queue; just dispatch_sync the block to the main queue and it'll get there.
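If you do still want a helper that runs a block on the main thread, a Swift sketch of the check the documentation recommends ([NSThread isMainThread] / Thread.isMainThread) could look like this; the helper name is my own, not an API:
import Foundation

func runOnMainThread(_ block: () -> Void) {
    if Thread.isMainThread {
        block()                                   // already on the main thread: just run it
    } else {
        DispatchQueue.main.sync(execute: block)   // hop to the main thread and wait
    }
}
Checking the thread rather than the queue also avoids the deadlock you would get from dispatching sync to the main queue while already running on the main thread.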