I need to create a serial dispatch queue.
I'm using an AudioUnit to record from the microphone, and then I encode the frames on the created queue.
I have a single encoder object, so I need the same thread to always access it.
However, when I create my queue:
queue = dispatch_queue_create("com.myapp.queue", DISPATCH_QUEUE_SERIAL);
If I dispatch to this queue with:
NSLog(#"recording callback Thread Info: %#", [NSThread currentThread]);
dispatch_async(queue, ^{
[self processAudio];
});
- (void) processAudio
{
NSLog(#"processAudio Thread Info: %#", [NSThread currentThread]);
...
}
It mostly uses the same thread, until at some point I get this:
recording callback Thread Info: <NSThread: 0x7fb0ab331b90>{number = 12, name = (null)}
processAudio Thread Info: <NSThread: 0x7fb0ab210140>{number = 7, name = (null)}
recording callback Thread Info: <NSThread: 0x7fb0ab331b90>{number = 12, name = (null)}
processAudio Thread Info: <NSThread: 0x7fb0a8e33950>{number = 8, name = (null)}
recording callback Thread Info: <NSThread: 0x7fb0ab331b90>{number = 12, name = (null)}
processAudio Thread Info: <NSThread: 0x7fb0ab212bf0>{number = 13, name = (null)}
The dispatch_async switches threads, then sticks with thread number 13 for a while, until it switches again.
Is this normal behavior even though I specified that I wanted a SERIAL queue?
Should I be worried about the thread switching when only one instance of an object is used, or is the work really serialized?
This is expected behavior and you should not worry about it. From Apple's docs (https://developer.apple.com/library/ios/documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html):
Serial queues (also known as private dispatch queues) execute one task at a time in the order in which they are added to the queue. The currently executing task runs on a distinct thread (which can vary from task to task) that is managed by the dispatch queue.
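In other words, the thread backing a serial queue may change between blocks, but the blocks themselves never overlap and run in FIFO order, so a single encoder object touched only from that queue is safe. A minimal Swift sketch of that guarantee (the queue label and counter are illustrative, not from the question):

import Foundation

let serialQueue = DispatchQueue(label: "com.example.encoder")  // hypothetical label
var frameCount = 0  // only ever mutated from serialQueue, so no lock is needed

for i in 0..<5 {
    serialQueue.async {
        // Blocks run one at a time, in the order submitted,
        // even though the worker thread may differ from block to block.
        frameCount += 1
        print("task \(i) on \(Thread.current), frameCount = \(frameCount)")
    }
}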
Which thread is blocked by Swift's sleep: method?
let customConcurrentQueue = DispatchQueue(label: "CustomConcurrentQueue", attributes: .concurrent)
customConcurrentQueue.async {
    sleep(5)
    print("1")
}
print("2")
Will the sleep method block the main thread too?
No, the main thread will not be blocked.
It is because you are dispatching the block asynchronously onto customConcurrentQueue, which allows the main thread to continue running immediately. If you used sync instead, the main thread would wait until the block finished running.
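For contrast, a small sketch of the same code using sync (queue label reused from the question); the caller then blocks:

import Foundation

let customConcurrentQueue = DispatchQueue(label: "CustomConcurrentQueue", attributes: .concurrent)

customConcurrentQueue.sync {
    // With sync, the caller (here, the main thread) blocks until this closure returns.
    sleep(5)
    print("1")
}
print("2")  // printed only after the 5-second sleep; with async it prints immediately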
I have come across a very interesting problem related to queue deadlock in iOS. Is there any way to avoid this?
Consider this:
Create a custom serial queue.
Dispatch some task (#1) on this serial queue asynchronously.
This async task (#1) then dispatches some task (#2) onto the main queue synchronously.
The main queue dispatches some task (#3) onto the serial queue synchronously.
Result: deadlock.
Below is the sample code for this.
Since self.opQueue is a serial queue, task#3 will not start until task#1 completes.
Since task#1 calls the main queue synchronously, it will never complete until the main queue completes task#2.
Since the main queue is waiting for opQueue to finish task#3, and opQueue is waiting for the main queue to finish task#2, there is a deadlock.
#import "ViewController.h"
#interface ViewController ()
#property(nonatomic,strong) dispatch_queue_t opQueue;
#end
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
dispatch_queue_t queue = dispatch_queue_create("com.tarun.sqqueue",
DISPATCH_QUEUE_SERIAL);
self.opQueue = queue;
[self performOperations];
}
/// 1. Dispatch on serial queue async.
/// 2. This async task on serial queue dispatchs some task onto
/// main queue sync.
/// 3. Main queue dispatched some task onto serial queue sync.
/// 4. Result - DeadLock
- (void)performOperations {
/// task#1: Dispatch task on the serial Queue Asynchronously.
/// So this is not blocking.
dispatch_async(self.opQueue, ^{
for (int i = 1; i<=100; i++) {
NSLog(#"%d. Async on Serial Queue from Main Queue.",i);
}
/// task#2: Dispatching task on main queue synchronously from serial
/// queue.So this queue will wait till main queue executes this task.(Blocking)
dispatch_sync(dispatch_get_main_queue(), ^{
for (int i = 1; i<=100; i++) {
NSLog(#"%d. Sync on main queue from Serial Queue.",i);
}
});
});
/// task#3: Dispatching task on swrial queue synchronously from main
/// queue.So main queue will wait till serial queue executes this task. (Blocking)
dispatch_sync(self.opQueue, ^{
for (int i = 1; i<=100; i++) {
NSLog(#"%d. Sync on Serial Queue From Main Queue.",i);
}
});
/// Since self.opQueue is a serial queue, task#3 will not start till task#1 completes.
/// Since task#1 is calling main queue sync,
/// so it will never complete till main queue completes task#2.
/// Since main queue is waiting for opQueue to finish task#3, and opQueue is waiting for main queue to finish task#2 there is a deadlock.
NSLog(#"Back to main queue");
}
#end
From Apple
Important: You should never call the dispatch_sync or dispatch_sync_f
function from a task that is executing in the same queue that you are
planning to pass to the function. This is particularly important for
serial queues, which are guaranteed to deadlock, but should also be
avoided for concurrent queues.
It's a guaranteed deadlock. The only way to avoid it is to not implement it this way.
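One way out, as a sketch rather than the only fix: break one of the two synchronous hops. For example, if task#2 is dispatched back to the main queue asynchronously, the serial queue never blocks on the main queue and task#3's sync call can complete (queue label is illustrative):

import Foundation

let opQueue = DispatchQueue(label: "com.example.sqqueue")  // serial queue

// task#1: async on the serial queue, non-blocking for the caller.
opQueue.async {
    print("task#1 on the serial queue")
    // task#2: hop back to the main queue *asynchronously*, so the serial
    // queue does not sit blocked waiting for the main queue.
    DispatchQueue.main.async {
        print("task#2 on the main queue")
    }
}

// task#3: sync from the main thread is now fine, because task#1 finishes
// without ever waiting on the main queue.
opQueue.sync {
    print("task#3 on the serial queue")
}
print("back to main queue")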
My problem is: why does the following code print out the same priority for every queue? Why is this? Thank you very much.
print("main:\(Thread.current)")
print("thread main priorities:\(Thread.current.threadPriority)")
DispatchQueue.global(qos: .utility).async {
    print("utility:\(Thread.current)")
    print("thread utility priorities:\(Thread.current.threadPriority)")
}
DispatchQueue.global(qos: .background).async {
    print("background:\(Thread.current)")
    print("thread background priorities:\(Thread.current.threadPriority)")
}
DispatchQueue.global(qos: .userInteractive).async {
    print("userInteractive:\(Thread.current)")
    print("thread userInteractive priorities:\(Thread.current.threadPriority)")
}
DispatchQueue.global(qos: .userInitiated).async {
    print("userInitiated:\(Thread.current)")
    print("thread userInitiated priorities:\(Thread.current.threadPriority)")
}
Log:
main:{number = 1, name = main}
thread main priorities:0.5
userInteractive:{number = 3, name = (null)}
utility:{number = 5, name = (null)}
background:{number = 6, name = (null)}
thread userInteractive priorities:0.5
userInitiated:{number = 4, name = (null)}
thread utility priorities:0.5
thread background priorities:0.5
thread userInitiated priorities:0.5
As you can see in the docs:
@available(iOS 4.0, *)
open var threadPriority: Double // To be deprecated; use qualityOfService below
threadPriority is deprecated. Also, Apple's GCD team has been very clear on the thread-versus-queue topic: GCD will manage the threads for you, and a queue priority is not a thread priority. Take a look at this example, which is pretty much the code you posted here; I just added "qos_class_self().rawValue", which is how you should really inspect the QoS, rather than "threadPriority".
print("main:\(Thread.current), Thread main priorities:\(Thread.current.threadPriority), QoS: \(qos_class_self().rawValue)")
DispatchQueue.global(qos: .utility).async {
    print("utility:\(Thread.current), Thread utility priorities:\(Thread.current.threadPriority), QoS: \(qos_class_self().rawValue)")
}
DispatchQueue.global(qos: .background).async {
    print("background:\(Thread.current), Thread background priorities:\(Thread.current.threadPriority), QoS: \(qos_class_self().rawValue)")
}
DispatchQueue.global(qos: .userInteractive).async {
    print("userInteractive:\(Thread.current), Thread userInteractive priorities:\(Thread.current.threadPriority), QoS: \(qos_class_self().rawValue)")
}
DispatchQueue.global(qos: .userInitiated).async {
    print("userInitiated:\(Thread.current), Thread userInitiated priorities:\(Thread.current.threadPriority), QoS: \(qos_class_self().rawValue)")
}
I also combined the prints per closure so they don't get scrambled in different lines in the log. This shows you that QoS is maintained, whereas threadPriority is deprecated.
main:<NSThread: 0x6100000762c0>{number = 1, name = main}, Thread main priorities:0.5, QoS: 33
userInteractive:<NSThread: 0x610000261fc0>{number = 4, name = (null)}, Thread userInteractive priorities:0.5, QoS: 33
utility:<NSThread: 0x618000079100>{number = 3, name = (null)}, Thread utility priorities:0.5, QoS: 17
userInitiated:<NSThread: 0x608000078380>{number = 5, name = (null)}, Thread userInitiated priorities:0.5, QoS: 25
background:<NSThread: 0x610000262000>{number = 6, name = (null)}, Thread background priorities:0.5, QoS: 9
So, in the end, the takeaway is not to think in terms of threads, but to think in terms of queues and their QoS, knowing that GCD will manage the threads for you. It is a very different concept from what we were used to before queues came around with GCD.
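If the raw values (33, 25, 17, 9) are hard to read, here is a small sketch of mapping them back to names with DispatchQoS.QoSClass; the helper name is my own, not from the answer:

import Foundation

// Map the current thread's QoS class to a readable name for logging.
func currentQoSName() -> String {
    guard let qos = DispatchQoS.QoSClass(rawValue: qos_class_self()) else { return "unknown" }
    switch qos {
    case .userInteractive: return "userInteractive"
    case .userInitiated:   return "userInitiated"
    case .default:         return "default"
    case .utility:         return "utility"
    case .background:      return "background"
    case .unspecified:     return "unspecified"
    @unknown default:      return "unknown"
    }
}

DispatchQueue.global(qos: .utility).async {
    print("utility queue is running at QoS: \(currentQoSName())")
}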
The Apple documentation says (Concurrency Programming Guide, page 49):
Important: You should never call the dispatch_sync or dispatch_sync_f function from a task that is executing in the same queue that you are planning to pass to the function. This is particularly important for serial queues, which are guaranteed to deadlock, but should also be avoided for concurrent queues.
But the code here does not cause a deadlock; I have run it many times:
dispatch_queue_t concurrentQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
dispatch_async(concurrentQueue, ^(){
    NSLog(@"in outer queue: %@", [NSThread currentThread]);
    dispatch_sync(concurrentQueue, ^(){
        NSLog(@"do something thread: %@", [NSThread currentThread]);
    });
});
Yet, as we all know, if we execute the code below in the main-thread context, it will cause a deadlock on the main thread. So I am confused: why does calling dispatch_sync on the same thread not deadlock in one case (the code above), but deadlock in the other (the code below)?
dispatch_sync(dispatch_get_main_queue(), ^{
    NSLog(@"________update__UI");
});
dispatch_get_global_queue() returns a system-defined global concurrent queue.
A serial dispatch queue (the main queue, and user-created queues with the default flag) uses just one thread at a time. A concurrent dispatch queue (the global queues, and queues created with the concurrent flag) uses multiple threads, i.e. a thread pool. The number of threads varies with the system and the situation.
Take a look at the following code.
dispatch_async(queue, ^(){
    /* Task 1 */
    dispatch_sync(queue, ^(){
        /* Task 2 */
    });
});
Task 1 and Task 2 should be executed in the same order as they were queued. Thus, Task 1 is executed, and then Task 2.
On a serial dispatch queue, dispatch_sync has to wait to execute Task 2 on the single thread that is currently executing Task 1, but Task 1 cannot finish because it is blocked inside dispatch_sync. DEADLOCK.
On a concurrent dispatch queue, dispatch_sync usually doesn't need to wait to execute Task 2 on a thread in the thread pool. But the number of threads in the thread pool is not actually unlimited, so sometimes dispatch_sync has to wait until some other task finishes. That's why it "should also be avoided for concurrent queues". dispatch_sync is also highly optimized: in some situations it uses the same thread as Task 1 for Task 2.
EDITED
Thus, in that case, dispatch_sync-ing a block behaves exactly the same as an ordinary block (function) call, and a deadlock never happens.
EDITED
Test code.
#import <Foundation/Foundation.h>

void task2()
{
    NSLog(@"task2: %@", [NSThread currentThread]);
}

void task1(dispatch_queue_t q)
{
    NSLog(@"task1: %@", [NSThread currentThread]);
    dispatch_sync(q, ^{
        task2();
    });
}

int main()
{
    dispatch_queue_t q = dispatch_get_global_queue(0, 0);
    dispatch_async(q, ^{
        task1(q);
    });
    dispatch_main();
    return 0;
}
lldb log
(lldb) breakpoint set -l 6
(lldb) run
task1: <NSThread: 0x1001155a0>{number = 2, name = (null)}
task2: <NSThread: 0x1001155a0>{number = 2, name = (null)}
Process stopped
(lldb) bt
* thread #2: tid = 0x4dbcc, 0x0000000100000d34 a.out`task2 + 4 at a.m:5, queue = 'com.apple.root.default-qos', stop reason = breakpoint 1.1
* frame #0: 0x0000000100000d34 a.out`task2 + 4 at a.m:5
frame #1: 0x0000000100000dc5 a.out`__task1_block_invoke(.block_descriptor=<unavailable>) + 21 at a.m:12
frame #2: 0x00007fff8d6d6c13 libdispatch.dylib`_dispatch_client_callout + 8
frame #3: 0x00007fff8d6e19a1 libdispatch.dylib`_dispatch_sync_f_invoke + 39
frame #4: 0x0000000100000da3 a.out`task1(q=0x00007fff79749b40) + 67 at a.m:11
The task1 function calls the task2 function via the libdispatch APIs, but it is almost the same as an ordinary function call.
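The same experiment sketched in Swift (behavior can vary between OS versions, so treat the "same thread" outcome as typical rather than guaranteed):

import Foundation

let q = DispatchQueue.global()
q.async {
    let outer = Thread.current
    q.sync {
        // On a concurrent global queue, libdispatch may run this block
        // inline on the calling thread instead of waiting for another one.
        print("same thread as outer block? \(Thread.current === outer)")
    }
}
dispatchMain()  // keep a command-line process alive, like dispatch_main() above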
I have this below in my iOS app.
I am learning GCD, so I am trying out simple things.
The output of this is confusing me.
Why does the "2." set of statements always come first, and then the "1." set?
Even though I am dispatching both tasks to GCD, I dispatch the "1." set first. Neither task is big enough for the "1." and "2." sets to overlap in time; each is just a simple task that prints which thread it is running on.
I have run it several times, expecting it to give different results, as usually happens in a threaded environment.
Please explain.
2. Crnt Thread = <NSThread: 0x10920fee0>{name = (null), num = 1}
2. Main thread = <NSThread: 0x10920fee0>{name = (null), num = 1}
1. Crnt Thread = <NSThread: 0x10920fee0>{name = (null), num = 1}
1. Main thread = <NSThread: 0x10920fee0>{name = (null), num = 1}
3. Crnt Thread = <NSThread: 0x10920fee0>{name = (null), num = 1}
3. Main thread = <NSThread: 0x10920fee0>{name = (null), num = 1}
Code here:
void displayAlertView(void *paramContext)
{
    NSLog(@"3. Crnt Thread = %@", [NSThread currentThread]);
    NSLog(@"3. Main thread = %@", [NSThread mainThread]);
}

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Override point for customization after application launch.
    dispatch_queue_t myQueue = dispatch_get_main_queue();
    AlertViewData *contextData = (AlertViewData *)malloc(sizeof(AlertViewData));
    dispatch_async(myQueue, ^(void){
        NSLog(@"1. Crnt Thread = %@", [NSThread currentThread]);
        NSLog(@"1. Main thread = %@", [NSThread mainThread]);
    });
    if (contextData != NULL)
    {
        NSLog(@"2. Crnt Thread = %@", [NSThread currentThread]);
        NSLog(@"2. Main thread = %@", [NSThread mainThread]);
        dispatch_async_f(myQueue, contextData, displayAlertView);
    }
    return YES;
}
The "2" statements come first because that code is getting executed before the asynchronous block has had a chance to be setup and run. That's the whole point of dispatch_async. Such code gets run on another thread while the current thread continues on its merry way.
If you updated both blocks of code to use a loop that logs 100 logs statements, then you would probably see some mixing of "1" and "2" statements.
But with just the two logs, they happen so fast, the "2" logs complete before the block with the "1" logs has had a chance to kick in. Look at the timestamps in the log to see.
UPDATE
The above was written under the assumption that myQueue was a background queue. As Martin pointed out, it's the main queue. Since it is the main queue, the answer is quite a bit different.
Since you are doing asynchronous calls on the main queue, everything is done on the same main thread. Each call to dispatch_async is like adding it to the end of the line.
The currently running code is at the head of the line. When you call dispatch_async for the block with the "1" logs, that block is added to the end of the line and will be run when the current code is done. Then you call dispatch_async_f for the "3" logs. Those get added to the end of the line (after the "1" logs).
So once the current run loop iteration completes (and the didFinishLaunchingWithOptions method returns), the next bit in line is run. Those are your "1" logs. When that is done, the next block in the queue is run (your "3" logs).
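A short Swift sketch of that ordering (run from the main thread, e.g. inside application(_:didFinishLaunchingWithOptions:)); the numbers mirror the question's logs:

import Foundation

// Each async block is appended to the end of the main queue and only runs
// after the currently executing code returns, in the order it was enqueued.
DispatchQueue.main.async { print("1.") }  // queued first
print("2.")                               // runs immediately, as part of the current code
DispatchQueue.main.async { print("3.") }  // queued second
// Once the current run loop pass finishes, the output order is: 2., 1., 3.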