How to create a dispatch queue in Swift 3 - ios

In Swift 2, I was able to create a queue with the following code:
let concurrentQueue = dispatch_queue_create("com.swift3.imageQueue", DISPATCH_QUEUE_CONCURRENT)
But this doesn't compile in Swift 3.
What is the preferred way to write this in Swift 3?

Creating a concurrent queue
let concurrentQueue = DispatchQueue(label: "queuename", attributes: .concurrent)
concurrentQueue.sync {
}
Create a serial queue
let serialQueue = DispatchQueue(label: "queuename")
serialQueue.sync {
}
Get main queue asynchronously
DispatchQueue.main.async {
}
Get main queue synchronously
DispatchQueue.main.sync {
}
To get one of the global background queues
DispatchQueue.global(qos: .background).async {
}
Xcode 8.2 beta 2:
To get a global queue with the default quality of service
DispatchQueue.global(qos: .default).async {
}
DispatchQueue.global().async {
// The default qos value is DispatchQoS.QoSClass.default
}
If you want to learn more about using these queues, see this answer.

Compiles under >=Swift 3. This example contains most of the syntax that we need.
QoS - new quality of service syntax
weak self - to break retain cycles
if self is not available, do nothing
async global utility queue - for the network query; the call does not wait for the result. It is a concurrent queue, so the block (usually) does not wait when started. An exception for a concurrent queue can occur when its task limit has been reached; the queue then temporarily behaves like a serial queue and waits until some previous task in that queue completes.
async main queue - for touching the UI; the block does not wait for the result, but it does wait for its slot at the start. The main queue is a serial queue.
Of course, you need to add some error checking to this...
DispatchQueue.global(qos: .utility).async { [weak self] () -> Void in
    guard let strongSelf = self else { return }
    strongSelf.flickrPhoto.loadLargeImage { loadedFlickrPhoto, error in
        if let error = error {
            print("error: \(error)")
        } else {
            DispatchQueue.main.async { () -> Void in
                activityIndicator.removeFromSuperview()
                strongSelf.imageView.image = strongSelf.flickrPhoto.largeImage
            }
        }
    }
}

Compiled in Xcode 8, Swift 3
https://github.com/rpthomas/Jedisware
@IBAction func tap(_ sender: AnyObject) {
    let thisEmail = "emailaddress.com"
    let thisPassword = "myPassword"

    DispatchQueue.global(qos: .background).async {
        // Validate user input
        let result = self.validate(thisEmail, password: thisPassword)

        // Go back to the main thread to update the UI
        DispatchQueue.main.async {
            if !result {
                self.displayFailureAlert()
            }
        }
    }
}

Since the OP's question has already been answered above, I just want to add some speed considerations:
It makes a lot of difference what priority class you assign to your async function in DispatchQueue.global.
I don't recommend running tasks with the .background thread priority, especially on the iPhone X, where the task seems to be allocated to the low-power cores.
Here is some real data from a computationally intensive function that reads from an XML file (with buffering) and performs data interpolation:
Device name / .background / .utility / .default / .userInitiated / .userInteractive
iPhone X: 18.7s / 6.3s / 1.8s / 1.8s / 1.8s
iPhone 7: 4.6s / 3.1s / 3.0s / 2.8s / 2.6s
iPhone 5s: 7.3s / 6.1s / 4.0s / 4.0s / 3.8s
Note that the data set is not the same for all devices. It's the biggest on the iPhone X and the smallest on the iPhone 5s.
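For reference, a rough sketch of how such timings could be taken (work() here is a hypothetical stand-in for the XML parsing and interpolation, not from the original code):
import Foundation

// Hypothetical stand-in for the computationally intensive function.
func work() { /* parse XML, interpolate data ... */ }

func measure(qos: DispatchQoS.QoSClass, completion: @escaping (CFAbsoluteTime) -> Void) {
    DispatchQueue.global(qos: qos).async {
        let start = CFAbsoluteTimeGetCurrent()
        work()
        completion(CFAbsoluteTimeGetCurrent() - start)
    }
}

// Measure one QoS class per run so the tasks don't compete with each other.
measure(qos: .utility) { elapsed in
    print("utility: \(elapsed) s")
}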

Update for Swift 5
Serial Queue
let serialQueue = DispatchQueue.init(label: "serialQueue")
serialQueue.async {
// code to execute
}
Concurrent Queue
let concurrentQueue = DispatchQueue.init(label: "concurrentQueue", qos: .background, attributes: .concurrent, autoreleaseFrequency: .inherit, target: nil)
concurrentQueue.async {
// code to execute
}
From Apple documentation:
Parameters
label
A string label to attach to the queue to uniquely identify it in debugging tools such as Instruments, sample, stackshots, and crash reports. Because applications, libraries, and frameworks can all create their own dispatch queues, a reverse-DNS naming style (com.example.myqueue) is recommended. This parameter is optional and can be NULL.
qos
The quality-of-service level to associate with the queue. This value determines the priority at which the system schedules tasks for execution. For a list of possible values, see DispatchQoS.QoSClass.
attributes
The attributes to associate with the queue. Include the concurrent attribute to create a dispatch queue that executes tasks concurrently. If you omit that attribute, the dispatch queue executes tasks serially.
autoreleaseFrequency
The frequency with which to autorelease objects created by the blocks that the queue schedules. For a list of possible values, see DispatchQueue.AutoreleaseFrequency.
target
The target queue on which to execute blocks. Specify DISPATCH_TARGET_QUEUE_DEFAULT if you want the system to provide a queue that is appropriate for the current object.
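To illustrate the target parameter, a small sketch (the queue labels are made up): blocks submitted to the child queue are executed by its serial target queue, so the two queues never run work concurrently with each other.
let targetQueue = DispatchQueue(label: "com.example.target")   // serial
let childQueue = DispatchQueue(label: "com.example.child", target: targetQueue)

childQueue.async {
    // Runs on the target queue, serialised with any work
    // submitted directly to targetQueue.
}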

I did this, and it is especially important if you want to refresh your UI to show new data without the user noticing, as in a UITableView or UIPickerView.
DispatchQueue.main.async
{
/*Write your thread code here*/
}

DispatchQueue.main.async {
self.collectionView?.reloadData() // Depends if you were populating a collection view or table view
}
OperationQueue.main.addOperation {
self.lblGenre.text = self.movGenre
}
// Use OperationQueue if you need to populate the objects (labels, image views, text views) on your view controller

let concurrentQueue = dispatch_queue_create("com.swift3.imageQueue", DISPATCH_QUEUE_CONCURRENT) //Swift 2 version
let concurrentQueue = DispatchQueue(label:"com.swift3.imageQueue", attributes: .concurrent) //Swift 3 version
I re-worked your code in Xcode 8, Swift 3 and the changes are marked in contrast to your Swift 2 version.

Swift 3
If you call some closure in Swift code and then make any kind of change that belongs to a view (in the storyboard or elsewhere) from it, your application will crash,
but if you use the dispatch method to route that work to the main queue, your application will not crash.
async method
DispatchQueue.main.async
{
//Write code here
}
sync method
DispatchQueue.main.sync
{
//Write code here
}

DispatchQueue.main.async(execute: {
// write code
})
Serial Queue :
let serial = DispatchQueue(label: "Queuename")
serial.sync {
//Code Here
}
Concurrent queue :
let concurrent = DispatchQueue(label: "Queuename", attributes: .concurrent)
concurrent.sync {
//Code Here
}

For Swift 3
DispatchQueue.main.async {
// Write your code here
}

let newQueue = DispatchQueue(label: "newname")
newQueue.sync {
// your code
}

It is now simply:
let serialQueue = DispatchQueue(label: "my serial queue")
The default is serial; to get a concurrent queue, pass the optional attributes argument .concurrent.
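For example, the concurrent counterpart of the same call:
let concurrentQueue = DispatchQueue(label: "my concurrent queue", attributes: .concurrent)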

DispatchQueue.main.async(execute: {
// code
})

You can create a dispatch queue using this code in Swift 3.0:
DispatchQueue.main.async
{
/*Write your code here*/
}
/* or */
let delayTime = DispatchTime.now() + 0.5 // 0.5 seconds from now
DispatchQueue.main.asyncAfter(deadline: delayTime)
{
/*Write your code here*/
}

Related

DispatchSemaphore + DispatchQueue not working as expected?

I'm trying to improve the time it takes for a task to finish by leveraging multithreading/parallelism.
I'm reading CMSampleBuffers from a video track, manipulating them, then storing them back to an array for later use. Because each manipulation is "costly" in terms of RAM and CPU, I'm using DispatchSemaphore to limit the number of parallel tasks (5 in the example).
Basically, I'm trying to make the "system" process more than a single frame at a time, but no more than 5, so the device won't crash due to memory issues. In the current implementation below, for some reason it processes them almost serially and not in parallel.
Any help will be highly appreciated!
For the implementation, I tried taking a reference from here: How to limit gcd queue buffer size.
Code:
class MyService {
    let semaphore = DispatchSemaphore(value: 5)
    let processQueue = DispatchQueue(label: "custom.process", attributes: .concurrent)

    func startReading() {
        for sampleBuffer in sampleBuffers {
            // wait on the semaphore
            semaphore.wait()
            // dispatch onto the concurrent queue
            processQueue.async {
                // run the task
                self.process(buffer: sampleBuffer) { pixelBuffer in
                    // signal the semaphore
                    self.semaphore.signal()
                }
            }
        }
    }

    func process(buffer: CMSampleBuffer, completion: @escaping (CVPixelBuffer) -> Void) {
        // run on a background thread to avoid UI freeze
        DispatchQueue.global(qos: .userInteractive).async {
            // Do something
            // Do something
            // Do something
            // Do something
            completion(processedBuffer)
        }
    }
}

When to use Semaphore instead of Dispatch Group?

I assume that I know how to work with DispatchGroup; to understand the issue, I've tried:
class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        performUsingGroup()
    }

    func performUsingGroup() {
        let dq1 = DispatchQueue.global(qos: .userInitiated)
        let dq2 = DispatchQueue.global(qos: .userInitiated)
        let group = DispatchGroup()

        group.enter()
        dq1.async {
            for i in 1...3 {
                print("\(#function) DispatchQueue 1: \(i)")
            }
            group.leave()
        }
        group.wait()

        dq2.async {
            for i in 1...3 {
                print("\(#function) DispatchQueue 2: \(i)")
            }
        }

        group.notify(queue: DispatchQueue.main) {
            print("done by group")
        }
    }
}
and the result -as expected- is:
performUsingGroup() DispatchQueue 1: 1
performUsingGroup() DispatchQueue 1: 2
performUsingGroup() DispatchQueue 1: 3
performUsingGroup() DispatchQueue 2: 1
performUsingGroup() DispatchQueue 2: 2
performUsingGroup() DispatchQueue 2: 3
done by group
For using the Semaphore, I implemented:
func performUsingSemaphore() {
    let dq1 = DispatchQueue.global(qos: .userInitiated)
    let dq2 = DispatchQueue.global(qos: .userInitiated)
    let semaphore = DispatchSemaphore(value: 1)

    dq1.async {
        semaphore.wait()
        for i in 1...3 {
            print("\(#function) DispatchQueue 1: \(i)")
        }
        semaphore.signal()
    }

    dq2.async {
        semaphore.wait()
        for i in 1...3 {
            print("\(#function) DispatchQueue 2: \(i)")
        }
        semaphore.signal()
    }
}
and called it in the viewDidLoad method. The result is:
performUsingSemaphore() DispatchQueue 1: 1
performUsingSemaphore() DispatchQueue 1: 2
performUsingSemaphore() DispatchQueue 1: 3
performUsingSemaphore() DispatchQueue 2: 1
performUsingSemaphore() DispatchQueue 2: 2
performUsingSemaphore() DispatchQueue 2: 3
Conceptually, both DispatchGroup and DispatchSemaphore serve the same purpose (unless I misunderstand something).
Honestly, I am not sure when to use a semaphore, especially since working with DispatchGroup probably handles the issue.
What is the part that I am missing?
Conceptually, both DispatchGroup and Semaphore serve the same purpose (unless I misunderstand something).
The above is not exactly true. You can use a semaphore to do the same thing as a dispatch group but it is much more general.
Dispatch groups are used when you have a load of things you want to do that can all happen at once, but you need to wait for them all to finish before doing something else.
Semaphores can be used for the above but they are general purpose synchronisation objects and can be used for many other purposes too. The concept of a semaphore is not limited to Apple and can be found in many operating systems.
In general, a semaphore has a value, which is a non-negative integer, and two operations:
wait - If the value is not zero, decrement it; otherwise block until something signals the semaphore.
signal - If there are threads waiting, unblock one of them; otherwise increment the value.
Needless to say both operations have to be thread safe. In olden days, when you only had one CPU, you'd simply disable interrupts whilst manipulating the value and the queue of waiting threads. Nowadays, it is more complicated because of multiple CPU cores and on chip caches etc.
A semaphore can be used in any case where you have a resource that can be accessed by at most N threads at the same time. You set the semaphore's initial value to N and then the first N threads that wait on it are not blocked but the next thread has to wait until one of the first N threads has signaled the semaphore. The simplest case is N = 1. In that case, the semaphore behaves like a mutex lock.
A semaphore can be used to emulate a dispatch group. You start the semaphore at 0, start all the tasks - tracking how many you have started - and wait on the semaphore that number of times. Each task must signal the semaphore when it completes.
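As an illustration only, a minimal sketch of that emulation (the task count and the work are made up):
let semaphore = DispatchSemaphore(value: 0)
let taskCount = 10                         // you must track this yourself

for _ in 0..<taskCount {
    DispatchQueue.global(qos: .utility).async {
        // ... do the work ...
        semaphore.signal()                 // each task signals on completion
    }
}

// Wait once per task; this returns only after every task has signalled.
for _ in 0..<taskCount {
    semaphore.wait()
}
print("all tasks finished")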
However, there are some gotchas. For example, you need a separate count to know how many times to wait. If you want to be able to add more tasks to the group after you have started waiting, the count can only be updated in a mutex-protected block, and that may lead to problems with deadlocking. Also, I think the Dispatch implementation of semaphores might be vulnerable to priority inversion. Priority inversion occurs when a high priority thread waits for a resource that a low priority thread has grabbed. The high priority thread is blocked until the low priority thread releases the resource. If there is a medium priority thread running, this may never happen.
You can do pretty much anything with a semaphore that other, higher level synchronisation abstractions can do, but doing it correctly is often a tricky business. The higher level abstractions are (hopefully) carefully written, and you should use them in preference to a "roll your own" implementation with semaphores, if possible.
Semaphores and groups have, in a sense, opposite semantics. Both maintain a count. With a semaphore, a wait is allowed to proceed when the count is non-zero. With a group, a wait is allowed to proceed when the count is zero.
A semaphore is useful when you want to set a maximum on the number of threads operating on some shared resource at a time. One common use is when the maximum is 1 because the shared resource requires exclusive access.
A group is useful when you need to know when a bunch of tasks have all been completed.
Use a semaphore to limit the amount of concurrent work at a given time. Use a group to wait for any number of concurrent work to finish execution.
In case you wanted to submit three jobs per queue, it should be:
import Foundation

func performUsingGroup() {
    let dq1 = DispatchQueue(label: "q1", attributes: .concurrent)
    let dq2 = DispatchQueue(label: "q2", attributes: .concurrent)
    let group = DispatchGroup()

    for i in 1...3 {
        group.enter()
        dq1.async {
            print("\(#function) DispatchQueue 1: \(i)")
            group.leave()
        }
    }
    for i in 1...3 {
        group.enter()
        dq2.async {
            print("\(#function) DispatchQueue 2: \(i)")
            group.leave()
        }
    }

    group.notify(queue: DispatchQueue.main) {
        print("done by group")
    }
}

performUsingGroup()
RunLoop.current.run(mode: RunLoop.Mode.default, before: Date(timeIntervalSinceNow: 1))
and
import Foundation

func performUsingSemaphore() {
    let dq1 = DispatchQueue(label: "q1", attributes: .concurrent)
    let dq2 = DispatchQueue(label: "q2", attributes: .concurrent)
    let semaphore = DispatchSemaphore(value: 1)

    for i in 1...3 {
        dq1.async {
            _ = semaphore.wait(timeout: DispatchTime.distantFuture)
            print("\(#function) DispatchQueue 1: \(i)")
            semaphore.signal()
        }
    }
    for i in 1...3 {
        dq2.async {
            _ = semaphore.wait(timeout: DispatchTime.distantFuture)
            print("\(#function) DispatchQueue 2: \(i)")
            semaphore.signal()
        }
    }
}
performUsingSemaphore()
RunLoop.current.run(mode: RunLoop.Mode.default, before: Date(timeIntervalSinceNow: 1))
The replies above by Jano and Ken are correct regarding 1) the use of a semaphore to limit the amount of work happening at once and 2) the use of a dispatch group so that the group is notified when all the tasks in the group are done. For example, you may want to download a lot of images in parallel, but since you know that they are heavy images, you want to limit the downloads to two at a time, so you use a semaphore. You also want to be notified when all the downloads (say there are 50 of them) are done, so you use DispatchGroup. Thus, it is not a matter of choosing between the two. You may use one or both in the same implementation, depending on your goals. This type of example was provided in the Concurrency tutorial on Ray Wenderlich's site:
let group = DispatchGroup()
let queue = DispatchQueue.global(qos: .utility)
let semaphore = DispatchSemaphore(value: 2)

let base = "https://yourbaseurl.com/image-id-"
let ids = [0001, 0002, 0003, 0004, 0005, 0006, 0007, 0008, 0009, 0010, 0011, 0012]
var images: [UIImage] = []

for id in ids {
    guard let url = URL(string: "\(base)\(id)-jpeg.jpg") else { continue }
    semaphore.wait()
    group.enter()
    let task = URLSession.shared.dataTask(with: url) { data, _, error in
        defer {
            group.leave()
            semaphore.signal()
        }
        if error == nil,
            let data = data,
            let image = UIImage(data: data) {
            images.append(image)
        }
    }
    task.resume()
}
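To actually get the notification when all of the downloads have finished, a group.notify call can follow the loop (a sketch continuing the code above):
group.notify(queue: DispatchQueue.main) {
    // Every data task has called group.leave() by now.
    print("Downloaded \(images.count) images")
}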
One typical semaphore use case is a function that can be called simultaneously from different threads and uses a resource that should not be called from multiple threads at the same time:
func myFunction() {
    semaphore.wait()
    // access the shared resource
    semaphore.signal()
}
In this case you will be able to call myFunction from different threads, but they won't be able to reach the locked resource simultaneously. One of them will have to wait until the other one finishes its work.
A semaphore keeps a count, therefore you can actually allow for a given number of threads to enter your function at the same time.
Typical shared resource is the output to a file.
A semaphore is not the only way to solve such problems. You can also add the code to a serial queue, for example.
Semaphores are low level primitives and most likely they are used a lot under the hood in GCD.
Another typical example is the producer-consumer problem, where the signal and wait calls are actually part of two different functions. One which produces data and one which consumes them.
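As a rough sketch of that pattern (the buffer and item types here are made up), the producer signals after appending and the consumer waits before removing:
let itemsAvailable = DispatchSemaphore(value: 0)   // counts items that are ready
let bufferQueue = DispatchQueue(label: "buffer")   // serialises access to the shared buffer
var buffer: [Int] = []

func produce(_ item: Int) {
    bufferQueue.sync { buffer.append(item) }
    itemsAvailable.signal()                        // one more item can be consumed
}

func consume() -> Int {
    itemsAvailable.wait()                          // block until something has been produced
    return bufferQueue.sync { buffer.removeFirst() }
}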
Generally, a semaphore can be seen mainly as a way to solve the critical-section problem: locking a certain resource to achieve synchronisation. (Also, what happens if sleep() is invoked - can we achieve the same thing by using a semaphore?)
Dispatch groups are used when we have multiple groups of operations to carry out and we need to track them, set dependencies between them, or get a notification when a group of tasks finishes its execution.

Swift 3 Cannot convert value of type '()' to expected argument type '__OS_dispatch_queue_attr?' [duplicate]


Show thread label in Debug navigator or other tools

In my project I have a lot of background threads. I want to check that every thread works without crashes and disappears when needed, because my program uses 26% CPU. So I labeled every background thread:
let myQueue = DispatchQueue(label: "myQ", qos: .background, target: nil)
myQueue.async {
someFunc()
}
But in the Xcode Debug navigator I see unnamed threads:
EDIT
Hm, I found a way:
DispatchQueue.global(qos: .background).async {
    Thread.current.name = "my thread"
    somefunc()
}
But why then do we need the label in DispatchQueue?

GCD differences in Swift 3

I was studying Grand Central Dispatch when I noticed Swift 3 changed its syntax.
So, is this:
let queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)
dispatch_async(queue) { () -> Void in
    let img1 = Downloader.downloadImageWithURL(imageURLs[0])
    dispatch_async(dispatch_get_main_queue(), {
        self.imageView1.image = img1
    })
}
any different from this one?
DispatchQueue.global(qos: .default).async { [weak self] () -> Void in
    let img1 = Downloader.downloadImageWithURL(imageURLs[0])
    DispatchQueue.main.async { () -> Void in
        self?.imageView1.image = img1
    }
}
Should I create a variable to contain DispatchQueue.global(qos: .default).async?
Swift 3 brings many improvements to Grand Central Dispatch syntax and usage.
Previously, we would choose the dispatch method (sync vs async) and then the queue we wanted to dispatch our task to. The updated GCD reverses this order - we first choose the queue and then apply a dispatch method.
DispatchQueue.global(qos: .default).async {
    // Background thread
    DispatchQueue.main.async {
        // UI Updates
    }
}
Queue attributes
You will notice that queues now take attributes on init. These are a Swift OptionSet and can include queue options such as serial vs concurrent execution and memory and activity management options; the quality of service (.default, .userInteractive, .userInitiated, .utility and .background) is specified with the separate qos parameter.
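For example (a sketch; .initiallyInactive is one of the activity-management options), a queue can be created concurrent and inactive, then activated later:
let queue = DispatchQueue(label: "com.example.worker",
                          attributes: [.concurrent, .initiallyInactive])
queue.async {
    // queued, but nothing runs until the queue is activated
}
queue.activate()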
The quality of service replaces the old priority attributes that were deprecated in iOS 8. If you were used to priority queues, here’s how they map over to QoS cases:
* DISPATCH_QUEUE_PRIORITY_HIGH: .userInitiated
* DISPATCH_QUEUE_PRIORITY_DEFAULT: .default
* DISPATCH_QUEUE_PRIORITY_LOW: .utility
* DISPATCH_QUEUE_PRIORITY_BACKGROUND: .background
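For example, the old default-priority global queue from the question maps over like this:
// Swift 2
// let queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)

// Swift 3
let queue = DispatchQueue.global(qos: .default)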
Work items
Queues are not the only part of GCD to get a Swift OptionSet. There’s an updated Swift syntax for work items too:
let workItem = DispatchWorkItem(qos: .userInitiated, flags: .assignCurrentContext) {
    // Do stuff
}
queue.async(execute: workItem)
A work item can now declare a quality of service and/or flags on init. Both of these are optional and affect the execution of the work item.
dispatch_once
dispatch_once was very useful for initialisation code and other functions that were to be executed once and only once.
In Swift 3, dispatch_once is deprecated and should be replaced with either global or static variables and constants.
// Examples of dispatch_once replacements with global or static constants and variables.
// In all three, the initialiser is called only once.

// Static properties (useful for singletons).
class Object {
    static let sharedInstance = Object()
}

// Global constant.
let constant = Object()

// Global variable.
var variable: Object = {
    let variable = Object()
    variable.doSomething()
    return variable
}()
dispatch_assert
Also new in this year’s Apple OS releases are dispatch preconditions. These replace dispatch_assert and allow you to check whether or not you are on the expected thread before executing code. This is particularly useful for functions that update the UI and must be executed on the main queue. Here’s a simple example:
let queue = DispatchQueue.global(qos: .userInitiated)
let mainQueue = DispatchQueue.main

mainQueue.async {
    dispatchPrecondition(condition: .notOnQueue(mainQueue))
    // This code won't execute
}

queue.async {
    dispatchPrecondition(condition: .onQueue(queue))
    // This code will execute
}
Source: https://medium.com/swift-and-ios-writing/a-quick-look-at-gcd-and-swift-3-732bef6e1838#.7hdtfwxb4
Aside from not weakifying self in the first approach, both calls are equivalent.
Whether or not to create a variable is up to your (convenience) preferences and does not make a difference technically.
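If you do prefer a variable, a small sketch of keeping the queue in a property so it can be reused (the names are made up):
class ImageLoader {
    // Created once and reused, instead of writing DispatchQueue.global(qos: .default)
    // at every call site.
    private let downloadQueue = DispatchQueue.global(qos: .default)

    func load() {
        downloadQueue.async {
            // download work here
        }
    }
}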
