GCD differences in Swift 3 (iOS)

I was studying Grand Central Dispatch when I noticed Swift 3 changed its syntax.
So, is this:
let queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)
dispatch_async(queue) { () -> Void in
    let img1 = Downloader.downloadImageWithURL(imageURLs[0])
    dispatch_async(dispatch_get_main_queue(), {
        self.imageView1.image = img1
    })
}
any different from this one?
DispatchQueue.global(qos: .default).async { [weak self] () -> Void in
    let img1 = Downloader.downloadImageWithURL(imageURLs[0])
    DispatchQueue.main.async { () -> Void in
        self?.imageView1.image = img1
    }
}
Also, should I create a variable to hold DispatchQueue.global(qos: .default) instead of calling it inline?

Swift 3 brings many improvements to Grand Central Dispatch syntax and usage.
Previously, we would choose the dispatch method (sync vs async) and then the queue we wanted to dispatch our task to. The updated GCD reverses this order - we first choose the queue and then apply a dispatch method.
DispatchQueue.global(qos: .default).async {
    // Background thread
    DispatchQueue.main.async {
        // UI updates
    }
}
Queue attributes
You will notice that queues can now take attributes on init. These are a Swift OptionSet and include queue options such as serial vs. concurrent, plus memory and activity management options; in the released Swift 3 API the quality of service (.default, .userInteractive, .userInitiated, .utility and .background) is passed as a separate qos parameter.
The quality of service replaces the old priority constants that were deprecated in iOS 8. If you were used to priority queues, here is how they map over to QoS cases (a short sketch follows the mapping below):
* DISPATCH_QUEUE_PRIORITY_HIGH: .userInitiated
* DISPATCH_QUEUE_PRIORITY_DEFAULT: .default
* DISPATCH_QUEUE_PRIORITY_LOW: .utility
* DISPATCH_QUEUE_PRIORITY_BACKGROUND: .background
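As a minimal sketch of the released Swift 3 syntax (the labels and variable names are illustrative), QoS and attributes end up in different places:
let highPriority = DispatchQueue.global(qos: .userInitiated)  // was DISPATCH_QUEUE_PRIORITY_HIGH
let lowPriority = DispatchQueue.global(qos: .utility)         // was DISPATCH_QUEUE_PRIORITY_LOW

// For custom queues, qos and attributes are separate parameters:
let imageQueue = DispatchQueue(label: "com.example.imageQueue",
                               qos: .utility,
                               attributes: .concurrent)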
Work items
Queues are not the only part of GCD to get a Swift OptionSet. There’s an updated Swift syntax for work items too:
let workItem = DispatchWorkItem(qos: .userInitiated, flags: .assignCurrentContext) {
    // Do stuff
}
queue.async(execute: workItem)
A work item can now declare a quality of service and/or flags on init. Both are optional and affect how the work item executes.
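A work item can also be cancelled or observed on completion. Here is a hedged sketch (the queue label is illustrative) using the notify and cancel APIs:
let workQueue = DispatchQueue(label: "com.example.worker")
let item = DispatchWorkItem(qos: .utility) {
    // long-running work
    print("working")
}
item.notify(queue: DispatchQueue.main) {
    print("work item finished (or was cancelled before running)")
}
workQueue.async(execute: item)
// item.cancel() // sets isCancelled; an item that has already started must check the flag itself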
dispatch_once
dispatch_once was very useful for initialisation code and other functions that were to be executed once and only once.
In Swift 3, dispatch_once is deprecated and should be replaced with either global or static variables and constants.
// Examples of dispatch_once replacements with global or static constants and variables.
// In all three, the initialiser is called only once.

// Static property (useful for singletons).
class Object {
    static let sharedInstance = Object()
}

// Global constant.
let constant = Object()

// Global variable.
var variable: Object = {
    let variable = Object()
    variable.doSomething()
    return variable
}()
dispatch_assert
Also new in this year's Apple OS releases are dispatch preconditions. These replace dispatch_assert and allow you to check whether or not you are on the expected queue before executing code. This is particularly useful for functions that update the UI and must run on the main queue. Here's a simple example:
let queue = DispatchQueue.global(qos: .userInitiated)
let mainQueue = DispatchQueue.main

mainQueue.async {
    dispatchPrecondition(condition: .notOnQueue(mainQueue))
    // This code won't execute
}

queue.async {
    dispatchPrecondition(condition: .onQueue(queue))
    // This code will execute
}
Source: https://medium.com/swift-and-ios-writing/a-quick-look-at-gcd-and-swift-3-732bef6e1838#.7hdtfwxb4

Aside from the fact that the first approach does not weakify self, the two calls are equivalent.
Whether or not to create a variable for the queue is purely a matter of convenience and makes no technical difference, for example:
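A minimal, purely illustrative sketch of both styles:
// Calling the global queue inline…
DispatchQueue.global(qos: .default).async {
    // background work
}

// …or keeping it in a constant first; the behaviour is the same either way.
let backgroundQueue = DispatchQueue.global(qos: .default)
backgroundQueue.async {
    // background work
}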

Related

How to use gcd barrier in iOS?

I want to use a GCD barrier to implement a thread-safe store object, but it does not work correctly: the setter sometimes runs earlier than the getters. What's wrong with it?
https://gist.github.com/Terriermon/02c446d1238ad6ec1edb08b607b1bf05
class MutiReadSingleWriteObject<T> {
    let queue = DispatchQueue(label: "com.readwrite.concurrency", attributes: .concurrent)
    var _object: T?

    var object: T? {
        @available(*, unavailable)
        get {
            fatalError("You cannot read from this object.")
        }
        set {
            queue.async(flags: .barrier) {
                self._object = newValue
            }
        }
    }

    func getObject(_ closure: @escaping (T?) -> Void) {
        queue.async {
            closure(self._object)
        }
    }
}
func testMutiReadSingleWriteObject() {
    let store = MutiReadSingleWriteObject<Int>()
    let queue = DispatchQueue(label: "com.come.concurrency", attributes: .concurrent)

    for i in 0...100 {
        queue.async {
            store.getObject { obj in
                print("\(i) -- \(String(describing: obj))")
            }
        }
    }

    print("pre --- ")
    store.object = 1
    print("after ---")

    store.getObject { obj in
        print("finish result -- \(String(describing: obj))")
    }
}
Whenever you create a DispatchQueue, whether serial or concurrent, it gets its own thread (drawn from GCD's underlying thread pool) that it uses to schedule and run work items. This means that whenever you instantiate a MutiReadSingleWriteObject<T> object, its queue will have a dedicated thread for synchronizing your setter and getObject method.
However: this also means that in your testMutiReadSingleWriteObject method, the queue that you use to execute the 100 getObject calls in a loop has its own thread too. This means that the method has 3 separate threads to coordinate between:
The thread that testMutiReadSingleWriteObject is called on (likely the main thread),
The thread that store.queue maintains, and
The thread that queue maintains
These threads run their work in parallel, and this means that an async dispatch call like
queue.async {
store.getObject { ... }
}
will enqueue a work item to run on queue's thread at some point, and keep executing code on the current thread.
This means that by the time you get to running store.object = 1, you are guaranteed to have scheduled 100 work items on queue, but crucially, how and when those work items actually start executing is up to the queue, the CPU scheduler, and other environmental factors. While somewhat rare, this means there's a chance that none of those tasks has run before the assignment store.object = 1, in which case, by the time they do run, they'll see a value of 1 stored in the object.
In terms of ordering, you might see a combination of:
100 getObject calls, then store.object = 1
N getObject calls, then store.object = 1, then (100 - N) getObject calls
store.object = 1, then 100 getObject calls
Case (2) can actually prove the behavior you're looking to confirm: all of the calls before store.object = 1 should return nil, and all of the ones after should return 1. If you have a getObject call after the setter that returns nil, you'd know you have a problem. But, this is pretty much impossible to control the timing of.
In terms of how to address the timing issue here: for this method to be meaningful, you'll need to drop one thread to properly coordinate all of your calls to store, so that all accesses to it are on the same thread.
This can be done by either:
Dropping queue, and just accessing store on the thread that the method was called on. This does mean that you cannot call store.getObject asynchronously.
Making all calls through queue, whether sync or async. This gives you the opportunity to better control exactly how the store methods are called.
Either way, both of these approaches can have different semantics, so it's up to you to decide what you want this method to be testing. Do you want to be guaranteed that all 100 calls will go through before store.object = 1 is reached? If so, you can get rid of queue entirely, because you don't actually want those getters to be called asynchronously. Or, do you want to try to cause the getters and the setter to overlap in some way? Then stick with queue, but it'll be more difficult to ensure you get meaningful results, because you aren't guaranteed to have stable ordering with the concurrent calls.
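As a hedged sketch (not part of the original answer) of the first option, everything below runs through store from the calling thread, with a DispatchGroup used to wait for the asynchronous reads before and after the barrier write:
func testMutiReadSingleWriteObjectCoordinated() {
    let store = MutiReadSingleWriteObject<Int>()
    let group = DispatchGroup()

    // Schedule 100 reads; all should observe nil because the write hasn't been enqueued yet.
    for i in 0...100 {
        group.enter()
        store.getObject { obj in
            print("\(i) -- \(String(describing: obj))")
            group.leave()
        }
    }
    group.wait() // block the calling thread until every read has run

    store.object = 1 // barrier write

    group.enter()
    store.getObject { obj in
        print("finish result -- \(String(describing: obj))") // should print 1
        group.leave()
    }
    group.wait()
}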

How to get cancellation state for multiple DispatchWorkItems

Background
I'm implementing a search. Each search query results in one DispatchWorkItem which is then queued for execution. As the user can trigger a new search faster than the previous one can be completed, I'd like to cancel the previous one as soon as I receive a new one.
This is my current setup:
var currentSearchJob: DispatchWorkItem?
let searchJobQueue = DispatchQueue(label: QUEUE_KEY)

func updateSearchResults(for searchController: UISearchController) {
    let queryString = searchController.searchBar.text?.lowercased() ?? ""

    // if there is already an (older) search job running, cancel it
    currentSearchJob?.cancel()

    // create a new search job
    currentSearchJob = DispatchWorkItem() {
        self.filter(queryString: queryString)
    }

    // start the new job
    searchJobQueue.async(execute: currentSearchJob!)
}
Problem
I understand that dispatchWorkItem.cancel() doesn't kill the running task immediately. Instead, I need to check dispatchWorkItem.isCancelled manually. But how do I get the right DispatchWorkItem object in this case?
If I were setting currentSearchJob only once, I could simply access that attribute as done in that case. However, that isn't applicable here, because the attribute will be overridden before the filter() method has finished. How do I know which instance is actually running the code in which I want to check dispatchWorkItem.isCancelled?
Ideally, I'd like to provide the newly-created DispatchWorkItem as an additional parameter to the filter() method. But that's not possible, because I'll get a Variable used within its own initial value error.
I'm new to Swift, so I hope I'm just missing something. Any help is appreciated very much!
The trick is how to have a dispatched task check whether it has been canceled. I'd actually suggest considering an OperationQueue approach rather than using dispatch queues directly.
There are at least two approaches:
Most elegant, IMHO, is to just subclass Operation, passing whatever you want to it in the init method, and performing the work in the main method:
class SearchOperation: Operation {
    private var queryString: String

    init(queryString: String) {
        self.queryString = queryString
        super.init()
    }

    override func main() {
        // do something synchronous, periodically checking `isCancelled`
        // e.g., for illustrative purposes
        print("starting \(queryString)")

        for i in 0 ... 10 {
            if isCancelled { print("canceled \(queryString)"); return }
            print("    \(queryString): \(i)")
            heavyWork()
        }

        print("finished \(queryString)")
    }

    func heavyWork() {
        Thread.sleep(forTimeInterval: 0.5)
    }
}
Because that's in an Operation subclass, isCancelled implicitly references the operation itself rather than some ivar, avoiding any confusion about what it's checking. And your "start a new query" code can just say "cancel anything currently on the relevant operation queue and add a new operation onto that queue":
private var searchQueue: OperationQueue = {
    let queue = OperationQueue()
    // queue.maxConcurrentOperationCount = 1   // make it serial if you want
    queue.name = Bundle.main.bundleIdentifier! + ".backgroundQueue"
    return queue
}()

func performSearch(for queryString: String) {
    searchQueue.cancelAllOperations()
    let operation = SearchOperation(queryString: queryString)
    searchQueue.addOperation(operation)
}
I recommend this approach as you end up with a small cohesive object, the operation, that nicely encapsulates a block of work that you want to do, in the spirit of the Single Responsibility Principle.
While the following is less elegant, technically you can also use BlockOperation, which is block-based, but for which you can decouple the creation of the operation and the adding of the closure to it. Using this technique, you can actually pass a reference to the operation into its own closure:
private weak var lastOperation: Operation?

func performSearch(for queryString: String) {
    lastOperation?.cancel()

    let operation = BlockOperation()
    operation.addExecutionBlock { [weak operation, weak self] in
        print("starting \(queryString)")

        for i in 0 ... 10 {
            if operation?.isCancelled ?? true { print("canceled \(queryString)"); return }
            print("    \(queryString): \(i)")
            self?.heavyWork()
        }

        print("finished \(queryString)")
    }

    searchQueue.addOperation(operation)
    lastOperation = operation
}

func heavyWork() {
    Thread.sleep(forTimeInterval: 0.5)
}
I only mention this for the sake of completeness. I think the Operation subclass approach is frequently a better design. I'll use BlockOperation for one-off sort of stuff, but as soon as I want more sophisticated cancelation logic, I think the Operation subclass approach is better.
I should also mention that, in addition to more elegant cancellation capabilities, Operation objects offer all sorts of other sophisticated capabilities (e.g. asynchronously managing a queue of tasks that are themselves asynchronous, constraining the degree of concurrency, etc.). That is largely beyond the scope of this question.
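For example, a brief sketch (the operation names are illustrative) of two of those capabilities, limiting concurrency and declaring dependencies between operations:
let pipeline = OperationQueue()
pipeline.maxConcurrentOperationCount = 2      // at most two operations run at once

let download = BlockOperation { print("download") }
let parse = BlockOperation { print("parse") }
parse.addDependency(download)                 // parse won't start until download finishes

pipeline.addOperations([download, parse], waitUntilFinished: false)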
you wrote
Ideally, I'd like to provide the newly-created DispatchWorkItem as an
additional parameter
You are wrong: to be able to cancel a running task, you need a reference to that task, not to the next one that is ready to dispatch.
cancel() doesn't cancel a running task; it only sets the internal isCancelled flag in a thread-safe way, or removes the task from the queue before it starts executing. Once the task is executing, checking isCancelled gives you a chance to finish the job early (early return).
import PlaygroundSupport
import Foundation

PlaygroundPage.current.needsIndefiniteExecution = true

let queue = DispatchQueue.global(qos: .background)
let prq = DispatchQueue(label: "print.queue")

var task: DispatchWorkItem?

func work(task: DispatchWorkItem?) {
    sleep(1)
    var d = Date()
    if task?.isCancelled ?? true {
        prq.async {
            print("cancelled", d)
        }
        return
    }
    sleep(3)
    d = Date()
    prq.async {
        print("finished", d)
    }
}

for _ in 0..<3 {
    task?.cancel()
    let item = DispatchWorkItem {
        work(task: task)
    }
    item.notify(queue: prq) {
        print("done")
    }
    queue.asyncAfter(deadline: .now() + 0.5, execute: item)
    task = item
    sleep(1) // comment this line
}
In this example, only the very last job is fully executed:
cancelled 2018-12-17 23:49:13 +0000
done
cancelled 2018-12-17 23:49:14 +0000
done
finished 2018-12-17 23:49:18 +0000
done
Try commenting out that sleep(1) line and it prints:
done
done
finished 2018-12-18 00:07:28 +0000
done
The difference is that the first two executions never happened (the items were removed from the dispatch queue before execution).

Swift 3 Cannot convert value of type '()' to expected argument type '__OS_dispatch_queue_attr?' [duplicate]

In Swift 2, I was able to create queue with the following code:
let concurrentQueue = dispatch_queue_create("com.swift3.imageQueue", DISPATCH_QUEUE_CONCURRENT)
But this doesn't compile in Swift 3.
What is the preferred way to write this in Swift 3?
Creating a concurrent queue
let concurrentQueue = DispatchQueue(label: "queuename", attributes: .concurrent)
concurrentQueue.sync {
}
Create a serial queue
let serialQueue = DispatchQueue(label: "queuename")
serialQueue.sync {
}
Get main queue asynchronously
DispatchQueue.main.async {
}
Get main queue synchronously
DispatchQueue.main.sync {
}
To get one of the background threads:
DispatchQueue.global(qos: .background).async {
}
Xcode 8.2 beta 2:
To get one of the background threads:
DispatchQueue.global(qos: .default).async {
}
DispatchQueue.global().async {
    // qos defaults to `DispatchQoS.QoSClass.default`
}
If you want to learn more about using these queues, see this answer.
Compiles under >=Swift 3. This example contains most of the syntax that we need.
* QoS - the new quality-of-service syntax
* weak self - to break retain cycles; if self is no longer available, do nothing
* async global utility queue - for the network query; it does not wait for the result, and it is a concurrent queue, so the block (usually) does not have to wait to start. An exception is when the concurrent queue's task limit has been reached; then the queue temporarily behaves like a serial queue and waits until a previous task in that queue completes.
* async main queue - for touching the UI; the block does not wait for the result, but does wait for its slot at the start. The main queue is a serial queue.
Of course, you need to add some error checking to this...
DispatchQueue.global(qos: .utility).async { [weak self] () -> Void in
    guard let strongSelf = self else { return }
    strongSelf.flickrPhoto.loadLargeImage { loadedFlickrPhoto, error in
        if error != nil {
            print("error:\(error)")
        } else {
            DispatchQueue.main.async { () -> Void in
                activityIndicator.removeFromSuperview()
                strongSelf.imageView.image = strongSelf.flickrPhoto.largeImage
            }
        }
    }
}
Compiled in Xcode 8, Swift 3.
https://github.com/rpthomas/Jedisware
@IBAction func tap(_ sender: AnyObject) {
    let thisEmail = "emailaddress.com"
    let thisPassword = "myPassword"

    DispatchQueue.global(qos: .background).async {
        // Validate user input
        let result = self.validate(thisEmail, password: thisPassword)

        // Go back to the main thread to update the UI
        DispatchQueue.main.async {
            if !result {
                self.displayFailureAlert()
            }
        }
    }
}
Since the OP question has already been answered above I just want to add some speed considerations:
It makes a lot of difference what priority class you assign to your async function in DispatchQueue.global.
I don't recommend running tasks with the .background thread priority, especially on the iPhone X, where the task seems to be allocated to the low-power cores.
Here is some real data from a computationally intensive function that reads from an XML file (with buffering) and performs data interpolation:
Device name | .background | .utility | .default | .userInitiated | .userInteractive
iPhone X    | 18.7s       | 6.3s     | 1.8s     | 1.8s           | 1.8s
iPhone 7    | 4.6s        | 3.1s     | 3.0s     | 2.8s           | 2.6s
iPhone 5s   | 7.3s        | 6.1s     | 4.0s     | 4.0s           | 3.8s
Note that the data set is not the same for all devices. It's the biggest on the iPhone X and the smallest on the iPhone 5s.
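A minimal sketch of how such a comparison can be timed (the workload here is an illustrative CPU-bound loop, not the author's XML-parsing benchmark):
import Foundation

// Hypothetical workload used only to illustrate timing.
func heavyWork() {
    var total = 0.0
    for i in 1...5_000_000 { total += sqrt(Double(i)) }
    _ = total
}

let group = DispatchGroup()
for qos: DispatchQoS.QoSClass in [.background, .utility, .default, .userInitiated, .userInteractive] {
    group.enter()
    let start = CFAbsoluteTimeGetCurrent()
    DispatchQueue.global(qos: qos).async {
        heavyWork()
        print("\(qos): \(CFAbsoluteTimeGetCurrent() - start) s")
        group.leave()
    }
    group.wait() // run one QoS class at a time so the measurements don't overlap
}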
Update for Swift 5
Serial Queue
let serialQueue = DispatchQueue(label: "serialQueue")
serialQueue.async {
    // code to execute
}
Concurrent Queue
let concurrentQueue = DispatchQueue(label: "concurrentQueue", qos: .background, attributes: .concurrent, autoreleaseFrequency: .inherit, target: nil)
concurrentQueue.async {
    // code to execute
}
From Apple documentation:
Parameters
label
A string label to attach to the queue to uniquely identify it in debugging tools such as Instruments, sample, stackshots, and crash reports. Because applications, libraries, and frameworks can all create their own dispatch queues, a reverse-DNS naming style (com.example.myqueue) is recommended. This parameter is optional and can be NULL.
qos
The quality-of-service level to associate with the queue. This value determines the priority at which the system schedules tasks for execution. For a list of possible values, see DispatchQoS.QoSClass.
attributes
The attributes to associate with the queue. Include the concurrent attribute to create a dispatch queue that executes tasks concurrently. If you omit that attribute, the dispatch queue executes tasks serially.
autoreleaseFrequency
The frequency with which to autorelease objects created by the blocks that the queue schedules. For a list of possible values, see DispatchQueue.AutoreleaseFrequency.
target
The target queue on which to execute blocks. Specify DISPATCH_TARGET_QUEUE_DEFAULT if you want the system to provide a queue that is appropriate for the current object.
I did this, and it is especially important if you want to refresh your UI to show new data without the user noticing, as with a UITableView or UIPickerView.
DispatchQueue.main.async {
    /* Write your thread code here */
}

DispatchQueue.main.async {
    self.collectionView?.reloadData() // depends on whether you are populating a collection view or a table view
}

// Use OperationQueue if you need to populate objects (labels, image views, text views) on your view controller
OperationQueue.main.addOperation {
    self.lblGenre.text = self.movGenre
}
let concurrentQueue = dispatch_queue_create("com.swift3.imageQueue", DISPATCH_QUEUE_CONCURRENT) // Swift 2 version
let concurrentQueue = DispatchQueue(label: "com.swift3.imageQueue", attributes: .concurrent) // Swift 3 version
I re-worked your code in Xcode 8, Swift 3 and the changes are marked in contrast to your Swift 2 version.
Swift 3
If you call a closure that changes something in the storyboard (or makes any other change to a view) from a background thread, your application will crash. But if you dispatch that work with the dispatch methods below, your application will not crash.
async method
DispatchQueue.main.async {
    // Write code here
}
sync method
DispatchQueue.main.sync {
    // Write code here
}
DispatchQueue.main.async(execute: {
    // write code
})
Serial queue:
let serial = DispatchQueue(label: "Queuename")
serial.sync {
    // Code here
}
Concurrent queue:
let concurrent = DispatchQueue(label: "Queuename", attributes: .concurrent)
concurrent.sync {
    // Code here
}
For Swift 3:
DispatchQueue.main.async {
    // Write your code here
}
let newQueue = DispatchQueue(label: "newname")
newQueue.sync {
    // your code
}
It is now simply:
let serialQueue = DispatchQueue(label: "my serial queue")
The default is serial; to get a concurrent queue, use the optional attributes argument with .concurrent.
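For example (the label is illustrative):
let concurrentQueue = DispatchQueue(label: "my concurrent queue", attributes: .concurrent)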
DispatchQueue.main.async(execute: {
    // code
})
You can create a dispatch queue using this code in Swift 3.0:
DispatchQueue.main.async {
    /* Write your code here */
}
/* or, with a delay */
let delayTime = DispatchTime.now() + 0.5 // 0.5 seconds
DispatchQueue.main.asyncAfter(deadline: delayTime) {
    /* Write your code here */
}


How does threading (asynchronous queues) work in Swift?

OK, I am updating this question but have left the old one below.
So I have an array that stores the data for different views in a UIPageViewController. I need to grab image data in the background, but I don't understand how to code this within an asynchronous task.
Here's the code for the task:
let queue = NSOperationQueue()
queue.addOperationWithBlock() {
    // do something in the background
    println("background")
    self.cards[newIndex].loadImag()
    var cardimages = self.cards[newIndex].images

    NSOperationQueue.mainQueue().addOperationWithBlock() {
        // when done, update your UI and/or model on the main queue
        println("update ui")
        self.cards[newIndex].images = cardimages
    }
}
This is what the .loadImag() function looks like:
func loadImag() {
    println("images= \(self.images)")
    if self.location_id != nil && (self.images == nil) {
        println("before api call loc_id= \(self.location_id)")
        ApiWrapper.getPictures(self.location_id!, completionHandler: self.imagesCallback)
    }
}
and this is the self.imagesCallback code:
private func imagesCallback(cardImagesArray: [CardImage]) {
    println("images callback id= \(self.location_id)")
    self.images = cardImagesArray
}
The problem is that I am not sure how to put this code inside the operation queue, since the function must have a callback. How can I get the operation queue working so that it updates the self.cards array in the UIPageViewController?
OLD QUESTION_________________:
So I have this line of code that I need to run concurrently, on a thread other than the main thread. When I add it to the main queue like so:
var queue = dispatch_get_main_queue()
dispatch_async(queue, {
    self.cards[newIndex].loadImage()
})
Doing this, it works fine, but it doesn't seem to run concurrently. When I change the queue to a concurrent one like this:
dispatch_async(DISPATCH_QUEUE_CONCURRENT, {
    self.cards[newIndex].loadImage()
})
the app crashes with "EXC_BAD_ACCESS". What am I doing wrong here? Also, when I run the self.cards[newIndex].loadImage() function on a different, concurrent thread, will this update the values on the main thread?
You shouldn't use GCD directly unless you explicitly need functionality that is only available in GCD. For your case it is more beneficial (and cleaner in code) to use NSOperationQueue. NSOperationQueue uses GCD under the hood and is safer (fewer ways to mess up):
let queue = NSOperationQueue()
queue.addOperationWithBlock() {
    // do something in the background
    NSOperationQueue.mainQueue().addOperationWithBlock() {
        // when done, update your UI and/or model on the main queue
    }
}
You can also read through the Apple Concurrency Programming Guide
The guide is using examples with Objective-C but API is basically the same for Swift.
It also might look strange, but with addOperationWithBlock() I used something called a "trailing closure"; you can read about it here.
Can you paste the whole code so we can see what you are doing?
Below is a very basic code snippet. This is basically how concurrency works in Swift:
let qos = Int(QOS_CLASS_USER_INITIATED.value)
dispatch_async(dispatch_get_global_queue(qos, 0), { () -> Void in
    // Put your code here to work in the background
    dispatch_async(dispatch_get_main_queue(), { () -> Void in
        // Put your code here when the process is done and you hand over to the main thread.
        // Ex. self.cards[newIndex].loadImage()
    })
})
You need to use dispatch_get_global_queue. Try something like:
let queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)
dispatch_async(queue, { self.cards[newIndex].loadImage() })
dispatch_get_main_queue(), as you were trying, runs on the UI/main thread, which is why you saw the behavior you did.
To answer the second part of your question: if loadImage() is modifying the UI, you don't want to do that from a background thread; it must be done from the main/UI thread. A typical idiom would be, from the main thread:
let queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)
dispatch_async(queue, {
    // <code to load/prepare images>
    dispatch_async(dispatch_get_main_queue(), {
        // <code to update UI elements>
    })
})
