How can we implement concurrency thread using protocol in swift? [closed] - ios

I was asked this question in an interview for an iOS developer role.
// Please design a read-write task queue where you can tag each reader task with a label,
// where tasks with the same label should be executed sequentially, and
// tasks with different labels may be executed concurrently. However, the writer
// gets exclusive access: no concurrent task may run at the
// same time as the writer task.
// For example:
protocol ConcurrentQueueWithSerialization {
    // Submits a labeled task.
    // All tasks with the same label will be serialized.
    // Tasks with different labels will run concurrently.
    // Use this method to submit a "read" operation from a particular reader.
    func async(with label: String, task: @escaping () -> Void)

    // Submits a task that will run concurrently with all other tasks regardless of their labels.
    func async(task: @escaping () -> Void)

    // Submits a labeled and delayed task.
    func asyncAfter(deadline: DispatchTime, with label: String, task: @escaping () -> Void)

    // Submits an unlabeled and delayed task.
    func asyncAfter(deadline: DispatchTime, task: @escaping () -> Void)

    // Submits a barrier task. Only one barrier task is allowed to run at a time.
    // Works as a critical section for the queue.
    // Use this method to submit a writer task.
    func asyncBarrier(task: @escaping () -> Void)
}
class MyDispatchQueue: ConcurrentQueueWithSerialization {
    // TODO: write your implementation
}
The interviewer asked me to implement the above protocol in the MyDispatchQueue class. I tried but could not find a solution. Please help me. Thanks in advance.

Previously I suggested using target queues, but a better approach is to create a primary concurrent queue, create serial queues for the named labels, and dispatch everything through that primary concurrent queue. Unlike the target-queue approach, this honors the scheduling of tasks dispatched to the named queues relative to those dispatched to the unnamed queue.
With that implementation, here's an example (an Instruments "Points of Interest" profile) in which I added tasks for queues named "fred" and "ginger", plus one added to an unnamed queue; I then added a barrier task, and then added two more tasks to each of the aforementioned queues.
As you can see, it respects the serial nature of the named queues, the unnamed queue is concurrent, and all these queues are concurrent with respect to each other, but the barrier is a barrier across all the queues.
class MyDispatchQueue: ConcurrentQueueWithSerialization {
    private var namedQueues = [String: DispatchQueue]()
    private var queue = DispatchQueue(label: Bundle.main.bundleIdentifier! + ".target", attributes: .concurrent)
    private let lock = NSLock()

    private func queue(with label: String) -> DispatchQueue {
        lock.lock()
        defer { lock.unlock() }

        if let queue = namedQueues[label] { return queue }
        let queue = DispatchQueue(label: Bundle.main.bundleIdentifier! + "." + label)
        namedQueues[label] = queue
        return queue
    }

    func async(with label: String, task: @escaping () -> Void) {
        queue.async {
            self.queue(with: label).sync(execute: task)
        }
    }

    func async(task: @escaping () -> Void) {
        queue.async(execute: task)
    }

    func asyncAfter(deadline: DispatchTime, with label: String, task: @escaping () -> Void) {
        queue.asyncAfter(deadline: deadline) {
            self.queue(with: label).sync(execute: task)
        }
    }

    func asyncAfter(deadline: DispatchTime, task: @escaping () -> Void) {
        queue.asyncAfter(deadline: deadline, execute: task)
    }

    func asyncBarrier(task: @escaping () -> Void) {
        queue.async(flags: .barrier, execute: task)
    }
}
Note that I also synchronize access to the namedQueues dictionary with a lock, to ensure the thread safety of this class.
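For example, here's a usage sketch along the lines of the profile described above (the "fred"/"ginger" labels come from that description; the task bodies are placeholders):
let queue = MyDispatchQueue()

// "fred" tasks run serially with respect to each other, "ginger" tasks run serially with
// respect to each other, but the two labels (and the unlabeled task) run concurrently.
queue.async(with: "fred")   { /* read for fred */ }
queue.async(with: "ginger") { /* read for ginger */ }
queue.async                 { /* unlabeled concurrent work */ }

// The writer gets exclusive access: nothing submitted above runs concurrently with it,
// and nothing submitted after it starts until it finishes.
queue.asyncBarrier          { /* write */ }

queue.async(with: "fred")   { /* runs after the barrier */ }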

Related

How to suspend dispatch queue inside for loop?

I have play and pause buttons. When I press the play button, I want to start the asynchronous talking inside a for loop. I used a dispatch group to wait for the async method inside the for loop, but I cannot achieve pause.
startStopButton.rx.tap.bind {
    if self.isPaused {
        self.isPaused = false
        dispatchGroup.suspend()
        dispatchQueue.suspend()
    } else {
        self.isPaused = true
        self.dispatchQueue.async {
            for i in 0..<self.textBlocks.count {
                self.dispatchGroup.enter()
                self.startTalking(string: self.textBlocks[i]) { isFinished in
                    self.dispatchGroup.leave()
                }
                self.dispatchGroup.wait()
            }
        }
    }
}.disposed(by: disposeBag)
I also tried to do it with an OperationQueue, but it is still not working; it still continues talking.
startStopButton.rx.tap.bind {
    if self.isPaused {
        self.isPaused = false
        self.talkingQueue.isSuspended = true
        self.talkingQueue.cancelAllOperations()
    } else {
        self.isPaused = true
        self.talkingQueue.addOperation {
            for i in 0..<self.textBlocks.count {
                self.dispatchGroup.enter()
                self.startTalking(string: self.textBlocks[i]) { isFinished in
                    self.dispatchGroup.leave()
                }
                self.dispatchGroup.wait()
            }
        }
    }
}.disposed(by: disposeBag)
Is there any advice?
A few observations:
Pausing a group doesn’t do anything. You suspend queues, not groups.
Suspending a queue stops new items from starting on that queue, but it does not suspend anything already running on that queue. So, if you’ve added all the textBlock calls in a single dispatched item of work, then once it’s started, it won’t suspend.
So, rather than dispatching all of these text blocks to the queue as a single task, submit them individually (presuming, of course, that your queue is serial). For example, let's say you had a DispatchQueue:
let queue = DispatchQueue(label: "...")
And then, to queue the tasks, put the async call inside the for loop, so each text block is a separate item in your queue:
for textBlock in textBlocks {
    queue.async { [weak self] in
        guard let self = self else { return }

        let semaphore = DispatchSemaphore(value: 0)
        self.startTalking(string: textBlock) {
            semaphore.signal()
        }
        semaphore.wait()
    }
}
FYI, while dispatch groups work, a semaphore (great for coordinating a single signal with a wait) might be a more logical choice here, rather than a group (which is intended for coordinating groups of dispatched tasks).
Anyway, when you suspend that queue, the queue will be prevented from starting anything still queued (but it will finish the current textBlock).
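For example, a minimal sketch of a pause/resume toggle under this approach, reusing the isPaused flag and the serial queue from the question (the method name is an assumption):
// Assumes `queue` is the serial DispatchQueue to which each text block was
// dispatched individually, and `isPaused` is the flag from the question.
func togglePause() {
    if isPaused {
        queue.resume()    // resume starting the remaining queued text blocks
    } else {
        queue.suspend()   // the block currently talking finishes; no new blocks start
    }
    isPaused.toggle()
}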
Or you can use an asynchronous Operation, e.g., create your queue:
let queue: OperationQueue = {
    let queue = OperationQueue()
    queue.name = "..."
    queue.maxConcurrentOperationCount = 1
    return queue
}()
Then, again, you queue up each spoken word, each respectively a separate operation on that queue:
for textBlock in textBlocks {
    queue.addOperation(TalkingOperation(string: textBlock))
}
That of course assumes you encapsulated your talking routine in an operation, e.g.:
class TalkingOperation: AsynchronousOperation {
    let string: String

    init(string: String) {
        self.string = string
        super.init()
    }

    override func main() {
        startTalking(string: string) {
            self.finish()
        }
    }

    func startTalking(string: String, completion: @escaping () -> Void) { ... }
}
I prefer this approach because
we’re not blocking any threads;
the logic for talking is nicely encapsulated in that TalkingOperation, in the spirit of the single responsibility principle; and
you can easily suspend the queue or cancel all the operations.
By the way, this is a subclass of an AsynchronousOperation, which abstracts the complexity of asynchronous operation out of the TalkingOperation class. There are many ways to do this, but here’s one random implementation. FWIW, the idea is that you define an AsynchronousOperation subclass that does all the KVO necessary for asynchronous operations outlined in the documentation, and then you can enjoy the benefits of operation queues without making each of your asynchronous operation subclasses too complicated.
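For reference, here is a minimal sketch of such a base class (not necessarily the linked implementation); it does the isExecuting/isFinished KVO bookkeeping so that a subclass like TalkingOperation only has to override main() and call finish() when its asynchronous work completes:
import Foundation

class AsynchronousOperation: Operation {
    private let stateLock = NSLock()
    private var _executing = false
    private var _finished = false

    override var isAsynchronous: Bool { return true }

    override var isExecuting: Bool {
        stateLock.lock(); defer { stateLock.unlock() }
        return _executing
    }

    override var isFinished: Bool {
        stateLock.lock(); defer { stateLock.unlock() }
        return _finished
    }

    override func start() {
        guard !isCancelled else {
            finish()
            return
        }

        willChangeValue(forKey: "isExecuting")
        stateLock.lock(); _executing = true; stateLock.unlock()
        didChangeValue(forKey: "isExecuting")

        main()   // subclasses kick off their asynchronous work here
    }

    /// Subclasses must call this when their asynchronous work is done.
    func finish() {
        willChangeValue(forKey: "isExecuting")
        willChangeValue(forKey: "isFinished")
        stateLock.lock(); _executing = false; _finished = true; stateLock.unlock()
        didChangeValue(forKey: "isExecuting")
        didChangeValue(forKey: "isFinished")
    }
}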
For what it’s worth, if you don’t need suspend, but would be happy just canceling, the other approach is to dispatch the whole for loop as a single work item or operation, but check inside the for loop whether the item has been cancelled:
So, define a few properties:
let queue = DispatchQueue(label: "...")
var item: DispatchWorkItem?
Then you can start the task:
item = DispatchWorkItem { [weak self] in
    guard let textBlocks = self?.textBlocks else { return }

    for textBlock in textBlocks where self?.item?.isCancelled == false {
        let semaphore = DispatchSemaphore(value: 0)
        self?.startTalking(string: textBlock) {
            semaphore.signal()
        }
        semaphore.wait()
    }
    self?.item = nil
}
queue.async(execute: item!)
And then, when you want to stop it, just call item?.cancel(). You can do this same pattern with a non-asynchronous Operation, too.
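For example, a sketch of that cancellation check using a BlockOperation (talkingQueue is borrowed from the question's code; any OperationQueue would do):
let operation = BlockOperation()
operation.addExecutionBlock { [weak self, weak operation] in
    guard let textBlocks = self?.textBlocks else { return }

    // Stop at the next iteration once the operation has been cancelled.
    for textBlock in textBlocks where operation?.isCancelled == false {
        let semaphore = DispatchSemaphore(value: 0)
        self?.startTalking(string: textBlock) {
            semaphore.signal()
        }
        semaphore.wait()
    }
}
talkingQueue.addOperation(operation)

// Later, to stop it:
// operation.cancel()   // or talkingQueue.cancelAllOperations()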

Weak DispatchGroup in closures and other GCD questions

Swift closures strongly capture reference types.
DispatchGroup is a reference type.
My questions have to do with the following code:
func getUsername(onDone: @escaping (_ possUsername: String?) -> ())
{
    //Post request for username that calls onDone(retrievedUsername)...
}

func getBirthdate(using username: String?, onDone: @escaping (_ possBday: String?) -> ())
{
    //Post request for token that calls onDone(retrievedToken)...
}

func asyncTasksInOrder(onDone: @escaping (_ resultBDay: String?) -> ())
{
    let thread = DispatchQueue(label: "my thread", qos: .userInteractive, attributes: [],
                               autoreleaseFrequency: .workItem, target: nil)
    thread.async { [weak self, onDone] in
        guard let self = self else {
            onDone(nil)
            return
        }

        let dg = DispatchGroup() //This is a reference type
        var retrievedUsername: String?
        var retrievedBday: String?

        //Get username async first
        dg.enter()
        self.getUsername(onDone: { [weak dg] (possUsername) in
            retrievedUsername = possUsername
            dg?.leave() //DG is weak here
        })
        dg.wait()

        //Now that we've waited for the username, get bday async now
        dg.enter()
        self.getBirthdate(using: retrievedUsername, onDone: { [weak dg] (possBday) in
            retrievedBday = possBday
            dg?.leave() //DG is also weak here
        })
        dg.wait()

        //We've waited for everything, so now call the return callback
        onDone(retrievedBday)
    }
}
So the two closures inside of asyncTasksInOrder(onDone:) each capture dg, my DispatchGroup.
1. Is it even necessary to capture my dispatch group?
2. If I don't capture it, how would I even know I've got a retain cycle?
3. What if the dispatch group evaporates during one of the callback executions? Would it even evaporate since it's waiting?
4. Is it unnecessarily expensive to instantiate a DispatchQueue like this often (disregarding the .userInteractive)? I'm asking this particular question because spinning up threads in Android is extremely expensive (so expensive that JetBrains has dedicated lots of resources towards Kotlin coroutines).
5. How does dg.notify(...) play into all of this? Why even have a notify method when dg.wait() does the same thing?
I feel like my understanding of GCD is not bulletproof, so I'm asking to build some confidence. Please critique as well if there's anything to critique. The help is truly appreciated.
1) No, the dispatch group is captured implicitly. You don't even need to capture self in async because GCD closures don't cause retain cycles.
2) There is no retain cycle.
3) Actually you are misusing DispatchGroup to force an asynchronous task to become synchronous.
4) No, GCD is pretty lightweight.
5) The main purpose of DispatchGroup is to notify when all asynchronous tasks – for example in a repeat loop – are completed regardless of the order.
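For illustration, a minimal sketch of that intended use (the loop bound and the fetchItem(_:completion:) call are hypothetical):
let group = DispatchGroup()

for id in 0..<5 {
    group.enter()
    fetchItem(id) { result in   // hypothetical asynchronous call
        // ... handle result ...
        group.leave()
    }
}

// Runs once, on the main queue, after every enter() has been balanced by a leave(),
// in whatever order the requests finish, without blocking any thread.
group.notify(queue: .main) {
    print("all requests finished")
}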
A better solution is to nest the asynchronous tasks. With only two tasks the pyramid of doom is manageable.
func asyncTasksInOrder(onDone: @escaping (String?) -> Void)
{
    let thread = DispatchQueue(label: "my thread", qos: .userInteractive, autoreleaseFrequency: .workItem)
    thread.async {
        //Get username async first
        self.getUsername { [weak self] possUsername in
            guard let self = self else { onDone(nil); return }

            //Now get bday async
            self.getBirthdate(using: possUsername) { possBday in
                //Now call the return callback
                onDone(possBday)
            }
        }
    }
}

How to achieve DispatchGroup functionality using OperationQueue? [duplicate]

I have an Operation subclass and an OperationQueue with maxConcurrentOperationCount = 1.
This performs my operations in the sequential order that I add them, which is good, but now I need to wait until all operations have finished before running another process.
I was trying to use a notification group, but since this runs in a for loop, the notify block fires as soon as the operations have been added to the queue. How do I wait for all operations to leave the queue before running another process?
for (index, _) in self.packArray.enumerated() {
    myGroup.enter()
    let myArrayOperation = ArrayOperation(collection: self.outerCollectionView, id: self.packArray[index].id, count: index)
    myArrayOperation.name = self.packArray[index].id
    downloadQueue.addOperation(myArrayOperation)
    myGroup.leave()
}
myGroup.notify(queue: .main) {
    // do stuff here
}
You can use operation dependencies to initiate some operation upon the completion of a series of other operations:
let queue = OperationQueue()

let completionOperation = BlockOperation {
    // all done
}

for object in objects {
    let operation = ...
    completionOperation.addDependency(operation)
    queue.addOperation(operation)
}

OperationQueue.main.addOperation(completionOperation) // or, if you don't need it on main queue, just `queue.addOperation(completionOperation)`
Or, in iOS 13 and later, you can use barriers:
let queue = OperationQueue()

for object in objects {
    queue.addOperation(...)
}

queue.addBarrierBlock {
    DispatchQueue.main.async {
        // all done
    }
}
A suitable solution is KVO
First before the loop add the observer (assuming queue is the OperationQueue instance)
queue.addObserver(self, forKeyPath: "operations", options: .new, context: nil)
Then implement
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if object as? OperationQueue == queue && keyPath == "operations" {
        if queue.operations.isEmpty {
            // Do something here when your queue has completed
            self.queue.removeObserver(self, forKeyPath: "operations")
        }
    } else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}
Edit:
In Swift 4 it's much easier
Declare a property:
var observation : NSKeyValueObservation?
and create the observer
observation = queue.observe(\.operationCount, options: [.new]) { [unowned self] (queue, change) in
    if change.newValue! == 0 {
        // Do something here when your queue has completed
        self.observation = nil
    }
}
Since iOS 13 and macOS 10.15, operationCount is deprecated. The replacement is to observe progress.completedUnitCount.
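Here is a sketch of that with the same block-based KVO API, assuming you set the queue's progress.totalUnitCount up front (it reuses the observation property declared above):
// e.g. queue.progress.totalUnitCount = Int64(operations.count) before enqueueing
observation = queue.observe(\.progress.completedUnitCount, options: [.new]) { [unowned self] (queue, change) in
    if change.newValue == queue.progress.totalUnitCount {
        // Do something here when your queue has completed
        self.observation = nil
    }
}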
Another modern way is to use the KVO publisher of Combine
import Combine

var cancellable: AnyCancellable?

cancellable = queue.publisher(for: \.progress.completedUnitCount)
    .filter { $0 == queue.progress.totalUnitCount }
    .sink { _ in
        print("queue finished")
        self.cancellable = nil
    }
I use the following solution:
private let queue = OperationQueue()

private func addOperations(_ operations: [Operation], completionHandler: @escaping () -> ()) {
    DispatchQueue.global().async { [unowned self] in
        self.queue.addOperations(operations, waitUntilFinished: true)
        DispatchQueue.main.async(execute: completionHandler)
    }
}
Set the maximum number of concurrent operations to 1
operationQueue.maxConcurrentOperationCount = 1
then each operation will be executed in order (as if each was dependent on the previous one) and your completion operation will execute at the end.
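A minimal sketch of that arrangement (items and makeOperation(for:) are hypothetical placeholders):
let operationQueue = OperationQueue()
operationQueue.maxConcurrentOperationCount = 1   // serial: operations run in the order added

for item in items {
    operationQueue.addOperation(makeOperation(for: item))
}

// Added last, so on this serial queue it only runs after everything above has finished.
operationQueue.addOperation {
    print("all operations done")
}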
Code at the end of the queue
refer to this link
NSOperation and NSOperationQueue are great and useful Foundation framework tools for asynchronous tasks. One thing puzzled me, though: how can I run code after all my queue operations finish? The simple answer is: use dependencies between operations in the queue (a unique feature of NSOperation). It's just a five-line solution.
NSOperation dependency trick
With Swift it is easy to implement, like this:
extension Array where Element: NSOperation {
    /// Execute block after all operations from the array.
    func onFinish(block: () -> Void) {
        let doneOperation = NSBlockOperation(block: block)
        self.forEach { [unowned doneOperation] in doneOperation.addDependency($0) }
        NSOperationQueue().addOperation(doneOperation)
    }
}
My solution is similar to that of https://stackoverflow.com/a/42496559/452115, but I don't add the completionOperation to the main OperationQueue, but to the queue itself. This works for me:
var a = [Int](repeating: 0, count: 10)

let queue = OperationQueue()
let completionOperation = BlockOperation {
    print(a)
}

queue.maxConcurrentOperationCount = 2

for i in 0...9 {
    let operation = BlockOperation {
        a[i] = 1
    }
    completionOperation.addDependency(operation)
    queue.addOperation(operation)
}

queue.addOperation(completionOperation)

print("Done 🎉")

Pooling completion handlers together such that method completes once multiple closures are executed

I have a scenario where I want to perform three distinct asynchronous tasks in parallel. Once all three tasks are complete, I then want the calling method to be aware of this and to call its own completion handler.
Below is a very simplified version of the logic for this:
class ViewController: UIViewController {

    func doTasks(with object: Object, completionHandler: () -> Void) {
        // Once A, B & C are done, then perform a task
        wrapupTask()

        // When task is complete, call completionHandler
        completionHandler()
    }
}

fileprivate extension ViewController {
    func taskA(with object: Object, completionHandler: () -> Void) {
        // Do something
        completionHandler()
    }

    func taskB(with object: Object, completionHandler: () -> Void) {
        // Do something
        completionHandler()
    }

    func taskC(with object: Object, completionHandler: () -> Void) {
        // Do something
        completionHandler()
    }
}
I could easily chain the handlers together, but then the task will likely take longer and the code will suck.
Another item I considered was a simple counter that incremented each time a task completed, and then once it hit 3, would then call the wrapupTask() via something like this:
var count: Int = 0 {
    didSet {
        if count == 3 {
            wrapupTask()
        }
    }
}
Another option I have considered is to create an operation queue, and to then load the tasks into it, with a dependency for when to run my wrap up task. Once the queue is empty, it will then call the completion handler. However, this seems like more work than I'd prefer for what I want to accomplish.
My hope is that there is something better that I am just missing.
Just to pick up on what OOPer said, you are looking for DispatchGroup. In the following, the calls to taskA, taskB, and taskC are pseudo-code, but everything else is real:
func doTasks(with object: Object, completionHandler: @escaping () -> Void) {
    let group = DispatchGroup()

    group.enter()
    taskA() {
        // completion handler
        group.leave()
    }

    group.enter()
    taskB() {
        // completion handler
        group.leave()
    }

    group.enter()
    taskC() {
        // completion handler
        group.leave()
    }

    group.notify(queue: DispatchQueue.main) {
        // this won't happen until all three tasks have finished their completion handlers
        completionHandler()
    }
}
Every enter is matched by a leave at the end of the asynchronous completion handler, and only when all the matches have actually executed do we proceed to the notify completion handler.

How to use background thread in swift?

How to use threading in swift?
dispatchOnMainThread:^{
    NSLog(@"Block Executed On %s", dispatch_queue_get_label(dispatch_get_current_queue()));
}];
Swift 3.0+
A lot has been modernized in Swift 3.0. Running something on a background queue looks like this:
DispatchQueue.global(qos: .userInitiated).async {
    print("This is run on a background queue")

    DispatchQueue.main.async {
        print("This is run on the main queue, after the previous code in outer block")
    }
}
Swift 1.2 through 2.3
let qualityOfServiceClass = QOS_CLASS_USER_INITIATED
let backgroundQueue = dispatch_get_global_queue(qualityOfServiceClass, 0)
dispatch_async(backgroundQueue, {
    print("This is run on a background queue")

    dispatch_async(dispatch_get_main_queue(), { () -> Void in
        print("This is run on the main queue, after the previous code in outer block")
    })
})
Pre Swift 1.2 – Known issue
As of Swift 1.1, Apple didn't support the above syntax without some modifications. Passing QOS_CLASS_USER_INITIATED didn't actually work; instead, use Int(QOS_CLASS_USER_INITIATED.value).
For more information, see Apple's documentation.
Dan Beaulieu's answer in Swift 5 (also working since Swift 3.0.1).
Swift 5.0.1
extension DispatchQueue {

    static func background(delay: Double = 0.0, background: (() -> Void)? = nil, completion: (() -> Void)? = nil) {
        DispatchQueue.global(qos: .background).async {
            background?()
            if let completion = completion {
                DispatchQueue.main.asyncAfter(deadline: .now() + delay, execute: {
                    completion()
                })
            }
        }
    }

}
Usage
DispatchQueue.background(delay: 3.0, background: {
    // do something in background
}, completion: {
    // when background job finishes, wait 3 seconds and do something in main thread
})

DispatchQueue.background(background: {
    // do something in background
}, completion: {
    // when background job finished, do something in main thread
})

DispatchQueue.background(delay: 3.0, completion: {
    // do something in main thread after 3 seconds
})
The best practice is to define a reusable function that can be accessed multiple times.
REUSABLE FUNCTION:
e.g. somewhere like AppDelegate.swift as a Global Function.
func backgroundThread(_ delay: Double = 0.0, background: (() -> Void)? = nil, completion: (() -> Void)? = nil) {
    dispatch_async(dispatch_get_global_queue(Int(QOS_CLASS_USER_INITIATED.value), 0)) {
        background?()

        let popTime = dispatch_time(DISPATCH_TIME_NOW, Int64(delay * Double(NSEC_PER_SEC)))
        dispatch_after(popTime, dispatch_get_main_queue()) {
            completion?()
        }
    }
}
Note: in Swift 2.0, replace QOS_CLASS_USER_INITIATED.value above with QOS_CLASS_USER_INITIATED.rawValue instead
USAGE:
A. To run a process in the background with a delay of 3 seconds:
backgroundThread(3.0, background: {
    // Your background function here
})
B. To run a process in the background then run a completion in the foreground:
backgroundThread(background: {
    // Your function here to run in the background
},
completion: {
    // A function to run in the foreground when the background thread is complete
})
C. To delay by 3 seconds - note use of completion parameter without background parameter:
backgroundThread(3.0, completion: {
    // Your delayed function here to be run in the foreground
})
In Swift 4.2 and Xcode 10.1
We have three types of queues:
1. Main queue: a serial queue created by the system and associated with the application's main thread.
2. Global queues: concurrent queues that we can request with a given priority for our tasks.
3. Custom queues: created by the user. Custom concurrent queues are always mapped onto one of the global queues by specifying a Quality of Service (QoS) property.
DispatchQueue.main                          // Main thread
DispatchQueue.global(qos: .userInitiated)   // High priority
DispatchQueue.global(qos: .userInteractive) // High priority (a little higher than userInitiated)
DispatchQueue.global(qos: .background)      // Lowest priority
DispatchQueue.global(qos: .default)         // Normal priority (after high but before low)
DispatchQueue.global(qos: .utility)         // Low priority
DispatchQueue.global(qos: .unspecified)     // Absence of QoS information
All of these queues can execute tasks in two ways:
1. Synchronous execution
2. Asynchronous execution
DispatchQueue.global(qos: .background).async {
    // do your job here

    DispatchQueue.main.async {
        // update ui here
    }
}

// Perform some task and update UI immediately.
DispatchQueue.global(qos: .userInitiated).async {
    // Perform task

    DispatchQueue.main.async {
        // Update UI
        self.tableView.reloadData()
    }
}

// To call or execute a function after some time
DispatchQueue.main.asyncAfter(deadline: .now() + 5.0) {
    // Here call your function
}

// If you want to make changes in the UI, use this
DispatchQueue.main.async(execute: {
    // Update UI
    self.tableView.reloadData()
})
From AppCoda : https://www.appcoda.com/grand-central-dispatch/
// This will print synchronously: 🔴 0-9 first, then Ⓜ️ 100-109
func simpleQueues() {
    let queue = DispatchQueue(label: "com.appcoda.myqueue")

    queue.sync {
        for i in 0..<10 {
            print("🔴", i)
        }
    }

    for i in 100..<110 {
        print("Ⓜ️", i)
    }
}

// This will print asynchronously: the 🔴 and Ⓜ️ output interleaves
func simpleQueues() {
    let queue = DispatchQueue(label: "com.appcoda.myqueue")

    queue.async {
        for i in 0..<10 {
            print("🔴", i)
        }
    }

    for i in 100..<110 {
        print("Ⓜ️", i)
    }
}
Swift 3 version
Swift 3 utilizes the new DispatchQueue class to manage queues and threads. To run something on a background thread you would use:
let backgroundQueue = DispatchQueue(label: "com.app.queue", qos: .background)
backgroundQueue.async {
    print("Run on background thread")
}
Or if you want something in two lines of code:
DispatchQueue.global(qos: .background).async {
    print("Run on background thread")

    DispatchQueue.main.async {
        print("We finished that.")
        // only back on the main thread may you access the UI:
        label.text = "Done."
    }
}
You can also get some in-depth info about GCD in Swift 3 in this tutorial.
From Jameson Quave's tutorial
Swift 2
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
    // All stuff here
})
Swift 4.x
Put this in some file:
func background(work: @escaping () -> ()) {
    DispatchQueue.global(qos: .userInitiated).async {
        work()
    }
}

func main(work: @escaping () -> ()) {
    DispatchQueue.main.async {
        work()
    }
}
and then call it where you need it:
background {
    // background job
    main {
        // update UI (or whatever you need to do on the main thread)
    }
}
Swift 5
To make it easy, create a file "DispatchQueue+Extensions.swift" with this content:
import Foundation

typealias Dispatch = DispatchQueue

extension Dispatch {

    static func background(_ task: @escaping () -> ()) {
        Dispatch.global(qos: .background).async {
            task()
        }
    }

    static func main(_ task: @escaping () -> ()) {
        Dispatch.main.async {
            task()
        }
    }
}
Usage:
Dispatch.background {
    // do stuff
    Dispatch.main {
        // update UI
    }
}
You have to separate out the changes that you want to run in the background from the updates you want to run on the UI:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
    // do your task

    dispatch_async(dispatch_get_main_queue()) {
        // update some UI
    }
}
Since the OP question has already been answered above I just want to add some speed considerations:
I don't recommend running tasks with the .background thread priority, especially on the iPhone X, where the task seems to be allocated to the low-power cores.
Here is some real data from a computationally intensive function that reads from an XML file (with buffering) and performs data interpolation:
Device name / .background / .utility / .default / .userInitiated / .userInteractive
iPhone X: 18.7s / 6.3s / 1.8s / 1.8s / 1.8s
iPhone 7: 4.6s / 3.1s / 3.0s / 2.8s / 2.6s
iPhone 5s: 7.3s / 6.1s / 4.0s / 4.0s / 3.8s
Note that the data set is not the same for all devices. It's the biggest on the iPhone X and the smallest on the iPhone 5s.
Good answers, though; anyway, I want to share my object-oriented solution, up to date for Swift 5.
Please check it out: AsyncTask
Conceptually inspired by Android's AsyncTask, I've written my own class in Swift.
AsyncTask enables proper and easy use of the UI thread. This class allows you to perform background operations and publish results on the UI thread.
Here are a few usage examples:
Example 1 -
AsyncTask(backgroundTask: { (p: String) -> Void in  // set BGParam to String and BGResult to Void
    print(p)  // print the value on the background thread
}).execute("Hello async")  // execute with value 'Hello async'
Example 2 -
let task2 = AsyncTask(beforeTask: {
    print("pre execution")  // print 'pre execution' before backgroundTask
}, backgroundTask: { (p: Int) -> String in  // set BGParam to Int & BGResult to String
    if p > 0 {  // check if the execution value is bigger than zero
        return "positive"  // pass String "positive" to afterTask
    }
    return "negative"  // otherwise pass String "negative"
}, afterTask: { (p: String) in
    print(p)  // print the background task result
})
task2.execute(1)  // execute with value 1
It has 2 generic types:
BGParam - the type of the parameter sent to the task upon execution.
BGResult - the type of the result of the background computation.
When you create an AsyncTask you can set those types to whatever you need to pass in and out of the background task, but if you don't need one of them, you can mark it as unused by simply setting it to Void, or with the shorter syntax: ().
When an asynchronous task is executed, it goes through 3 steps:
beforeTask: () -> Void, invoked on the UI thread just before the task is executed.
backgroundTask: (param: BGParam) -> BGResult, invoked on the background thread immediately afterwards.
afterTask: (param: BGResult) -> Void, invoked on the UI thread with the result of the background task.
A multi-purpose function for dispatching work to different queues
public enum QueueType {
    case Main
    case Background
    case LowPriority
    case HighPriority

    var queue: DispatchQueue {
        switch self {
        case .Main:
            return DispatchQueue.main
        case .Background:
            return DispatchQueue(label: "com.app.queue",
                                 qos: .background,
                                 target: nil)
        case .LowPriority:
            return DispatchQueue.global(qos: .userInitiated)
        case .HighPriority:
            return DispatchQueue.global(qos: .userInitiated)
        }
    }
}

func performOn(_ queueType: QueueType, closure: @escaping () -> Void) {
    queueType.queue.async(execute: closure)
}
Use it like:
performOn(.Background) {
    // Code
}
I really like Dan Beaulieu's answer, but it doesn't work with Swift 2.2 and I think we can avoid those nasty forced unwraps!
func backgroundThread(delay: Double = 0.0, background: (() -> Void)? = nil, completion: (() -> Void)? = nil) {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0)) {
        background?()

        if let completion = completion {
            let popTime = dispatch_time(DISPATCH_TIME_NOW, Int64(delay * Double(NSEC_PER_SEC)))
            dispatch_after(popTime, dispatch_get_main_queue()) {
                completion()
            }
        }
    }
}
Grand Central Dispatch is used to handle multitasking in our iOS apps.
You can use this code:
// Using a time interval
DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 1) {
    print("Hello World")
}

// Background thread (assuming a queue such as `let queue = DispatchQueue.global(qos: .background)`)
queue.sync {
    for i in 0..<10 {
        print("Hello", i)
    }
}

// Main thread
for i in 20..<30 {
    print("Hello", i)
}
For more information, use this link: https://www.programminghub.us/2018/07/integrate-dispatcher-in-swift.html
Is there a drawback (when needing to launch a foreground screen afterward) to the code below?
import Foundation
import UIKit

class TestTimeDelay {
    static var connected: Bool = false
    static var counter: Int = 0

    static func showAfterDelayControl(uiViewController: UIViewController) {
        NSLog("TestTimeDelay", "showAfterDelayControl")
    }

    static func tryReconnect() -> Bool {
        counter += 1
        NSLog("TestTimeDelay", "Counter:\(counter)")
        return counter > 4
    }

    static func waitOnConnectWithDelay(milliseconds: Int, uiViewController: UIViewController) {
        DispatchQueue.global(qos: .background).async {
            DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + DispatchTimeInterval.milliseconds(milliseconds), execute: {
                waitOnConnect(uiViewController: uiViewController)
            })
        }
    }

    static func waitOnConnect(uiViewController: UIViewController) {
        connected = tryReconnect()
        if connected {
            showAfterDelayControl(uiViewController: uiViewController)
        }
        else {
            waitOnConnectWithDelay(milliseconds: 200, uiViewController: uiViewController)
        }
    }
}
dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), {
    // Conversion into base64 string
    self.uploadImageString = uploadPhotoDataJPEG.base64EncodedStringWithOptions(NSDataBase64EncodingOptions.EncodingEndLineWithCarriageReturn)
})
In Swift 4.2 this works.
import Foundation

class myThread: Thread {
    override func main() {
        while true {
            print("Running in the Thread")
            Thread.sleep(forTimeInterval: 4)
        }
    }
}

let t = myThread()
t.start()

while true {
    print("Main Loop")
    sleep(5)
}
