Is there a race condition when multiple Futures/Timers complete simultaneously in Dart?

Can there be a race condition if multiple Timers/Futures complete simultaneously in Dart? For example, is there a race condition when accessing the test and test2 structures in the Timer completion handlers in the following code?
import 'dart:async';

void main() {
  Map<String, int> test = {};
  List<int> test2 = [];

  Timer t1 = Timer(Duration(seconds: 1), () {
    test['a'] = 45;
    test2.add(1);
  });

  Timer t2 = Timer(Duration(seconds: 1), () {
    test['b'] = 67;
    test2.add(2);
  });

  Timer t3 = Timer(Duration(seconds: 2), () {
    print(test);
    print(test2);
  });
}
Or are Timer/Future completions processed synchronously by the main thread? Can the code within two callbacks be interwoven?

Each Dart isolate executes code in a single thread. Asynchronous code running in a single Dart isolate can run concurrently but not in parallel.
In general, if the callbacks themselves do asynchronous work, then they can be interleaved. Any await (which is equivalent to a Future.then() callback) is a point where execution returns to the event loop, interrupting your asynchronous function.
In your particular example, your callbacks are fully synchronous and cannot be interrupted. Your Timers probably will fire in a defined order since events are added to FIFO queues. However, that seems brittle, and I do not think that it would be a good idea to rely on callback ordering.
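For illustration, here is a minimal sketch (not from the original code) of what that interleaving looks like once the callbacks themselves await: each await yields to the event loop, so the writes from the two timer callbacks can end up interleaved.

import 'dart:async';

void main() {
  List<String> log = [];

  // Each callback awaits in the middle, so control returns to the event
  // loop between its two writes and the other callback can run.
  Timer(Duration(seconds: 1), () async {
    log.add('t1 first half');
    await Future.delayed(Duration(milliseconds: 10));
    log.add('t1 second half');
  });

  Timer(Duration(seconds: 1), () async {
    log.add('t2 first half');
    await Future.delayed(Duration(milliseconds: 10));
    log.add('t2 second half');
  });

  // Likely prints: [t1 first half, t2 first half, t1 second half, t2 second half]
  Timer(Duration(seconds: 2), () => print(log));
}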
Also see: Prevent concurrent access to the same data in Dart.

Related

Swift equivalent of Ruby’s Concurrent::Event?

The popular Concurrent-Ruby library has a Concurrent::Event class that I find wonderful. It very neatly encapsulates the idea of, “Some threads need to wait for another thread to finish something before proceeding.”
It only takes three lines of code to use:
One to create the object
One to call .wait to start waiting, and
One to call .set when the thing is ready.
All the locks and booleans you’d need to use to create this out of other concurrency primitives are taken care of for you.
To quote some of the documentation, along with a sample usage:
Old school kernel-style event reminiscent of Win32 programming in C++.
When an Event is created it is in the unset state. Threads can choose to
#wait on the event, blocking until released by another thread. When one
thread wants to alert all blocking threads it calls the #set method which
will then wake up all listeners. Once an Event has been set it remains set.
New threads calling #wait will return immediately.
require 'concurrent-ruby'

event = Concurrent::Event.new

t1 = Thread.new do
  puts "t1 is waiting"
  event.wait
  puts "event occurred"
end

t2 = Thread.new do
  puts "t2 calling set"
  event.set
end

[t1, t2].each(&:join)
which prints output like the following
t1 is waiting
t2 calling set
event occurred
(Several different orders are possible because it is multithreaded, but ‘t2 calling set’ always comes out before ‘event occurred’.)
Is there something like this in Swift on iOS?
I think the closest thing to that is the new async/await syntax in Swift 5.5. There's no equivalent of event.set, but await waits for something asynchronous to finish. A particularly nice expression of concurrency is async let, which proceeds concurrently but then lets you pause to gather up all the results of the async let calls:
async let result1 = // do something asynchronous
async let result2 = // do something else asynchronous at the same time
// ... and keep going...
// now let's gather up the results
return await (result1, result2)
You can achieve the result in your example using a Grand Central Dispatch DispatchSemaphore. This is a traditional counting semaphore: each call to signal increments the semaphore, and each call to wait decrements it, blocking if the result is less than zero until a corresponding signal arrives.
let semaphore = DispatchSemaphore(value: 0)
let q1 = DispatchQueue(label: "q1", target: .global(qos: .utility))
let q2 = DispatchQueue(label: "q2", target: .global(qos: .utility))

q1.async {
    print("q1 is waiting")
    semaphore.wait()
    print("event occurred")
}

q2.async {
    print("q2 calling signal")
    semaphore.signal()
}
Output:
q1 is waiting
q2 calling signal
event occurred
But this object won't work if you have multiple threads that want to wait: since each call to wait decrements the semaphore, the other tasks would remain blocked.
For that you can use a DispatchGroup. You call enter before you start a task in the group and leave when it is done. You can use wait to block until the group is empty; like your Ruby object, wait will not block if the group is already empty, and multiple threads can wait on the same group.
let group = DispatchGroup()
let q1 = DispatchQueue(label: "q1", target: .global(qos: .utility))
let q2 = DispatchQueue(label: "q2", target: .global(qos: .utility))

// Enter the group before dispatching any work so the waiter cannot
// observe an already-empty group and return immediately.
group.enter()

q1.async {
    print("q1 is waiting")
    group.wait()
    print("event occurred")
}

q2.async {
    print("q2 calling leave")
    group.leave()
}
Output:
q1 is waiting
q2 calling leave
event occurred
You generally want to avoid blocking threads on iOS if possible, since there is a risk of deadlocks, and if you block the main thread your whole app will become unresponsive. It is more common to use notify to schedule code to execute when the group becomes empty.
I understand that your code is simply a contrived example, but depending on what you actually want to do and your minimum supported iOS requirements, there may be better alternatives.
DispatchGroup to execute code when several asynchronous tasks are complete using notify rather than wait
Combine to process asynchronous events in a pipeline (iOS 13+)
Async/Await (iOS 15+)

Are synchronous functions in Dart executed atomically?

I understand that Dart is single-threaded and that within an isolate a function call is popped from the event loop queue and executed. There seem to be two cases, async and sync.
a) Async: An asynchronous function will run without interruption until it gets to the await keyword. At this point, it may release control of the instruction pointer or continue its routine. (i.e. async functions can be but are not required to be interrupted on await)
b) Sync: All instructions from setup -> body -> and teardown are executed without interruption. If this is the case, I would say that synchronous functions are atomic.
I have an event listener that may have multiple calls in the event loop queue. I think I have two options.
Using the synchronized package:
a) Async version:
import 'package:synchronized/synchronized.dart';

final Lock _lock = Lock();
...
() async {
  await _lock.synchronized(() async {
    if (_condition) {
      _signal.complete(status);
      _condition = !_condition;
    }
  });
}
b) Sync version:
() {
  if (_condition) {
    _signal.complete(status);
    _condition = !_condition;
  }
}
From my understanding of the Dart concurrency model, these are equivalent. I prefer b) because it is simpler. However, this requires that there cannot be a race condition between two calls to my sync event handler. I have used concurrency in languages with a GIL and with multithreading, but not with the event-loop paradigm.
a) Async: An asynchronous function will run without interruption until it gets to the await keyword. At this point, it may release control of the instruction pointer or continue its routine. (i.e. async functions can be but are not required to be interrupted on await)
await always yields. It's equivalent to setting up a Future.then() callback and returning to the Dart event loop.
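As a quick sketch of that equivalence (with hypothetical computeValue and use helpers, not from the original question), the following two functions behave the same way:

// Hypothetical helpers for illustration only.
Future<int> computeValue() async => 42;
void use(int value) => print(value);

// The await version is effectively sugar for registering a .then()
// callback and returning to the event loop.
Future<void> withAwait() async {
  var value = await computeValue();
  use(value);
}

Future<void> withThen() {
  return computeValue().then((value) {
    use(value);
  });
}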
For your simple example, there's no reason to use _lock.synchronized(). Synchronous code cannot be interrupted, and isolates (as their name implies) don't share memory. You would want some form of locking mechanism if your callback did asynchronous work and you needed to prevent concurrent asynchronous operations from being interleaved.
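To illustrate that last point, here is a hedged sketch (a hypothetical deposit handler and _balance field, reusing the synchronized package from the question): because the callback awaits in the middle, two invocations could otherwise interleave around that await, and the lock prevents it.

import 'package:synchronized/synchronized.dart';

final Lock _lock = Lock();
int _balance = 0; // hypothetical shared state

// Without the lock, two concurrent deposits could both read the old
// _balance before either writes, because the await yields to the event loop.
Future<void> deposit(int amount) => _lock.synchronized(() async {
  final current = _balance;
  await Future.delayed(Duration(milliseconds: 10)); // stands in for real async work
  _balance = current + amount;
});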

Timing of async callback

I want to get a better idea of the timing of the completion block from an internet download request, in this case Firebase. The following code example does not do anything, but it illustrates my questions.
Say I have 100 values in keysArray; there would be 100 async requests to Firebase, and the completion block will be executed 100 times.
func someFunction() {
    for keys in keysArray {
        loadDataFromFirebaseWithKey(completionHandler: { (success, data) in
            print(data)
            // Task A: some lengthy for loop
            for i in 0...10000 {
                print("A")
            }
            // Task B
            for i in 10001...20000 {
                print("B")
            }
        })
        // Task C
        for i in 20001...30000 {
            print("C")
        }
        // Task D
        for i in 30001...40000 {
            print("D")
        }
    }
    // Task E
    for i in 40001...50000 {
        print("E")
    }
    // Task F
    for i in 50001...60000 {
        print("F")
    }
}
The reason I am using such big for loops is to illustrate some time-consuming, non-async process. Here are the cases I was wondering about:
Say the program is halfway through task C: does it finish C and also D before going into the completion block to do A and B?
Say the program is halfway through task E: does it finish E and also F before going into the completion block to do A and B?
If tasks are running concurrently, they may preempt one another whenever the opportunity arises, and may even proceed with genuine parallelism given that all iOS devices since the iPhone 4S have multiple cores. There's no reason that any particular for loop will be at any specific point at the time of interruption.
If Firebase schedules its completion handlers on a serial queue then none of the handlers will overlap with any other.
If Firebase schedules its completion handlers on the main queue, and you're calling it from the main queue, neither its completion handlers nor your calling code will overlap with each other.
So, directly to answer:
yes if Firebase is scheduling completion handlers on the same queue as you called from and that queue is serial — which almost always means 'yes' if everything is main queue linked. Otherwise no.
same answer. There's no special concurrency magic to for loops. They're exactly as usurpable as any other piece of code.

Swift 3.0: Does calling `sync` on a queue after calling `async` block the queue?

I was going through the revisions in the Swift documentation and found the following:
If you need to capture and mutate an in-out parameter, use an explicit local copy, such as in multithreaded code that ensures all mutation has finished before the function returns.
func multithreadedFunction(queue: DispatchQueue, x: inout Int) {
    // Make a local copy and manually copy it back.
    var localX = x
    defer { x = localX }

    // Operate on localX asynchronously, then wait before returning.
    queue.async { someMutatingOperation(&localX) }
    queue.sync {}
}
I had two questions concerning this:
Does calling async and then calling sync block the queue?
Why would you call async in the first place if you wanted to wait? I always thought asynchronous tasks were to return immediately without waiting until the whole code block was executed. Shouldn't one call sync?
EDIT: Added a link to the document. BTW, I don't think whether the queue is serial or concurrent is too relevant here.

Launching multiple async futures in response to events

I would like to launch a fairly expensive operation in response to a user clicking on a canvas element.
mouseDown(MouseEvent e) {
  print("entering event handler");
  var future = new Future<int>(expensiveFunction);
  future.then((int value) => redrawCanvas(value));
  print("done event handler");
}

expensiveFunction() {
  for (int i = 0; i < 1000000000; i++) {
    // do something insane here
  }
}

redrawCanvas(int value) {
  // do stuff here
  print("redrawing canvas");
}
My understanding of M4 Dart is that this Future constructor should launch "expensiveFunction" asynchronously, i.e. on a different thread from the main one. And it does appear this way, as "done event handler" is immediately printed into my output window in the IDE, and then some time later "redrawing canvas" is printed. However, if I click on the element again, nothing happens until my "expensiveFunction" is done running from the previous click.
How do I use futures to simply launch a compute-intensive function on a new thread, such that I can have multiple of them queued up in response to multiple clicks, even if the first future is not complete yet?
Thanks.
As mentioned in a different answer, Futures are just a "placeholder for a value that is made available in the future". They don't necessarily imply concurrency.
Dart has a concept of isolates for concurrency. You can spawn an isolate to run some code in a parallel thread or process.
dart2js can compile isolates into Web Workers. A Web Worker can run in a separate thread.
Try something like this:
import 'dart:isolate';

// Stand-in for the real computation.
int doExpensiveThing() => 42;

// Runs in a separate isolate and sends its result back.
expensiveOperation(SendPort replyTo) {
  var result = doExpensiveThing();
  replyTo.send(result);
}

main() async {
  var receive = new ReceivePort();
  var isolate = await Isolate.spawn(expensiveOperation, receive.sendPort);
  var result = await receive.first;
  print(result);
}
(I haven't tested the above, but something like it should work.)
Event Loop & Event Queue
You should note that Futures are not threads. They do not run concurrently, and in fact, Dart is single-threaded. All Dart code runs in an event loop.
The event loop is a loop that runs as long as the current Dart isolate is alive. When you call main() to start a Dart application, the isolate is created, and it is no longer alive after the main method is completed and all items on the event queue are completed as well.
The event queue is the set of all functions that still need to finish executing. Because Dart is single threaded, all of these functions need to run one at a time. So when one item in the event queue is completed, another one begins. The exact timing and scheduling of the event queue is something that's way more complicated than I can explain myself.
Therefore, asynchronous processing is important to prevent the single thread from being blocked by some long running execution. In a UI, a long process can cause visual jankiness and hinder your app.
Futures
Futures represent a value that will be available sometime in the Future, hence the name. When a Future is created, it is returned immediately, and execution continues.
The callback associated with that Future (in your case, expensiveFunction) is "started" by being added to the event queue. When control returns to the event loop, the callback runs, and as soon as possible after that, so does the code registered with then().
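As a minimal sketch of that ordering (not part of the original answer):

main() {
  print('before');
  // The Future's computation is queued on the event loop and only runs
  // after the current synchronous code has finished.
  new Future<int>(() => 42).then((value) => print('future completed: $value'));
  print('after');
  // Prints: before, after, future completed: 42
}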
Streams
Because your Futures are by definition asynchronous, and you don't know when they return, you want to queue up your callbacks so that they remain in order.
A Stream is an object that emits events that can be subscribed to. When you write canvasElement.onClick.listen(...) you are asking for the onClick Stream of MouseEvents, which you then subscribe to with listen.
You can use Streams to queue up events and register a callback on those events to run the code you'd like.
What to Write
import 'dart:async';
import 'dart:html';

main() {
  // Used to add events to a stream.
  var controller = new StreamController<Future>();

  // Pause when we get an event so that we take one value at a time.
  var subscription;
  subscription = controller.stream.listen((_) => subscription.pause());

  var canvas = new CanvasElement();
  canvas.onClick.listen((MouseEvent e) {
    print("entering event handler");
    var future = new Future<int>(expensiveFunction);
    // Resume the subscription after redrawCanvas has been called.
    controller.add(future.then(redrawCanvas).then((_) => subscription.resume()));
    print("done event handler");
  });
}

expensiveFunction() {
  for (int i = 0; i < 1000000000; i++) {
    // do something insane here
  }
}

redrawCanvas(int value) {
  // do stuff here
  print("redrawing canvas");
}
Here we are queuing up our redrawCanvas callbacks by pausing after each mouse click, and then resuming after redrawCanvas has been called.
More Information
See also this great answer to a similar question.
A great place to start reading about Dart's asynchrony is the first part of this article about the dart:io library and this article about the dart:async library.
For more information about Futures, see this article about Futures.
For Streams information, see this article about adding to Streams and this article about creating Streams.
