How to use suspendCoroutine to turn a Java 7 Future into a Kotlin suspending function

What is the best approach to wrap Java 7 Futures inside a Kotlin suspend function?
Is there a way to convert a method returning Java 7 Futures into a suspending function?
The process is pretty straightforward for arbitrary callbacks or Java 8 CompletableFutures, as illustrated for example here:
* https://github.com/Kotlin/kotlin-coroutines/blob/master/kotlin-coroutines-informal.md#suspending-functions
In these cases, there is a hook that is triggered when the future is done, so it can be used to resume the continuation as soon as the value of the future is ready (or an exception is triggered).
Java 7 futures however don't expose a method that is invoked when the computation is over.
Converting a Java 7 future to a Java 8 completable future is not an option in my codebase.
Of course, I can create a suspend function that calls future.get(), but that would block, which defeats the purpose of using coroutine suspension.
Another option would be to submit a Runnable to a thread-pool executor, call future.get() inside it, and invoke a callback when it returns. This wrapper makes the code look "non-blocking" from the consumer's point of view, since the coroutine can suspend, but under the hood we are still writing blocking code and creating a thread just for the sake of blocking it.
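Something along these lines (just a rough sketch to illustrate the idea; the executor and function names are mine):
import java.util.concurrent.Executors
import java.util.concurrent.Future
import kotlin.coroutines.resume
import kotlin.coroutines.resumeWithException
import kotlin.coroutines.suspendCoroutine

// a pool whose only job is to be blocked on Future.get()
private val blockingPool = Executors.newCachedThreadPool()

suspend fun <T> Future<T>.awaitBlockingly(): T = suspendCoroutine { cont ->
    blockingPool.execute {
        try {
            cont.resume(get()) // blocks a pool thread, not the caller
        } catch (e: Throwable) {
            cont.resumeWithException(e)
        }
    }
}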

A Java 7 Future is blocking. It is not designed for asynchronous APIs and does not provide any way to install a callback that is invoked when the future completes. This means there is no direct way to use suspendCoroutine with it, because suspendCoroutine is designed for use with asynchronous, callback-based APIs.
However, if your code is, in fact, running under JDK 8 or newer, there is a good chance that the actual Future instance in your code happens to implement the CompletionStage interface at run time. You can try to cast it to CompletionStage and use the ready-made CompletionStage.await extension from the kotlinx-coroutines-jdk8 module of the kotlinx.coroutines library.
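If you want to try that, a sketch of the cast-and-await approach could look like this (awaitIfStage is just an illustrative name, not part of the library):
import java.util.concurrent.CompletionStage
import java.util.concurrent.Future
import kotlinx.coroutines.future.await

suspend fun <T> Future<T>.awaitIfStage(): T {
    // succeeds only if the concrete Future also implements CompletionStage
    @Suppress("UNCHECKED_CAST")
    val stage = this as? CompletionStage<T>
        ?: error("This Future does not implement CompletionStage at run time")
    return stage.await() // CompletionStage.await from kotlinx-coroutines-jdk8
}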

Of course Roman is right that a Java Future does not let you provide a callback for when the work is done.
However, it does give you a way to check if the work is done, and if it is, then calling .get() won't block.
Luckily for us, we also have a cheap way to divert a thread to quickly do a poll check via coroutines.
Let's write that polling logic and also vend it as an extension method:
suspend fun <T> Future<T>.wait(): T {
    while (!isDone)
        delay(1) // or whatever you want your polling frequency to be
    return get()
}
Then to use:
fun someBlockingWork(): Future<String> { ... }

suspend fun useWork() {
    val result = someBlockingWork().wait()
    println("Result: $result")
}
So we have millisecond response time to our Futures completing, without using any extra threads.
And of course you'll want to add some upper bound to use as a timeout so you don't end up waiting forever. In that case, we can update the code just a little:
suspend fun <T> Future<T>.wait(timeoutMs: Int = 60000): T? {
    val start = System.currentTimeMillis()
    while (!isDone) {
        if (System.currentTimeMillis() - start > timeoutMs)
            return null
        delay(1)
    }
    return get()
}

You should now be able to do this by creating another coroutine in the same scope that cancels the Future when the coroutine is cancelled:
withContext(Dispatchers.IO) {
    val future = getSomeFuture()
    coroutineScope {
        // helper coroutine that only waits for cancellation of this scope
        // and propagates it to the Future
        val cancelJob = launch {
            suspendCancellableCoroutine<Unit> { cont ->
                cont.invokeOnCancellation {
                    future.cancel(true)
                }
            }
        }
        // block this IO thread until the Future completes, then stop the helper
        future.get().also {
            cancelJob.cancel()
        }
    }
}
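If you need this in more than one place, the same idea can be packaged as a reusable extension, roughly like this (awaitCancellably is an illustrative name; it must still be called from a dispatcher where blocking is acceptable, such as Dispatchers.IO):
import java.util.concurrent.Future
import kotlinx.coroutines.coroutineScope
import kotlinx.coroutines.launch
import kotlinx.coroutines.suspendCancellableCoroutine

suspend fun <T> Future<T>.awaitCancellably(): T = coroutineScope {
    val future = this@awaitCancellably
    // helper coroutine that cancels the Future if this scope is cancelled
    val cancelJob = launch {
        suspendCancellableCoroutine<Unit> { cont ->
            cont.invokeOnCancellation { future.cancel(true) }
        }
    }
    try {
        future.get() // still blocking under the hood
    } finally {
        cancelJob.cancel()
    }
}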

Related

How to restore runOn Scheduler used in previous operator?

Folks, is it possible to obtain the currently used Scheduler within an operator?
The problem I have is that Mono.fromFuture() is executed on a native thread (the AWS CRT HTTP client in my case). As a result, all subsequent operators are also executed on that thread, and later code wants to obtain the class loader context, which is obviously null. I realize that I can call .publishOn(originalScheduler) after .fromFuture(), but I don't know what scheduler is used to materialize the Mono returned by my function.
Is there elegant way to deal with this?
fun myFunction(): Mono<String> {
    return Mono.just("example")
        .flatMap { value ->
            Mono.fromFuture {
                // invocation of 3rd party library that executes Future on the thread created in native code.
            }
        }
        .map {
            val resource = Thread.currentThread().getContextClassLoader().getResources("META-INF/services/blah_blah");
            // NullPointerException because Thread.currentThread().getContextClassLoader() returns NULL
            resource.asSequence().first().toString()
        }
}
It is not possible, because there's no guarantee that there is a Scheduler at all.
The place where the subscription is made and the data starts flowing could simply be a Thread. There is no mechanism in Java that allows an external actor to submit a task to an arbitrary thread (you have to provide the Runnable at Thread construction).
So no, there's no way of "returning to the previous Scheduler".
Usually, this shouldn't be an issue at all. If your code is reactive it should also be non-blocking, and thus able to "share" whichever thread it currently runs on with other computations.
If your code is blocking, it should off-load the work to a blocking-compatible Scheduler anyway, one which you should explicitly choose. Typically: publishOn(Schedulers.boundedElastic()). This is also true for CPU-intensive tasks, by the way.
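A sketch of what that could look like for the code above (callThirdPartyLibrary and doClassLoaderDependentWork are placeholder names standing in for the question's code, not real APIs):
import reactor.core.publisher.Mono
import reactor.core.scheduler.Schedulers
import java.util.concurrent.CompletableFuture

// placeholders for the third-party call and the class-loader-dependent work
fun callThirdPartyLibrary(): CompletableFuture<String> = TODO()
fun doClassLoaderDependentWork(value: String): String = TODO()

fun myFunction(): Mono<String> =
    Mono.fromFuture { callThirdPartyLibrary() } // completes on the native thread
        // explicitly hop to a Scheduler you chose before doing anything that
        // relies on the context class loader
        .publishOn(Schedulers.boundedElastic())
        .map { value -> doClassLoaderDependentWork(value) }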

Are synchronous functions in Dart executed atomically?

I understand that Dart is single-threaded and that, within an isolate, a function call is popped from the event loop queue and executed. There seem to be two cases, async and sync.
a) Async: An asynchronous function will run without interruption until it gets to the await keyword. At this point, it may release control of the instruction pointer or continue its routine. (i.e. async functions can be but are not required to be interrupted on await)
b) Sync: All instructions from setup -> body -> and teardown are executed without interruption. If this is the case, I would say that synchronous functions are atomic.
I have an event listener that may have multiple calls in the event loop queue. I think I have two options.
Using the Synchronized package
a) Async version:
import 'package:synchronized/synchronized.dart';

final Lock _lock = Lock();
...
() async {
  await _lock.synchronized(() async {
    if (_condition) {
      _signal.complete(status);
      _condition = !_condition;
    }
  });
}
b) Sync version:
() {
  if (_condition) {
    _signal.complete(status);
    _condition = !_condition;
  }
}
From my understanding of the Dart concurrency model these are equivalent. I prefer b) because it is simple. However, this requires that there cannot be a race condition between two calls to my sync event handler. I have used concurrency in languages with GIL and MT but not with the event-loop paradigm.
a) Async: An asynchronous function will run without interruption until it gets to the await keyword. At this point, it may release control of the instruction pointer or continue its routine. (i.e. async functions can be but are not required to be interrupted on await)
await always yields. It's equivalent to setting up a Future.then() callback and returning to the Dart event loop.
For your simple example, there's no reason to use _lock.synchronized(). Synchronous code cannot be interrupted, and isolates (as their name imply) don't share memory. You would want some form of locking mechanism if your callback did asynchronous work and you needed to prevent concurrent asynchronous operations from being interleaved.

Replace Future.then() with async/await

I've always considered async/await more elegant/sexy than the Futures API, but now I'm faced with a situation where the Future API implementation is very short and concise and the async/await alternative seems verbose and ugly.
I marked my two questions #1 and #2 in the comments:
class ItemsRepository {
  Future<dynamic> item_int2string;

  ItemsRepository() {
    // #1
    item_int2string =
        rootBundle.loadString('assets/data/item_int2string.json').then(jsonDecode);
  }

  Future<String> getItem(String id) async {
    // #2
    return await item_int2string[id];
  }
}
#1: How do I use async/await here instead of Future.then()? What's the most elegant solution?
#2: Is this efficient if the method is called a lot? How much overhead does await add? Should I make the resolved future an instance variable, aka
completedFuture ??= await item_int2string;
return completedFuture[id];
1: How do I use async/await here instead of Future.then()? What's the most elegant solution?
async methods are contagious. That means your ItemsRepository method has to be async in order to use await inside. This also means you have to call it asynchronously from other places. See example:
Future<dynamic> ItemsRepository() async {
  // #1
  myString = await rootBundle.loadString('assets/data/item_int2string.json');
  // do something with my string here, which is not in a Future anymore...
}
Note that using .then is absolutely the same as await in an async function. It is just syntactic sugar. Note that you would use .then differently than in your example, though:
ItemsRepository() {
  // #1
  rootBundle.loadString('assets/data/item_int2string.json').then((String myString) {
    // do something with myString here, which is not in a Future anymore...
  });
}
As for #2, don't worry about the performance impact of async code. The code will be executed at the same speed as synchronous code, just later, whenever the callback happens. The only reason async exists is to provide an easy way of letting other code continue running while the system waits for the result of the asynchronously called portion, for example so the UI is not blocked while waiting for the disk to load a file.
I recommend you read the basic docs about async in Dart.
then and await are different: await suspends the enclosing async function until the Future completes, while then registers a callback and lets the rest of the function continue.
If you want the rest of your code to wait for the Future's result, use await. If you want your code to continue running while the Future does its work "in the background", then use then.

Stop arbitrary function execution

For the purposes of this question, assume that I need to run some function on some object and that function will take a long time to execute (minutes). Also assume that I have no control over this function (*). How do I now cancel this function's execution?
I want to run it in a background thread to keep the main thread free, and I could do that with GCD, NSOperation or NSThread. However, as far as I know, none of these support forced stopping. They can all be cancelled, but this cancellation must be implemented in the function itself - and I don't have access to that function, so I can't do that. The closest I got was using NSThread and exit(), but unfortunately exit() can't be called on an instance (see the code example). My current plan is to try to send a notification, observe it within the object/function, and kill the thread from within using Thread.exit(). I'm just wondering if there is a "cleaner" or easier way, either built-in or 3rd party.
let someObject = Object()
// Using GCD
dispatchQueue.async { someObject.expensiveFunction() }
// Using NSOperation
operationQueue.addOperation { someObject.expensiveFunction() }
// Using NSThread
let thread = Thread { someObject.expensiveFunction() }
thread.exit() // exit is not available on an instance
(*) In this case I do have control over the function and could implement an actual cancellation, but due to the libraries I'm using, this would require a lot of refactoring.

How to do multithreading, concurrency or parallelism in iOS Swift?

Is there any way to create a worker thread in Swift? For example, if there's a major piece of functionality that requires a lot of calculations and hence causes the main thread to stall for a few seconds, is there any way to move that functionality to a separate thread, or to a thread that does not block the main thread?
I've gone through the basic and advanced sections of the Apple documentation for Swift, but there's nothing about concurrency or parallelism. Does anyone know how to do it (if it's possible)?
Or you can use operation queues, too. In Swift 3:
let queue = OperationQueue()

queue.addOperation() {
    // do something in the background

    OperationQueue.main.addOperation() {
        // when done, update your UI and/or model on the main queue
    }
}
Either this or GCD, which Andy illustrated, works fine.
See Apple's Concurrency Programming Guide for the relative merits of operation queues and dispatch queues (aka Grand Central Dispatch, GCD). While that guide is still illustrating the examples using Objective-C, the API and concepts are basically the same in Swift (just use the Swift syntax). The documentation for GCD and operation queues in Xcode describes both Objective-C and Swift APIs.
By the way, you'll notice that in both the above example and Andy's GCD demonstration, we used "trailing closures". For example, if you look at the definition of addOperation (known as addOperationWithBlock in Objective-C), it is defined as a function with one parameter, which is a "closure" (analogous to a block in Objective-C):
func addOperation(_ block: @escaping () -> Swift.Void)
That might lead you to assume that you would invoke it as follows:
queue.addOperation({
    // do something in the background
})
But when the last parameter of a function is a closure, the trailing closure syntax allows you to take that final closure parameter out of the parentheses of the function, and move it after the function, yielding:
queue.addOperation() {
    // do something in the background
}
And because there's nothing left in the parentheses, you can even go one step further, and remove those empty parentheses:
queue.addOperation {
    // do something in the background
}
Hopefully that illustrates how to interpret the NSOperationQueue/OperationQueue and/or GCD function declarations and use them in your code.
You can use Grand Central Dispatch (GCD) for such tasks.
This is a basic example:
let backgroundQueue: dispatch_queue_t = dispatch_queue_create("com.a.identifier", DISPATCH_QUEUE_CONCURRENT)

// can be called as often as needed
dispatch_async(backgroundQueue) {
    // do calculations
}

// no dispatch_release is needed: dispatch objects are managed by ARC in Swift
This library (HoneyBee) lets you describe concurrency in a super expressive way:
func handleError(_ error: Error) { ... }

HoneyBee.start(on: DispatchQueue.main) { root in
    root.setErrorHandler(handleError)
        .chain(function1) // runs on main queue
        .setBlockPerformer(DispatchQueue.global())
        .chain(function2) // runs on background queue
        .branch { stem in
            stem.chain(func3) // runs in parallel with func4
            +
            stem.chain(func4) // runs in parallel with func3
        }
        .chain(func5) // runs after func3 and func4 have finished
        .setBlockPerformer(DispatchQueue.main)
        .chain(updateUIFunc)
}
Here is the best resource to learn in detail about Concurrency.
