This is a follow-up to Use of futures for async loading.
My WebGL/Dart program needs to create a lot of OpenGL data while initializing. It is all loaded asynchronously, and futures are used both to process data when it arrives and to know when all the required data has been loaded.
I'm having trouble loading textures, though. I have this code:
ImageElement img = new ImageElement();
img.onLoad.listen((e) {
  baseTexture = gl.createTexture();
  gl.bindTexture(TEXTURE_2D, baseTexture);
  gl.texImage2DImage(TEXTURE_2D, 0, RGBA, RGBA, UNSIGNED_BYTE, img);
  gl.texParameteri(TEXTURE_2D, TEXTURE_MIN_FILTER, LINEAR);
  gl.texParameteri(TEXTURE_2D, TEXTURE_MAG_FILTER, LINEAR);
});
img.src = "base.png";
This works fine. It loads the image and makes a texture from it when it arrives.
However, I need my main program to know when all my textures have arrived. Based on the previous question, I should use a future for each one and then use Future.wait to wait for them all to be ready.
However, loading images as above doesn't use futures; it uses a StreamSubscription, so I get no future back to wait on.
How can I get a Future that will let me know when my texture is created?
Can I create my own Future object and "signal" it in the callback? If I can, it's not at all clear to me from the documentation how to do that.
You can indeed "signal" that a future is complete manually, using a Completer:
Future<Results> costlyQuery() {
  var completer = new Completer();
  database.query("SELECT * FROM giant_table", (results) {
    // when complete
    completer.complete(results);
  });
  // this returns essentially immediately,
  // before query is finished
  return completer.future;
}
The future completes when completer.complete(results); is executed.
From here: http://blog.sethladd.com/2012/03/using-futures-in-dart-for-better-async.html
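Applied to the texture loading from the question, a minimal sketch might look like this (assuming gl and the GL constants are in scope exactly as in the original snippet, and using the same Dart-1-era syntax):
Future<Texture> loadTexture(String url) {
  var completer = new Completer<Texture>();
  ImageElement img = new ImageElement();
  img.onLoad.listen((e) {
    var texture = gl.createTexture();
    gl.bindTexture(TEXTURE_2D, texture);
    gl.texImage2DImage(TEXTURE_2D, 0, RGBA, RGBA, UNSIGNED_BYTE, img);
    gl.texParameteri(TEXTURE_2D, TEXTURE_MIN_FILTER, LINEAR);
    gl.texParameteri(TEXTURE_2D, TEXTURE_MAG_FILTER, LINEAR);
    // Signal that this texture is ready.
    completer.complete(texture);
  });
  img.src = url;
  // Returned right away; completes once onLoad fires.
  return completer.future;
}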
You can also use the first property of the Stream class, which returns a Future<T>.
https://api.dartlang.org/docs/channels/stable/latest/dart_async/Stream.html#first
ImageElement img = new ImageElement();
Future future = img.onLoad.first.then((e) {
...
});
img.src = "base.png";
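Either way you end up with one Future per texture, which is exactly what Future.wait needs. A short sketch, using a loadTexture helper like the one above (the file names are illustrative):
Future.wait([
  loadTexture("base.png"),
  loadTexture("detail.png"),
  loadTexture("normal.png"),
]).then((textures) {
  // All textures have been created; safe to start rendering.
});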
I use the WebAudio API, and basically my setup is fairly simple.
I use one AudioWorkletNode as an emitter and another one as a receiver.
emitter:
process(inputs, outputs) {
  inputs[ 0 ].length && this.port.postMessage( inputs[ 0 ] );
  return ( true );
}
receiver:
inputs = [ new Float32Array(128), new Float32Array(128) ]

constructor() {
  super();
  // Create a message port to receive messages from the main thread
  this.port.onmessage = (event) => {
    this.inputs = event.data.inputs;
  };
}

process(inputs, outputs) {
  const output = outputs[0];
  for (let channel = 0; channel < output.length; ++channel) {
    output[ channel ].set( this.inputs[ channel ] );
  }
  return true;
}
On the client side I have:
//emitter
this.inputWorklet.port.onmessage = e => this.receiverWorklet.port.postMessage( { inputs: e.data } );
and for receiving the data I have connected the nodes together
this.receiverWorklet.connect( this.gainNode );
This works, but my problem is that the sound is really glitchy.
One thing I thought of is that there might be a delay between events, and also that the messages pass through the DOM (main-thread) context.
Do you have any ideas how I could achieve smooth, continuous playback of the stream?
Or maybe another technique?
The reason for the glitchy audio is that your code only works if everything always happens in the exact same order.
1. The input worklet's process() function needs to be called; it sends an event.
2. The event needs to pass through the main thread.
3. The event needs to arrive at the receiver worklet.
4. Only then can the receiver worklet's process() function play the new data.
Since there is no buffer, everything always has to happen in exactly that order. If the main thread is busy for some reason and can't forward the events right away, the receiver just keeps playing the old audio.
I think you can almost keep the current implementation by buffering a few events in your receiver worklet before you start playing. It will of course also add some latency.
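A minimal sketch of that buffering idea (the class name and the queue threshold of 4 blocks are illustrative, not taken from your code):
class BufferedReceiver extends AudioWorkletProcessor {
  constructor() {
    super();
    this.queue = [];      // queued blocks, each an array of Float32Arrays
    this.started = false; // don't start playing until a small backlog exists
    this.port.onmessage = (event) => {
      this.queue.push(event.data.inputs);
    };
  }

  process(inputs, outputs) {
    const output = outputs[0];
    // Wait for a few blocks so main-thread jitter can be absorbed.
    if (!this.started && this.queue.length < 4) {
      return true;
    }
    this.started = true;

    const block = this.queue.shift();
    for (let channel = 0; channel < output.length; ++channel) {
      if (block && block[channel]) {
        output[channel].set(block[channel]);
      }
      // If the queue ran dry, this render quantum stays silent.
    }
    return true;
  }
}

registerProcessor('buffered-receiver', BufferedReceiver);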
Another approach would be to use a SharedArrayBuffer instead of sending events. Your input worklet would write to the SharedArrayBuffer and your receiver worklet would read from it.
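A rough mono sketch of the SharedArrayBuffer idea, with no overflow handling (all names are illustrative; the main thread would allocate the buffer, e.g. new SharedArrayBuffer(8 + ringLength * 4), and post it to both worklets, and the page has to be cross-origin isolated for SharedArrayBuffer to be available):
class RingWriter extends AudioWorkletProcessor {
  constructor() {
    super();
    this.port.onmessage = (e) => {
      // indices[0] = write position, indices[1] = read position
      this.indices = new Int32Array(e.data.sab, 0, 2);
      this.ring = new Float32Array(e.data.sab, 8);
    };
  }

  process(inputs) {
    const input = inputs[0][0]; // first channel of the first input
    if (!this.ring || !input) return true;
    let w = Atomics.load(this.indices, 0);
    for (let i = 0; i < input.length; i++) {
      this.ring[w] = input[i];
      w = (w + 1) % this.ring.length;
    }
    Atomics.store(this.indices, 0, w);
    return true;
  }
}

class RingReader extends AudioWorkletProcessor {
  constructor() {
    super();
    this.port.onmessage = (e) => {
      this.indices = new Int32Array(e.data.sab, 0, 2);
      this.ring = new Float32Array(e.data.sab, 8);
    };
  }

  process(inputs, outputs) {
    const output = outputs[0][0];
    if (!this.ring || !output) return true;
    let r = Atomics.load(this.indices, 1);
    const w = Atomics.load(this.indices, 0);
    for (let i = 0; i < output.length; i++) {
      if (r === w) break; // buffer is empty: leave the rest silent
      output[i] = this.ring[r];
      r = (r + 1) % this.ring.length;
    }
    Atomics.store(this.indices, 1, r);
    return true;
  }
}

registerProcessor('ring-writer', RingWriter);
registerProcessor('ring-reader', RingReader);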
I'm using Vaadin 23 and am building an application which shows different images that are changed and loaded at runtime. I need the exact display size of the image to update other information based on it. The server from which I get the image is quite slow, so it can take up to 10 seconds until the new image is downloaded by the client.
Image doesn't have a function like:
myImage.addSrcChangedAndClientUpdatedListener(...);
Is there any other way to get a callback in Java when the src attribute changes and the new image has been downloaded by the client?
My workaround is to wait 10 seconds and then get the image size with an async JavaScript call.
But sometimes the image is ready after 2 seconds -> my implementation waits 8 seconds unnecessarily, which leads to a bad customer experience.
Sometimes the image download even takes 20 seconds -> my implementation doesn't work.
I don't see a proper solution. Any ideas? Can I implement a custom Java callback function based on custom JavaScript code? What would such a solution look like?
Using Flow's Element API you can create a listener for the load DOM event that an img element fires after it has finished loading the image data. Using the Element API you can also specify additional details that should be sent to the server as part of the event data. Here is a simple example that allows switching images and then logs the image's size on the server:
public class ImageView extends Div {
    private static final String IMAGE_1 = "https://imgs.search.brave.com/bjAqtSxNjNFNHm384o53EB6Zrv85eGtWpmspBqc98Yk/rs:fit:592:225:1/g:ce/aHR0cHM6Ly90c2Uy/Lm1tLmJpbmcubmV0/L3RoP2lkPU9JUC5X/ZzNXajk1SDl6VTNw/SzNMY2dwT2xRSGFG/NyZwaWQ9QXBp";
    private static final String IMAGE_2 = "https://imgs.search.brave.com/v5uhvAoiVtMDM8UPI8vk8XMELrIRKW1fVLkRsiuqnU0/rs:fit:844:225:1/g:ce/aHR0cHM6Ly90c2Ux/Lm1tLmJpbmcubmV0/L3RoP2lkPU9JUC5O/OEV3U1psZlNZNmph/cmR1cm4xckZBSGFF/SyZwaWQ9QXBp";

    public ImageView() {
        Image image = new Image();
        Button setImage1 = new Button("Set image 1", e -> image.setSrc(IMAGE_1));
        Button setImage2 = new Button("Set image 2", e -> image.setSrc(IMAGE_2));
        Span imageData = new Span();

        image.getElement().addEventListener("load", loadEvent -> {
            JsonObject eventData = loadEvent.getEventData();
            Number width = eventData.getNumber("element.clientWidth");
            Number height = eventData.getNumber("element.clientHeight");
            imageData.setText(String.format("Width: %s | Height: %s", width, height));
        })
        .addEventData("element.clientWidth")
        .addEventData("element.clientHeight");

        add(image);
        add(new Div(imageData));
        add(new Div(setImage1, setImage2));
    }
}
Reference:
Flow Element API - Listening to User Events
Window: load event
I have a CameraPreview that fills the whole screen, with a FloatingActionButton at the bottom to take a picture.
In the onPressed method of the button, I'm making a network call whose result I don't care about (yet). So I would like everything done inside that method to happen asynchronously, so it does not block my main thread.
That means (if I get it right) that I should not use the await keyword.
This is the code in my onPressed:
// Attempt to take a picture and log where it's been saved
await controller.takePicture(path);
print("Done taking picture");
sendBase64ToAPI(path);
This is my sendBase64ToAPI method:
Future<String> sendBase64ToAPI(String path) async {
  File(path).readAsBytes().then(thenMethod);
  return null;
}

void thenMethod(List bytes) {
  print("Start reading file");
  Image image = decodeImage(bytes);
  int x = ((screenWidth / 2) + (overlayWidth / 2)).toInt();
  int y = ((screenHeight / 2) + (overlayHeight / 2)).toInt();
  print("Start cropping image");
  image = copyCrop(image, x, y, overlayWidth, overlayHeight);
  var base64Str = base64.encode(image.getBytes());
  print("Done");
  print(base64Str.substring(0, 30));
  print(base64Str.substring(base64Str.length - 30, base64Str.length - 1));
}
My UI is completely frozen between 'Start reading file' and 'Start cropping image', even though those are async methods called without await, so that shouldn't happen.
Why are those methods not executing asynchronously?
Okay, this seems to be a known issue with the library I'm using.
The documentation recommends running the decodeImage function in an Isolate.
I'll keep the question open for a few days in case someone spots what is synchronous in my code.
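Async alone doesn't help here, because decodeImage is CPU-bound and runs synchronously on whatever isolate calls it; skipping await only changes scheduling, not where the work runs. A minimal sketch of moving the heavy work to another isolate with Flutter's compute helper (assuming the pre-null-safety image package API used in the question; helper names like decodeAndEncode are illustrative):
import 'dart:convert' show base64;
import 'dart:io' show File;

import 'package:flutter/foundation.dart' show compute;
import 'package:image/image.dart';

// Must be a top-level (or static) function so compute() can run it
// on a background isolate.
String decodeAndEncode(List<int> bytes) {
  Image image = decodeImage(bytes);
  // copyCrop(...) could also go here, with the crop rectangle passed in.
  return base64.encode(encodePng(image));
}

Future<String> sendBase64ToAPI(String path) async {
  var bytes = await File(path).readAsBytes();
  // The UI isolate only awaits the result; decode/encode happen elsewhere.
  return compute(decodeAndEncode, bytes);
}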
I'm attempting to make a sample application for iOS using ActionScript (Adobe AIR), but I'm having problems with events/event handling. My app basically gives the user the option to take a picture or select one from the camera roll to upload to a server. If the user decides to take a photo, I have to save that photo to the device's camera roll. The part of the code I'm having problems with is below:
private function readMediaData():void {
    //set up some variables and data
    var file:File = tempDir.resolvePath(filename);
    var stream:FileStream = new FileStream();
    stream.open(file, FileMode.WRITE);
    stream.writeBytes(imageBytes);
    stream.close();
    file.addEventListener(Event.COMPLETE, uploadComplete, false, 0, true);
    //upload file here
}

private function uploadComplete(event:Event):void {
    //selectedImage is the MediaPromise
    if (selectedImage.file == null) {
        loader = new Loader();
        loader.contentLoaderInfo.addEventListener(Event.COMPLETE, loaderCompleted);
        loader.loadFilePromise(selectedImage);
    }
}

private function loaderCompleted(event:Event):void {
    //save image
}
The upload is working fine, but once the upload is completed, I get a somewhat infinite loop between loaderCompleted and uploadComplete, resulting in multiple images being uploaded to the server. I tried removing the listener for the file once it has entered the uploadComplete function, but I still get the same result. My guess is that once the event listener for the loader is registered, it triggers an Event.COMPLETE notification which both methods (or objects) still recognize. Is there a way to properly handle events of the same type coming from different objects and with different listener functions?
Try adding the listener to the stream instead of to the file:
// Changed 'file' to 'stream'
stream.addEventListener(Event.COMPLETE, uploadComplete, false, 0, true);
I'm writing a Mozilla Jetpack-based add-on that has to run whenever a document is loaded. For "top-level documents" this mostly works using this code (ObserverService = require('observer-service')):
this.endDocumentLoadCallback = function (subject, data) {
  console.log('loaded: ' + subject.location);
  try {
    server.onEndDocumentLoad(subject);
  }
  catch (e) {
    console.error(formatTraceback(e));
  }
};

ObserverService.add("EndDocumentLoad", this.endDocumentLoadCallback);
But the callback doesn't get called when the user opens a new tab using middle click, or (more importantly!) for frames. And even this much I only found out by reading the source of another extension, not through the documentation.
So how do I register a callback that really gets called every time a document is loaded?
Edit: This seems to do what I want:
function callback(event) {
  // this is the content document of the loaded page.
  var doc = event.originalTarget;
  if (doc instanceof Ci.nsIDOMNSHTMLDocument) {
    // is this an inner frame?
    if (doc.defaultView.frameElement) {
      // Frame within a tab was loaded.
      console.log('!!! loaded frame:', doc.location.href);
    }
    else {
      console.log('!!! loaded top level document:', doc.location.href);
    }
  }
}

var wm = Cc["@mozilla.org/appshell/window-mediator;1"].getService(Ci.nsIWindowMediator);
var mainWindow = wm.getMostRecentWindow("navigator:browser");
mainWindow.gBrowser.addEventListener("load", callback, true);
Got it partially from here: https://developer.mozilla.org/en/XUL_School/Intercepting_Page_Loads
@kizzx2 you are better served with #jetpack
To the original question: why don't you use the tab-browser module? Something like this:
var browser = require("tab-browser");

exports.main = function main(options, callbacks) {
  initialize(function (config) {
    browser.whenContentLoaded(
      function(window) {
        // something to do with the window
        // e.g., if (window.location.href === "something")
      }
    );
  });
};
Much cleaner than what you do, IMHO, and (until we have an official pageMods module) the supported way to do this.
As of Addon SDK 1.0, the proper way to do this is to use the page-mod module.
(Under the hood it's implemented using the document-element-inserted observer service notification, which you can use in a regular extension or if page-mod doesn't suit you.)
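For reference, a minimal page-mod sketch (the include pattern and script are illustrative, and exact option behaviour depends on the SDK version):
var pageMod = require("page-mod");

pageMod.PageMod({
  // Match every document the add-on should react to.
  include: "*",
  // Run once the DOM of each matching document is ready.
  contentScriptWhen: "ready",
  contentScript: 'console.log("loaded: " + document.location.href);'
});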