How to call asynchronous function before event.respondWith in Service Worker? - service-worker

This doesn't work:
self.addEventListener('fetch', async (event) => {
  const url = await localforage.getItem('url'); // url change if page changed
  if (event.request.url === url) {
    event.respondWith(handleFetch(event));
  }
});

const handleFetch = async (event) => {
  ....
}
If I move the URL lookup inside event.respondWith, it works, but then it runs for every request. I only need the service worker to handle the fetch when the URL matches; if it doesn't match, it should do nothing.
self.addEventListener('fetch', async (event) => {
  event.respondWith(handleFetch(event));
});

const handleFetch = async (event) => {
  const url = await localforage.getItem('url'); // url change if page changed
  if (event.request.url === url) {
    ....
  }
}

This is intentional. Your decision as to whether or not to call event.respondWith() needs to be done synchronously, within the top-level execution of your fetch handler.
This allows you to do things like examine the request URL and headers synchronously, but it does preclude you from performing asynchronous lookups against things like IndexedDB.
If you can't transition your criteria to use something synchronous, then your best option is to call event.respondWith() unconditionally, and when the criteria aren't met, use return fetch(event.request) to come as close as you can to the "default" behavior you'd get if you didn't respond at all. (That's basically what you're doing in the second example.)
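A minimal sketch of that unconditional pattern, assuming (as in the question) that localforage is available inside the worker and that handleFetch builds the custom response:

self.addEventListener('fetch', (event) => {
  event.respondWith((async () => {
    // Asynchronous lookups are fine once we're already inside respondWith().
    const url = await localforage.getItem('url');
    if (event.request.url === url) {
      return handleFetch(event);
    }
    // Criteria not met: fall back to the network, which approximates the
    // default behavior you'd get by not responding at all.
    return fetch(event.request);
  })());
});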

Related

Manifest v3 extension: asynchronous event listener does not keep the service worker alive [duplicate]

I am trying to pass messages between a content script and the extension.
Here is what I have in the content script:
chrome.runtime.sendMessage({type: "getUrls"}, function(response) {
  console.log(response)
});
And in the background script I have
chrome.runtime.onMessage.addListener(
  function(request, sender, sendResponse) {
    if (request.type == "getUrls") {
      getUrls(request, sender, sendResponse)
    }
  });

function getUrls(request, sender, sendResponse) {
  var resp = sendResponse;
  $.ajax({
    url: "http://localhost:3000/urls",
    method: 'GET',
    success: function(d) {
      resp({urls: d})
    }
  });
}
Now, if I send the response before the ajax call in the getUrls function, the response is sent successfully. But when I send the response in the success callback of the ajax call, it doesn't get sent; when I debug, I can see that the port is null inside the code for the sendResponse function.
From the documentation for chrome.runtime.onMessage.addListener:
This function becomes invalid when the event listener returns, unless you return true from the event listener to indicate you wish to send a response asynchronously (this will keep the message channel open to the other end until sendResponse is called).
So you just need to add return true; after the call to getUrls to indicate that you'll call the response function asynchronously.
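Applied to the background script above, that looks roughly like this (only the return true line is new):

chrome.runtime.onMessage.addListener(
  function(request, sender, sendResponse) {
    if (request.type == "getUrls") {
      getUrls(request, sender, sendResponse);
      // Keep the message channel open so sendResponse can be
      // called later, from the ajax success callback.
      return true;
    }
  });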
The accepted answer is correct, I just wanted to add sample code that simplifies this.
The problem is that the API (in my view) is not well designed, because it forces us developers to know whether a particular message will be handled async or not. If you handle many different messages this becomes an impossible task, because you never know whether, deep down in some function, a passed-in sendResponse will be called async or not.
Consider this:
chrome.extension.onMessage.addListener(function (request, sender, sendResponse) {
  if (request.method == "method1") {
    handleMethod1(sendResponse);
  }
  ...
});
How can I know whether, deep down in handleMethod1, the call will be async or not? How can someone who modifies handleMethod1 know that introducing something async will break a caller?
My solution is this:
chrome.extension.onMessage.addListener(function (request, sender, sendResponseParam) {
  var responseStatus = { bCalled: false };

  function sendResponse(obj) { // dummy wrapper to deal with exceptions and detect async
    try {
      sendResponseParam(obj);
    } catch (e) {
      // error handling
    }
    responseStatus.bCalled = true;
  }

  if (request.method == "method1") {
    handleMethod1(sendResponse);
  }
  else if (request.method == "method2") {
    handleMethod2(sendResponse);
  }
  ...

  if (!responseStatus.bCalled) { // if it's set, the call wasn't async; else it is
    return true;
  }
});
This automatically handles the return value, regardless of how you choose to handle the message. Note that this assumes that you never forget to call the response function. Also note that Chromium could have automated this for us; I don't see why they didn't.
You can use my library https://github.com/lawlietmester/webextension to make this work in both Chrome and Firefox, the Firefox way, without callbacks.
Your code will look like:
Browser.runtime.onMessage.addListener( request => new Promise( resolve => {
  if( !request || typeof request !== 'object' || request.type !== "getUrls" ) return;

  $.ajax({
    'url': "http://localhost:3000/urls",
    'method': 'GET'
  }).then( urls => { resolve({ urls }); });
}) );

How can the Electron framework enable communication like an Ajax request?

ipcRenderer.sendSync may block the whole renderer process, while ipcRenderer.send requires using ipcRenderer.on to listen for the asynchronous reply.
So is there a way of communicating such that the data can be returned via a callback directly where it was requested?
It might look something like this: ipcRenderer.sendAsync('eventName', args, callback), or something along those lines.
ipcRenderer.on("onMessage", (e, {cbName, data}) => {
switch (cbName) {
case 'foo1':
foo1(data)
break
case 'foo2':
foo2(data)
break
case 'foo3':
foo2(data)
break
// more
default:
break
}
})
ipcRenderer.send("message", { cbName, /* other args */ })
What you need is ipcMain.handle() and ipcRenderer.invoke().
This will return a promise back to the renderer.
// Main process
ipcMain.handle('my-invokable-ipc', async (event, ...args) => {
  const result = await somePromise(...args)
  return result
})

// Renderer process
async () => {
  const result = await ipcRenderer.invoke('my-invokable-ipc', arg1, arg2)
  // ...
}
For further info
https://www.electronjs.org/docs/api/ipc-main#ipcmainhandlechannel-listener
What you're asking for, if I understand it correctly, is to pass a callback from the renderer process to the main process and have the handler in main call into it with the response rather than sending it back.
So rather than doing this:
main.js:
ipcMain.on("message", (e, arg) => {
e.sender.send("onMessage", "response");
});
renderer.js:
ipcRenderer.send("message", 1);
ipcRenderer.on("onMessage", (e, response) => { });
you want to do this:
main.js:
ipcMain.on("message", (e, arg, callback) => {
callback("response");
});
renderer.js:
function callback(response) { }
ipcRenderer.send("message", 1, callback);
Then no, you can't do that because you can't pass functions across processes like this. Even if you .toString() the function and then recreate the function in the main process via new Function(...), you would be executing it in the context of the main process, not the renderer, and presumably you'd want to execute it in the renderer process.
Using e.sender.send(...) is the idiomatic way of sending messages back to the other process, and you shouldn't shy away from it.
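That said, if you want callback-style ergonomics in the renderer, you can come close by wrapping ipcRenderer.invoke in a small helper. A rough sketch (sendAsync and the channel name are illustrative, not part of Electron's API; it assumes a matching ipcMain.handle registration in the main process):

// Renderer process: a hypothetical helper that adapts invoke() to a callback style.
function sendAsync(channel, args, callback) {
  ipcRenderer.invoke(channel, args)
    .then(result => callback(null, result))
    .catch(err => callback(err));
}

// Usage
sendAsync('my-invokable-ipc', { id: 1 }, (err, result) => {
  if (err) { console.error(err); return; }
  console.log(result);
});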

Service Workers and IndexedDB

In a simple JavaScript Service Worker I want to intercept a request and read a value from IndexedDB before calling event.respondWith.
But the asynchronous nature of IndexedDB does not seem to allow this.
Since indexedDB.open is asynchronous, we have to await it, which is fine. However, the callback (onsuccess) happens later, so the function exits immediately after the await on open.
The only way I have found to get it to work reliably is to add:
var wait = ms => new Promise((r, j) => setTimeout(r, ms));
await wait(50)
at the end of my readDB function to force a wait until the onsuccess has completed.
This is completely stupid!
And please don't even try to tell me about promises. They DO NOT WORK in this circumstance.
Does anyone know how we are supposed to use this properly?
A sample readDB is here (all error checking removed for clarity). Note: we cannot use await inside onsuccess, so the two inner IndexedDB calls are not awaited!
async function readDB(dbname, storeName, id) {
  var result;
  var request = await indexedDB.open(dbname, 1); // indexedDB.open is an asynchronous function
  request.onsuccess = function (event) {
    let db = event.target.result;
    var transaction = db.transaction([storeName], "readonly"); // This is also asynchronous and needs await
    var store = transaction.objectStore(storeName);
    var objectStoreRequest = store.get(id); // This is also asynchronous and needs await
    objectStoreRequest.onsuccess = function (event) {
      result = objectStoreRequest.result;
    };
  };
  // Without this wait, this function returns BEFORE the onsuccess has completed
  console.warn('ABOUT TO WAIT');
  var wait = ms => new Promise((r, j) => setTimeout(r, ms));
  await wait(50)
  console.warn('WAIT DONE');
  return result;
}
And please don't even try to tell me about promises. They DO NOT WORK in this circumstance.
...
...
...
I mean, they do, though. Assuming that you're okay putting the promise-based IndexedDB lookups inside of event.respondWith() rather than before event.respondWith(), at least. (If you're trying to do this before calling event.respondWith(), to figure out whether or not you want to respond at all, you're correct in that it's not possible, since the decision as to whether or not to call event.respondWith() needs to be made synchronously.)
It's not easy to wrap IndexedDB in a promise-based interface, but https://github.com/jakearchibald/idb has already done the hard work, and it works quite well inside of a service worker. Moreover, https://github.com/jakearchibald/idb-keyval makes it even easier to do this sort of thing if you just need a single key/value pair, rather than the full IndexedDB feature set.
Here's an example, assuming you're okay with idb-keyval:
importScripts('https://cdn.jsdelivr.net/npm/idb-keyval@3/dist/idb-keyval-iife.min.js');

// Call idbKeyval.set() to save data to your datastore in the `install` handler,
// in the context of your `window`, etc.

self.addEventListener('fetch', event => {
  // Optionally, add in some *synchronous* criteria here that examines event.request
  // and only calls event.respondWith() if this `fetch` handler can respond.
  event.respondWith(async function() {
    const id = someLogicToCalculateAnId();
    const value = await idbKeyval.get(id);
    // You now can use `value` however you want.
    const response = generateResponseFromValue(value);
    return response;
  }())
});
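If you'd rather not pull in a library, the question's readDB can also be wrapped in a promise directly. A rough sketch (error handling kept minimal; same dbname, storeName and id parameters as above):

function readDB(dbname, storeName, id) {
  return new Promise((resolve, reject) => {
    const openRequest = indexedDB.open(dbname, 1);
    openRequest.onerror = () => reject(openRequest.error);
    openRequest.onsuccess = () => {
      const db = openRequest.result;
      const store = db.transaction([storeName], 'readonly').objectStore(storeName);
      const getRequest = store.get(id);
      getRequest.onerror = () => reject(getRequest.error);
      getRequest.onsuccess = () => resolve(getRequest.result);
    };
  });
}

// Inside event.respondWith(), this can simply be awaited:
// const value = await readDB('mydb', 'mystore', someId);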

Ignore ajax requests in service worker

I have an app with a basic 'shell' of HTML, CSS and JS. The main content of the page is loaded via multiple ajax calls to an API at a different URL from the one my app is running on. I have set up a service worker to cache the main 'shell' of the application:
var urlsToCache = [
  '/',
  'styles/main.css',
  'scripts/app.js',
  'scripts/apiService.js',
  'third_party/handlebars.min.js',
  'third_party/handlebars-intl.min.js'
];
and to respond with the cached version when requested. The problem I am having is that the responses of my ajax calls are also being cached. I'm pretty sure that I need to add some code to the fetch event of the service worker that always gets them from the network rather than looking in the cache.
Here is my fetch event:
self.addEventListener('fetch', function (event) {
  // ignore anything other than GET requests
  var request = event.request;
  if (request.method !== 'GET') {
    event.respondWith(fetch(request));
    return;
  }

  // handle other requests
  event.respondWith(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.match(event.request).then(function (response) {
        return response || fetch(event.request).then(function (response) {
          cache.put(event.request, response.clone());
          return response;
        });
      });
    })
  );
});
I'm not sure how I can ignore the requests to the API. I've tried doing this:
if (request.url.indexOf(myAPIUrl) !== -1) {
  event.respondWith(fetch(request));
  return;
}
but according to the network tab in Chrome Dev Tools, all of these responses are still coming from the service-worker.
You do not have to use event.respondWith(fetch(request)) to handle requests that you want to ignore. If you return without calling event.respondWith, the browser will fetch the resource for you.
You can do something like:
if (request.method !== 'GET') { return; }
if (request.url.indexOf(myAPIUrl) !== -1) { return; }

// handle all other requests
event.respondWith(/* return promise here */);
In other words, as long as you can determine synchronously that you don't want to handle the request, you can just return from the handler and let the default request processing take over. Check out this example.
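Putting the early returns together with the original cache-first logic, the whole handler might look roughly like this (a sketch; myAPIUrl is assumed to hold the API's base URL):

self.addEventListener('fetch', function (event) {
  var request = event.request;

  // Let the browser handle anything we don't want to touch.
  if (request.method !== 'GET') { return; }
  if (request.url.indexOf(myAPIUrl) !== -1) { return; }

  // Cache-first for everything else.
  event.respondWith(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.match(request).then(function (response) {
        return response || fetch(request).then(function (networkResponse) {
          cache.put(request, networkResponse.clone());
          return networkResponse;
        });
      });
    })
  );
});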

Wait while request is running

Here is the problem. When I run this code:
String responseText = null;
HttpRequest.getString(url).then((resp) {
  responseText = resp;
  print(responseText);
});
print(responseText);
In console:
{"meta":{"code":200},"data":{"username":"kevin","bio":"CEO \u0026 Co-founder of Instagram","website":"","profile_picture":"http:\/\/images.ak.instagram.com\/profiles\/profile_3_75sq_1325536697.jpg","full_name":"Kevin Systrom","counts":{"media":1349,"followed_by":1110365,"follows":555},"id":"3"}}
null
It runs asynchronously. Is there a Java-like way with a synchronized method that will wait while the request is done?
I found only one tricky way to do it, and it's funny -- wait for three seconds:
handleTimeout() {
  print(responseText);
}

const TIMEOUT = const Duration(seconds: 3);
new Timer(TIMEOUT, handleTimeout);
And of course it works with bugs. So any suggestions?
MattB's way works well:
var req = new HttpRequest();
req.onLoad.listen((e) {
  responseText = req.responseText;
  print(responseText);
});
req.open('GET', url, async: false);
req.send();
First, I'm assuming you're using this as a client-side script and not server-side. Using HttpRequest.getString will strictly return a Future (async method).
If you absolutely must have a synchronous request, you can construct a new HttpRequest object and call the open method passing the named parameter: async: false
var req = new HttpRequest();
req.onLoad.listen((e) => print(req.responseText));
req.open('GET', url, async: false);
req.send();
However it is highly recommended that you use async methods for accessing network resources as a synchronous call like above will cause the script to block and can potentially make it appear as though your page/script has stopped responding on poor network connections.

Resources