What's the right way to implement offline fallback with a Workbox service worker?

I am adding PWA support to my project. I have set up serviceworker.js, and I am using Workbox for cache routing and strategies.
1- I add the offline page to the cache on the install event, when a user first visits the site:
/**
 * Add the offline page to the cache on install
 */
self.addEventListener('install', (event) => {
  const urls = ['/offline/'];
  const cacheName = workbox.core.cacheNames.runtime;
  event.waitUntil(caches.open(cacheName).then((cache) => cache.addAll(urls)));
});
2- Catch & cache pages with a specific regex, like these:
https://website.com/posts/the-first-post
https://website.com/posts/
https://website.com/articles/
workbox.routing.registerRoute(
  new RegExp('/posts|/articles'),
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'pages-cache'
  })
);
3- Catch errors and display the offline page when there's no internet connection.
/**
 * Handle offline page fallback
 */
self.addEventListener('fetch', (event) => {
  if (event.request.mode === 'navigate' || (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html'))) {
    event.respondWith(
      fetch(event.request.url).catch((error) => {
        // Return the offline page
        return caches.match('/offline/');
      })
    );
  }
  else {
    // Respond with everything else if we can
    event.respondWith(
      caches.match(event.request).then(function (response) {
        return response || fetch(event.request);
      })
    );
  }
});
Now this is working for me so far if I visit, for example, https://website.com/contact-us/. But if I visit any URL within the scope I defined earlier for "pages-cache", like https://website.com/articles/231/, it does not return the /offline page, since that URL is not yet in the user's cache, and I get the regular browser error.
There seems to be an issue in how errors are handled when a specific Workbox caching route matches the request.
Is this the best method for applying an offline fallback? How can I catch errors from the '/articles' and '/posts' paths and display the offline page?
Please refer as well to this answer, where there's a different approach to applying the fallback with Workbox. I tried it as well, with the same results. I'm not sure which is the accurate approach for this.

I found a way to do it right with Workbox.
For each route I add a fallback handler like this:
const offlinePage = '/offline/';
/**
 * Pages to cache
 */
workbox.routing.registerRoute(/\/posts.|\/articles/,
  async ({event}) => {
    try {
      return await workbox.strategies.staleWhileRevalidate({
        cacheName: 'cache-pages'
      }).handle({event});
    } catch (error) {
      return caches.match(offlinePage);
    }
  }
);
When using the networkFirst strategy, the method looks like this:
/**
 * Pages to cache (networkFirst)
 */
const networkFirst = workbox.strategies.networkFirst({
  cacheName: 'cache-pages'
});
const customHandler = async (args) => {
  try {
    const response = await networkFirst.handle(args);
    return response || await caches.match(offlinePage);
  } catch (error) {
    return await caches.match(offlinePage);
  }
};
workbox.routing.registerRoute(
  /\/posts.|\/articles/,
  customHandler
);
More details in the Workbox documentation: Provide a fallback response to a route.
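That documentation also describes a global catch handler, which avoids wrapping every route individually. A minimal sketch of that approach, assuming '/offline/' was precached during the install event as in the question:
workbox.routing.setCatchHandler(({event}) => {
  // Runs whenever a registered route's handler throws.
  // Only fall back to the offline page for page navigations.
  if (event.request.mode === 'navigate') {
    return caches.match('/offline/');
  }
  return Response.error();
});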

Related

Is there any way to track an event using Firebase in Electron + React?

I want to ask how to send an event using Firebase and Electron.js. A friend of mine has a problem using Firebase Analytics with Electron: it seems Electron doesn't send any events to the debugger console. When I look at the network tab, the function doesn't appear to send anything, yet the success text still appears in the console. Can someone help me figure this out? Any workaround will do. He said he already tried to implement the solutions in these topics:
firebase-analytics-log-event-not-working-in-production-build-of-electron
electron-google-analytics
This is the error I got when trying to use the solution in point 2.
For information, my friend used electron-react-boilerplate as the boilerplate.
The solutions above still failed. Can someone help me solve this?
EDIT 1:
As you can see in the images above, the first image is my friend's code; when you run it, it gives a very basic example, like in the second image, with a button to send an event.
Just for information, he used this firebase package:
https://www.npmjs.com/package/firebase
You can intercept the HTTP protocol and serve your static content through the provided methods; this allows you to use the http:// protocol for the content URLs, which should make Firebase Analytics work as described in the first question.
References
Protocol interception documentation.
Example
This is an example of how you can serve a local app over the HTTP protocol and simulate regular browser behavior, so the bundled web application can use http URLs. This will allow you to add Firebase Analytics. HTTP data upload support is rudimentary, but you can extend it yourself depending on your goals.
index.js
const {app, BrowserWindow, protocol} = require('electron')
const http = require('http')
const {createReadStream, promises: fs} = require('fs')
const path = require('path')
const {PassThrough} = require('stream')
const mime = require('mime')

const MY_HOST = 'somehostname.example'

app.whenReady()
.then(async () => {
  await protocol.interceptStreamProtocol('http', (request, callback) => {
    const url = new URL(request.url)
    const {hostname} = url
    const isLocal = hostname === MY_HOST

    if (isLocal) {
      serveLocalSite({...request, url}, callback)
    }
    else {
      serveRegularSite({...request, url}, callback)
    }
  })

  const win = new BrowserWindow()
  win.loadURL(`http://${MY_HOST}/index.html`)
})
.catch((error) => {
  console.error(error)
  app.exit(1)
})
async function serveLocalSite(request, callback) {
  try {
    const {pathname} = request.url
    const filepath = path.join(__dirname, path.resolve('/', pathname))
    const stat = await fs.stat(filepath)

    if (stat.isFile() !== true) {
      throw new Error('Not a file')
    }

    callback(
      createResponse(
        200,
        {
          'content-type': mime.getType(path.extname(pathname)),
          'content-length': stat.size,
        },
        createReadStream(filepath)
      )
    )
  }
  catch (err) {
    callback(
      errorResponse(err)
    )
  }
}
function serveRegularSite(request, callback) {
  try {
    const req = http.request({
      host: request.url.hostname,
      port: request.url.port,
      path: request.url.pathname + request.url.search,
      method: request.method,
      headers: request.headers,
    })

    // uploadData lives on the intercepted request and is an array of buffers
    if (request.uploadData) {
      for (const item of request.uploadData) {
        req.write(item.bytes)
      }
    }

    req.on('error', (error) => {
      callback(
        errorResponse(error)
      )
    })

    req.on('response', (res) => {
      callback(
        createResponse(
          res.statusCode,
          res.headers,
          res,
        )
      )
    })

    req.end()
  }
  catch (err) {
    callback(
      errorResponse(err)
    )
  }
}
function toStream(body) {
  const stream = new PassThrough()
  stream.write(body)
  stream.end()
  return stream
}

function errorResponse(error) {
  return createResponse(
    500,
    {
      'content-type': 'text/plain;charset=utf8',
    },
    error.stack
  )
}

function createResponse(statusCode, headers, body) {
  // Only compute content-length for string bodies; stream bodies
  // either carry their own headers or are passed through as-is.
  if (typeof body === 'string' && 'content-length' in headers === false) {
    headers['content-length'] = Buffer.byteLength(body)
  }

  return {
    statusCode,
    headers,
    data: typeof body === 'object' ? body : toStream(body),
  }
}
MY_HOST is any non-existent host (like something.example) or a host controlled by the admin (in my case it could be electron-app.rumk.in). This host serves as a replacement for localhost.
index.html
<html>
<body>
Hello
</body>
</html>

Migrate Google Workbox setDefaultHandler // setCatchHandler from v2 to v3

I'm trying to migrate my old code from Google Workbox v2 to Workbox v3, and I can't use workbox.routing.registerNavigationRoute because my default route '/' (which is where my app shell lives) is a runtime cache. It's a multilingual website (https://www.autovisual.com) with languages in subfolders ('/fr', '/es', ...) and a unique service worker scoped at '/'.
This is the v2 code:
workboxSW.router.setDefaultHandler({
  handler: ({event}) => {
    return fetch(event.request);
  }
});
workboxSW.router.setCatchHandler({
  handler: ({event}) => {
    if (event.request.mode === 'navigate') {
      return caches.match('/');
    }
    return new Response();
  }
});
It seems pretty basic: the goal is to catch all 'navigate' requests that didn't match any other route and serve the cached version (network first) of the URL '/'.
For information, in the client JS I use:
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    caches.open('rootCacheNetworkFirst').then(function(cache) {
      cache.match('/').then(function(response) {
        if (!response) {
          cache.add('/');
        }
      });
    });
    navigator.serviceWorker.register('/sw.js', {
      scope: "/"
    });
  });
}
I can't find any example of the new v3 workbox.routing.setDefaultHandler and workbox.routing.setCatchHandler, and I'm stuck :(
I don't think that using either setDefaultHandler or setCatchHandler is relevant for the described use case.
To accomplish what you describe, add the following code to your service worker file after all other routes are registered. (In Workbox v3, the first-registered route takes precedence.) You just need to configure a NavigationRoute and register it:
const networkFirst = workbox.strategies.networkFirst({
  cacheName: 'your-cache-name',
});
const navigationRoute = new workbox.routing.NavigationRoute(networkFirst, {
  // Set blacklist/whitelist if you need more control
  // over which navigations are picked up.
  blacklist: [],
  whitelist: [],
});
workbox.routing.registerRoute(navigationRoute);
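That said, since the question asks for the v3 syntax: setDefaultHandler and setCatchHandler do still exist in v3, and they take the handler function directly rather than a {handler: ...} object. A minimal sketch translating the v2 code above, in case it is useful:
// Direct v3 translation of the question's v2 handlers (a sketch,
// assuming the workbox.routing namespace from workbox-sw v3).
workbox.routing.setDefaultHandler(({event}) => {
  return fetch(event.request);
});
workbox.routing.setCatchHandler(({event}) => {
  if (event.request.mode === 'navigate') {
    return caches.match('/');
  }
  return new Response();
});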

Service worker spreading

I have a service worker for caching images. This service worker is only registered within the frontend template, but it still keeps spreading into my admin template.
This causes my forms to behave unpredictably, as the validation tokens are affected by it.
With some console.log calls I figured out that the install event is triggered before the requested page is reached, but I'm unable to determine the current/next URL there.
How can I prevent the service worker from spreading to the admin panel and interfering with the pages? I just want assets to be cached.
This is my service worker, as far as it is relevant:
const PRECACHE = 'precache-v1.0.0';
const RUNTIME = 'runtime';

// A list of local resources we always want to be cached.
const PRECACHE_URLS = [
  "public",
  "media",
  "unify",
];

importScripts('./cache-polyfill.js');

// The install handler takes care of precaching the resources we always need.
self.addEventListener('install', function(event) {
  console.log('installing resources');
  event.waitUntil(
    caches.open(PRECACHE)
      //.then(cache => cache.addAll(PRECACHE_URLS))
      .then(() => self.skipWaiting())
  );
});

// The activate handler takes care of cleaning up old caches.
self.addEventListener('activate', function(event) {
  const currentCaches = [PRECACHE, RUNTIME];
  event.waitUntil(
    caches.keys().then(cacheNames => {
      return cacheNames.filter(cacheName => !currentCaches.includes(cacheName));
    }).then(cachesToDelete => {
      return Promise.all(cachesToDelete.map(cacheToDelete => {
        return caches.delete(cacheToDelete);
      }));
    }).then(() => self.clients.claim())
  );
});

// The fetch handler serves responses for same-origin resources from a cache.
// If no response is found, it populates the runtime cache with the response
// from the network before returning it to the page.
self.addEventListener('fetch', event => {
  // Skip cross-origin requests, like those for Google Analytics.
  if (event.request.method === "GET") {
    // Check the URL against each entry in PRECACHE_URLS;
    // calling indexOf with the whole array would never match.
    if (PRECACHE_URLS.some(url => event.request.url.indexOf(url) > -1)) {
      console.log("fetching " + event.request.url + " by the service worker");
      event.respondWith(
        caches.match(event.request).then(cachedResponse => {
          if (cachedResponse) {
            return cachedResponse;
          }
          return caches.open(RUNTIME).then(cache => {
            return fetch(event.request).then(response => {
              // Put a copy of the response in the runtime cache.
              return cache.put(event.request, response.clone()).then(() => {
                console.log('cached: ' + event.request.url);
                return response;
              });
            });
          });
        })
      );
    }
    else {
      console.log("fetching " + event.request.url + " by service worker blocked, it's not a resource");
    }
  }
});
The problem is most likely that your admin pages lie inside the SW scope. This means that your SW controls, e.g., everything under / while your admin pages are located under /admin/ or similar.
You can prevent this behaviour by checking the fetch requests your SW is intercepting. Something like:
if (event.request.url.match('^.*(\/admin\/).*$')) {
  return false;
}
This should be the first thing in the SW's fetch listener. It checks whether the request is for something under the admin pages and bails out if so; otherwise, it continues normally. In context, the listener could look like the sketch below.
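A minimal sketch of that placement, with the question's caching logic elided:
self.addEventListener('fetch', event => {
  // Bail out early for admin pages; returning without calling
  // event.respondWith lets the browser handle the request normally.
  if (event.request.url.match('^.*(\/admin\/).*$')) {
    return;
  }
  // ... the existing asset-caching logic from the question goes here ...
});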

Service Worker caching not recognizing timeout as a function

I was watching Steve Sanderson's NDC presentation on up-and-coming web features and saw his caching example as a prime candidate for an application I am developing. I couldn't find the code, so I typed it up from the YouTube video as well as I could.
Unfortunately it doesn't work in Chrome (which is also what he is using in the demo). It fails with: Uncaught TypeError: fetch(...).then(...).timeout is not a function at self.addEventListener.event.
I trawled through Steve's GitHub and found no trace of this, nor could I find anything on the NDC Conference page.
//inspiration:
// https://www.youtube.com/watch?v=MiLAE6HMr10
//self.importScripts('scripts/util.js');
console.log('Service Worker script running');

self.addEventListener('install', event => {
  console.log('WORKER: installing');
  const urlsToCache = ['/ServiceWorkerExperiment/', '/ServiceWorkerExperiment/scripts/page.js'];
  caches.delete('mycache');
  event.waitUntil(
    caches.open('mycache')
      .then(cache => cache.addAll(urlsToCache))
      .then(_ => self.skipWaiting())
  );
});

self.addEventListener('fetch', event => {
  console.log(`WORKER: Intercepted request for ${event.request.url}`);
  if (event.request.method !== 'GET') {
    return;
  }
  event.respondWith(
    fetch(event.request)
      .then(networkResponse => {
        console.log(`WORKER: Updating cached data for ${event.request.url}`);
        var responseClone = networkResponse.clone();
        caches.open('mycache').then(cache => cache.put(event.request, responseClone));
        return networkResponse;
      })
      //if network fails or is too slow, return cached data
      //reference for this code: https://youtu.be/MiLAE6HMr10?t=1003
      .timeout(200)
      .catch(_ => {
        console.log(`WORKER: Serving ${event.request.url} from CACHE`);
        return caches.match(event.request);
      })
  );
});
As far as I can tell from the fetch() documentation, there is no timeout function, so my assumption is that the timeout function is added in the util.js that is never shown in the presentation. Can anyone confirm this? And does anyone have an idea of how it is implemented?
Future:
It's coming.
According to Jake Archibald's comment on whatwg/fetch, the future syntax will be:
Using the abort syntax, you'll be able to do:
const controller = new AbortController();
const signal = controller.signal;
const fetchPromise = fetch(url, {signal});
// 5 second timeout:
const timeoutId = setTimeout(() => controller.abort(), 5000);
const response = await fetchPromise;
// …
If you only wanted to time out the request, not the response, add:
clearTimeout(timeoutId);
// …
And from another comment:
Edge & Firefox are already implementing. Chrome will start shortly.
Now:
If you want a solution that works now, the most sensible way is to use this module.
It allows you to use syntax like:
return fetch('/path', {timeout: 500}).then(function() {
  // successful fetch
}).catch(function(error) {
  // network request failed / timeout
})
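As for how a .timeout() helper like the one in the video might be implemented: a common approach, and only a plausible guess at what the unseen util.js does, is to race the fetch promise against a timer rather than patching .timeout onto promises. A minimal sketch (the fetchWithTimeout name is made up for illustration):
// A sketch, not the presenter's actual util.js: race the fetch against a
// timer so slow responses reject and fall through to the cache lookup.
function fetchWithTimeout(request, ms) {
  const timer = new Promise((resolve, reject) => {
    setTimeout(() => reject(new Error('timeout')), ms);
  });
  // Whichever settles first wins; a timeout rejects the race.
  return Promise.race([fetch(request), timer]);
}

// Usage inside the fetch handler:
// event.respondWith(
//   fetchWithTimeout(event.request, 200)
//     .catch(() => caches.match(event.request))
// );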

Ignore ajax requests in service worker

I have an app with a basic 'shell' of HTML, CSS and JS. The main content of the page is loaded via multiple ajax calls to an API at a different URL from the one my app is running on. I have set up a service worker to cache the main 'shell' of the application:
var urlsToCache = [
  '/',
  'styles/main.css',
  'scripts/app.js',
  'scripts/apiService.js',
  'third_party/handlebars.min.js',
  'third_party/handlebars-intl.min.js'
];
and to respond with the cached version when requested. The problem I am having is that the responses of my ajax calls are also being cached. I'm pretty sure I need to add some code to the fetch event of the service worker so that it always gets them from the network rather than looking in the cache.
Here is my fetch event:
self.addEventListener('fetch', function (event) {
  // ignore anything other than GET requests
  var request = event.request;
  if (request.method !== 'GET') {
    event.respondWith(fetch(request));
    return;
  }

  // handle other requests
  event.respondWith(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.match(event.request).then(function (response) {
        return response || fetch(event.request).then(function (response) {
          cache.put(event.request, response.clone());
          return response;
        });
      });
    })
  );
});
I'm not sure how I can ignore the requests to the API. I've tried doing this:
if (request.url.indexOf(myAPIUrl) !== -1) {
  event.respondWith(fetch(request));
  return;
}
but according to the Network tab in Chrome DevTools, all of these responses are still coming from the service worker.
You do not have to use event.respondWith(fetch(request)) to handle requests that you want to ignore. If you return without calling event.respondWith, the browser will fetch the resource for you.
You can do something like:
if (request.method !== 'GET') { return; }
if (request.url.indexOf(myAPIUrl) !== -1) { return; }

// handle all other requests
event.respondWith(/* return promise here */);
In other words, as long as you can determine synchronously that you don't want to handle the request, you can just return from the handler and let the default request processing take over. Check out this example. A complete listener applying this to the question's code could look like the sketch below.
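A minimal sketch of the question's fetch listener with the early returns applied (myAPIUrl and CACHE_NAME as defined in the question):
self.addEventListener('fetch', function (event) {
  var request = event.request;

  // Returning early (without respondWith) lets the browser handle
  // these requests normally, bypassing the service worker cache.
  if (request.method !== 'GET') { return; }
  if (request.url.indexOf(myAPIUrl) !== -1) { return; }

  // Cache-first for everything else.
  event.respondWith(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.match(request).then(function (response) {
        return response || fetch(request).then(function (networkResponse) {
          cache.put(request, networkResponse.clone());
          return networkResponse;
        });
      });
    })
  );
});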
