I am using Workbox in my service worker, and I use this strategy to provide a fallback response for a route that should be shown when the user is offline and the page is not in the cache:
const FALLBACK_URL = '/offline/';
const urlHandler = workbox.strategies.staleWhileRevalidate({
  cacheName: 'page-cache'
});

workbox.routing.registerRoute(
  /\/.+\//,
  ({event}) => {
    return urlHandler.handle({event})
      .catch(() => caches.match(FALLBACK_URL));
  }
);
This works well (I have already cached FALLBACK_URL) when I use the staleWhileRevalidate strategy and when I use the networkOnly strategy. However, I would really like to use the networkFirst strategy, but when I try that I get an error when visiting pages that are not in the cache, and the console says:
'The FetchEvent for "https://staging.bassbuddha.com/pedals/" resulted in a network error response: an object that was not a Response was passed to respondWith().'
What am I doing wrong?
I am using version 3.4.1
https://storage.googleapis.com/workbox-cdn/releases/3.4.1/workbox-sw.js
It turns out that this is intentional as per this issue on the Workbox GitHub repo. Here is the quote on networkFirst (as opposed to other strategies) by jeffposnick:
networkFirst does not end up throwing in that scenario because the
underlying network exception triggers a cache.match() lookup, and
cache.match() doesn't throw/reject when there's a cache miss. Instead,
it resolves with undefined.
So the solution for a fallback with networkFirst in Workbox is to catch an undefined response and replace it with the fallback URL, like so:
const FALLBACK_URL = '/offline/';
const urlHandler = workbox.strategies.networkFirst({
  cacheName: 'page-cache'
});

workbox.routing.registerRoute(
  /\/.+\//,
  ({event}) => {
    return urlHandler.handle({event})
      .then((response) => {
        return response || caches.match(FALLBACK_URL);
      })
      .catch(() => caches.match(FALLBACK_URL));
  }
);
Related
I have APIs that I am caching in my app. I would like to cache the API while the service worker is installing. I came across warming the cache:
import {cacheNames} from 'workbox-core';

self.addEventListener('install', (event) => {
  const urls = [/* ... */];
  const cacheName = cacheNames.runtime;
  event.waitUntil(caches.open(cacheName).then((cache) => cache.addAll(urls)));
});
If you use strategies configured with a custom cache name you can do the same thing; just assign your custom value to cacheName.
1) I am using custom cache names. Would I use an array for multiple cache names? i.e. const cacheName = ['foo-api', 'bar-api']?
2) The URLs I use are regexps, e.g. /foo/. Will those regexp URLs work here?
3) Will I be able to cache the API while the service worker is installing, before the browser consumes the API?
You can add as many items to as many caches as you'd like inside of your install handler.
Workbox can use RegExps for routing incoming fetch requests to an appropriate response handler, and I assume that's what you're referring to here. The answer is no, you can't just provide a RegExp if you want to cache URLs in advance—you need to provide a complete list of URLs.
Any caching that you perform inside of an install handler is guaranteed to happen before the service worker activates, and therefore before your fetch handlers start intercepting requests. So yes, this is a way of ensuring that your caches are pre-populated.
A modification of your code could look like:
self.addEventListener('install', (event) => {
  const cacheURLs = async () => {
    const cache1 = await caches.open('my-first-cache');
    await cache1.addAll([
      '/url1',
      '/url2',
    ]);

    const cache2 = await caches.open('my-second-cache');
    await cache2.addAll([
      '/url3',
      '/url4',
    ]);
  };

  event.waitUntil(cacheURLs());
});
Is there a way to find out whether a request is XHR or fetch while using Workbox?
const matchCb = ({url, event}) => {
  if (event.type === 'xhr') {
    return true;
  }
  return false;
};

workbox.routing.registerRoute(
  matchCb,
  workbox.strategies.networkOnly()
);
I have put a check so that the above route is used only for XHR calls.
Although the Network tab of the browser shows a certain request to be of type xhr, it comes out as fetch when debugging the above code. Am I doing something wrong? Is there some other way to check it?
There's no way to determine that from within Workbox or inside of the service worker. (I'm also not sure why you would want to?)
One thing that you can do, however, is add an extra request header when you make your request, and then check for that header inside of your service worker. If it's really important for you to distinguish between requests that originated via XHR and via fetch(), you could use the header for that.
Inside your web app:
const request = new Request('/api', {headers: {'X-Source': 'fetch'}});
const response = await fetch(request);
Inside your service worker, using Workbox:
workbox.routing.registerRoute(
  // This matchCallback will only be true if the request
  // has an X-Source header set to 'fetch':
  ({event}) => event.request.headers.get('X-Source') === 'fetch',
  workbox.strategies.networkOnly()
);
Note that if you're making a CORS request, you may need to delete that X-Source request header before sending it to the network, since extra request headers can trigger CORS preflight checks.
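A minimal sketch of one way to do that, assuming the same X-Source convention as above (the match callback and handler here are illustrative, not taken from the original answer):
workbox.routing.registerRoute(
  // Only handle requests that carry the custom X-Source header.
  ({event}) => event.request.headers.has('X-Source'),
  async ({event}) => {
    // Copy the headers, drop the custom one, and rebuild the request
    // so the extra header never reaches the cross-origin server.
    const headers = new Headers(event.request.headers);
    headers.delete('X-Source');
    return fetch(new Request(event.request, {headers}));
  }
);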
To enable my app to run offline, during installation the service worker should:
fetch a list of URLs from an async API
reformat the response
add all URLs in the response to the precache
For this task I use Google's Workbox in combination with Webpack.
The problem: while the service worker successfully caches all the Webpack assets (which tells me that Workbox basically does what it should), it does not wait for the async API call to cache the additional remote assets. They are simply ignored and are neither cached nor ever fetched over the network.
Here is my service worker code:
importScripts('https://storage.googleapis.com/workbox-cdn/releases/3.1.0/workbox-sw.js');
workbox.skipWaiting();
workbox.clientsClaim();
self.addEventListener('install', (event) => {
  const precacheController = new workbox.precaching.PrecacheController();
  const preInstallUrl = 'https://www.myapiurl/Assets';
  event.waitUntil(fetch(preInstallUrl)
    .then(response => response.json())
    .then((Assets) => {
      Object.keys(Assets.data.Assets).forEach((key) => {
        precacheController.addToCacheList([Assets.data.Assets[key]]);
      });
    })
  );
});
self.__precacheManifest = [].concat(self.__precacheManifest || []);
workbox.precaching.suppressWarnings();
workbox.precaching.precacheAndRoute(self.__precacheManifest, {});
workbox.routing.registerRoute(
  /^.*\.(jpg|JPG|gif|GIF|png|PNG|eot|woff(2)?|ttf|svg)$/,
  workbox.strategies.cacheFirst({
    cacheName: 'image-cache',
    plugins: [
      new workbox.cacheableResponse.Plugin({ statuses: [0, 200] }),
      new workbox.expiration.Plugin({ maxEntries: 600 }),
    ],
  }),
  'GET'
);
And this is my webpack configuration for the workbox:
new InjectManifest({
  swDest: 'sw.js',
  swSrc: './src/sw.js',
  globPatterns: ['dist/*.{js,png,html,css,gif,GIF,PNG,JPG,jpeg,woff,woff2,ttf,svg,eot}'],
  maximumFileSizeToCacheInBytes: 5 * 1024 * 1024,
})
It looks like you're creating your own PrecacheController instance and also using the precacheAndRoute(), which aren't actually intended to be used together (not super well explained in the docs, it's only mentioned in this one place).
The problem is that the helper methods on workbox.precaching.* actually create their own PrecacheController instance under the hood. Since you're creating your own PrecacheController instance and also calling workbox.precaching.precacheAndRoute([...]), you'll end up with two PrecacheController instances that aren't working together.
From your code sample, it looks like you're creating a PrecacheController instance because you want to load your list of files to precache at runtime. That's fine, but if you're going to do that, there are a few things to be aware of:
Your SW might not update
Service worker updates are usually triggered when you call navigator.serviceWorker.register() and the browser detects that the service worker file has changed. That means if you change what /Assets returns but the contents of your service worker files haven't changed, your service worker won't update. This is why most people hard-code their precache list in their service worker (since any changes to those files will trigger a new service worker installation).
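For illustration, a minimal sketch of that hard-coded approach (the URLs and revision strings are placeholders, not taken from the question): because the list lives in the service worker file itself, any change to it produces a different file and triggers an update:
workbox.precaching.precacheAndRoute([
  // Placeholder entries; editing a revision changes sw.js and triggers a new installation.
  {url: '/index.html', revision: 'abc123'},
  {url: '/app.js', revision: 'def456'},
]);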
You'll have to manually add your own routes
I mentioned before that workbox.precaching.precacheAndRoute([...]) creates its own PrecacheController instance under the hood. It also adds its own fetch listener manually to respond to requests. That means if you're not using precacheAndRoute(), you'll have to create your own router and define your own routes. Here are the docs on how to create routes: https://developers.google.com/web/tools/workbox/modules/workbox-routing.
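A minimal sketch of such a route, assuming you just want same-origin requests answered from whatever cache the PrecacheController populated, with a network fallback:
workbox.routing.registerRoute(
  // Only handle requests for our own origin.
  ({url}) => url.origin === self.location.origin,
  // Look in all caches (including the precache) first, then fall back to the network.
  async ({event}) => (await caches.match(event.request)) || fetch(event.request)
);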
I realised my mistake. I hope this helps others as well. The problem was that I did not call precacheController.install() manually. While this function will be executed automatically, it will not wait for additional precache files that are inserted asynchronously. This is why the function needs to be called after all the precaching has happened. Here is the working code:
importScripts('https://storage.googleapis.com/workbox-cdn/releases/3.1.0/workbox-sw.js');
workbox.skipWaiting();
workbox.clientsClaim();
const precacheController = new workbox.precaching.PrecacheController();
// Hook into install event
self.addEventListener('install', (event) => {
  // Get API URL passed as query parameter to service worker
  const preInstallUrl = new URL(location).searchParams.get('preInstallUrl');

  // Fetch precaching URLs and attach them to the cache list
  const assetsLoaded = fetch(preInstallUrl)
    .then(response => response.json())
    .then((values) => {
      Object.keys(values.data.Assets).forEach((key) => {
        precacheController.addToCacheList([values.data.Assets[key]]);
      });
    })
    .then(() => {
      // After all assets are added, install them
      precacheController.install();
    });

  event.waitUntil(assetsLoaded);
});
self.__precacheManifest = [].concat(self.__precacheManifest || []);
workbox.precaching.suppressWarnings();
workbox.precaching.precacheAndRoute(self.__precacheManifest, {});
workbox.routing.registerRoute(
  /^.*\.(jpg|JPG|gif|GIF|png|PNG|eot|woff(2)?|ttf|svg)$/,
  workbox.strategies.cacheFirst({
    cacheName: 'image-cache',
    plugins: [
      new workbox.cacheableResponse.Plugin({ statuses: [0, 200] }),
      new workbox.expiration.Plugin({ maxEntries: 600 }),
    ],
  }),
  'GET'
);
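For completeness, a sketch of how the page might register this worker so that the preInstallUrl query parameter read above is actually present (the file name and API URL are assumptions based on the webpack config and code above):
if ('serviceWorker' in navigator) {
  // Pass the API endpoint to the service worker via its own URL;
  // the worker reads it back with new URL(location).searchParams.
  navigator.serviceWorker.register(
    '/sw.js?preInstallUrl=' + encodeURIComponent('https://www.myapiurl/Assets')
  );
}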
I have an SSR-based React app and am currently implementing Workbox tools for precaching and offline capabilities. I ran into issues mainly because the site relies on cookies on the server side and issues redirects based on them.
The initial load works fine, but once the service worker (SW) is active, another refresh results in the SW making a fetch call with the URL from within the Workbox source. During this time, the server doesn't find cookies (fetch doesn't carry credentials - link) and issues a redirect (302) to a different URL (which lets you set some options into cookies and refreshes back to the old URL).
This results in the following error on the client side: The FetchEvent for "http://localhost:8080/" resulted in a network error response: a redirected response was used for a request whose redirect mode is not "follow".
The server issues redirect as 302 status, but the client results in:
site can’t be reached
The web page at http://localhost:8080/ might be temporarily down or it may have moved permanently to a new web address.
ERR_FAILED
Here is my service worker code; the assets are populated by the workbox-webpack plugin.
/* global importScripts, WorkboxSW */
importScripts('/client/workbox-sw.prod.js')
// Create Workbox service worker instance
const workboxSW = new WorkboxSW({
  clientsClaim: true,
  cacheId: 'app-v3-sw',
})

// Placeholder array which is populated automatically by workboxBuild.injectManifest()
workboxSW.precache([])

// cache the offline html to be used as fallback navigation route.
workboxSW.router.registerRoute(
  '/offline.html',
  workboxSW.strategies.networkFirst({
    networkTimeoutSeconds: 2,
    cacheableResponse: { statuses: [0, 200] },
  })
)

// cache google api requests.
workboxSW.router.registerRoute(
  /\.googleapis\.com$/,
  workboxSW.strategies.staleWhileRevalidate({
    cacheName: 'v3-google-api-cache',
    networkTimeoutSeconds: 2,
    cacheableResponse: { statuses: [0, 200] },
  })
)

// cache external requests.
workboxSW.router.registerRoute(
  /(static\.clevertap\.com|www\.google-analytics\.com|wzrkt\.com|www\.googletagservices\.com|securepubads\.g\.doubleclick\.net|www\.googletagmanager\.com)/,
  workboxSW.strategies.cacheFirst({
    cacheName: 'v3-externals-cache',
    cacheExpiration: {
      maxEntries: 30,
    },
    cacheableResponse: { statuses: [0, 200] },
  })
)

// Check if client can hit the url via network, if cannot then use the offline fallback html.
// https://github.com/GoogleChrome/workbox/issues/796
workboxSW.router.registerRoute(
  ({ event }) => event.request.mode === 'navigate',
  ({ url }) =>
    // eslint-disable-next-line compat/compat
    fetch(url.href).catch(() => caches.match('/offline.html'))
)
P.S.
This is my first try with Workbox tools or service workers, and I might have missed or overlooked some details. Kindly point me in some direction.
By default, fetch doesn't send cookies, so you will need to add the credentials in the options:
workboxSW.router.registerRoute(
  ({ event }) => event.request.mode === 'navigate',
  ({ url }) => fetch(url.href, {credentials: 'same-origin'}).catch(() => caches.match('/offline.html'))
)
More information here: https://github.com/github/fetch#sending-cookies
I'm using Lighthouse to audit my webapp. I'm working through the failures, but I'm stuck on this one:
Failures: Manifest start_url is not cached by a Service Worker.
In my manifest.json I have
"start_url": "index.html",
In my worker.js I am caching the following:
let CACHE_NAME = 'my-site-cache-v1';
let urlsToCache = [
  '/',
  '/scripts/app.js',
  '/index.html'
];
Which lines up with what I see in the Application tab in Chrome DevTools.
So... why is it telling me start_url is not cached?
Here is my full worker.js file:
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/worker.js').then(function(registration) {
      console.log('ServiceWorker registration successful with scope: ', registration.scope);
    }, function(err) {
      console.log('ServiceWorker registration failed: ', err);
    });
  });
}

let CACHE_NAME = 'my-site-cache-v1.1';
let urlsToCache = [
  '/',
  '/scripts/app.js',
  '/index.html'
];

self.addEventListener('install', function(event) {
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(function(cache) {
        console.log('Opened cache');
        return cache.addAll(urlsToCache);
      })
  );
});
Let's look at Lighthouse's source code:
static assessOfflineStartUrl(artifacts, result) {
  const hasOfflineStartUrl = artifacts.StartUrl.statusCode === 200;

  if (!hasOfflineStartUrl) {
    result.failures.push('Manifest start_url is not cached by a service worker');
  }
}
We can notice that it's not checking your cache, but the response of the entry point. The reason for that must be that your service worker is not sending a proper Response on fetch.
You'll know that it's working if, in DevTools, your first request shows (from ServiceWorker) in the Size column.
There are two problems with the code you've provided:
The first one is that you're mixing service worker code with service worker registration code. The service worker registration code should be executed on your webpage.
That code should be included on your page:
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/worker.js').then(function(registration) {
      console.log('ServiceWorker registration successful with scope: ', registration.scope);
    }, function(err) {
      console.log('ServiceWorker registration failed: ', err);
    });
  });
}
and the rest of what you've pasted should be your worker.js code. However, the service worker does get installed, because you have files in the cache, so I suspect you just pasted this incorrectly.
The second (real) problem is that the service worker is not returning these cached files. As I showed earlier, that error from Lighthouse means that the service worker is not returning the start_url entry file.
The most basic code to achieve that is:
self.addEventListener('fetch', function(event) {
  event.respondWith(caches.match(event.request));
});
The service worker is event-driven, so when your page wants to get some resource, the service worker reacts and serves the one from the cache. In the real world, you don't really want to use it exactly like that, because you need some kind of fallback. I strongly recommend reading the section Serving files from the cache here.
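A minimal sketch of that cache-first-with-network-fallback pattern (not the exact code from the linked guide):
self.addEventListener('fetch', function(event) {
  event.respondWith(
    // Serve from the cache when possible, otherwise go to the network.
    caches.match(event.request).then(function(cached) {
      return cached || fetch(event.request);
    })
  );
});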
Edit: I've created a pull request against the Lighthouse source code to clarify that error message.
It seems that Chrome Lighthouse (Chrome v62) performs a generic fetch(). See the discussion at https://github.com/GoogleChrome/lighthouse/issues/2688#issuecomment-315394447
In my case, an offline.html is served after an "if (event.request.mode === 'navigate'){".
Due to the use of Lighthouse's generic fetch(), Lighthouse will not get served this offline.html, and shows the "Manifest start_url is not cached by a Service Worker" error.
I solved this problem by replacing:
if (event.request.mode === 'navigate') {
with
if (event.request.method === 'GET') {
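Put in context, a sketch of what the adjusted handler could look like (the surrounding handler body is an assumption; only the condition comes from the change above):
self.addEventListener('fetch', function(event) {
  // Handle every GET, not just navigations, so Lighthouse's generic
  // fetch() of start_url also gets an offline fallback.
  if (event.request.method === 'GET') {
    event.respondWith(
      fetch(event.request).catch(function() {
        return caches.match('offline.html');
      })
    );
  }
});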