Service worker installs on localhost, but fails when deployed to GitHub Pages - service-worker

I am working on a PWA and I have installed and activated a service worker on my site. It works perfectly well while testing on a local server, but when I ship my code live, it fails.
This is my SW:
const cacheName = 'v1';
const cacheFiles = [
  '/',
  '/css/styles.css',
  '/images/test1.png',
  '/images/test2.png',
  '/js/app.js',
  '/js/sw-registration.js'
];
// Install event
self.addEventListener('install', function(event) {
  console.log("SW installed");
  event.waitUntil(
    caches.open(cacheName)
      .then(function(cache) {
        console.log('SW caching cachefiles');
        return cache.addAll(cacheFiles);
      })
  );
});
// Activate event
self.addEventListener('activate', function(event) {
  console.log("SW activated");
  event.waitUntil(
    caches.keys()
      .then(function(cacheNames) {
        return Promise.all(cacheNames.map(function(thisCacheName) {
          if (thisCacheName !== cacheName) {
            console.log('SW Removing cached files from', thisCacheName);
            return caches.delete(thisCacheName);
          }
        }));
      })
  );
});
// Fetch event
self.addEventListener('fetch', function(event) {
  console.log("SW fetching", event.request.url);
  event.respondWith(
    caches.match(event.request)
      .then(function(response) {
        console.log('Fetching new files');
        return response || fetch(event.request);
      })
  );
});
This is the error I'm getting:
Uncaught (in promise) TypeError: Request failed (sw.js:1)
I don't understand why it fails to cache my files online (GitHub Pages) when it works locally. Can someone help me understand?
Thank you.
EDIT: I tried to deploy the site via Netlify and it works there. So it has to be something to do with GitHub Pages. I would still like to know what it is, if anyone can shed any light.

As mentioned in Service Worker caches locally but fails online, when deploying to gh-pages, your web app's content will normally be accessed from a subpath, not from the top-level path of the domain.
For instance, if your files are in the gh-pages branch of https://github.com/<user>/<repo>, then your web content can be accessed from https://<user>.github.io/<repo>/.
All of the URLs in your cacheFiles array are prefixed with /, which isn't what you want, given that all of your content is accessible under /<repo>/. For instance, / is interpreted as https://<user>.github.io/, which is different from https://<user>.github.io/<repo>/.
The solution to your problem, which will lead to a configuration that works regardless of what the base URL is for your hosting environment, is to prepend each of your URLs with ./ rather than /. For instance:
const cacheFiles = [
  './',
  './css/styles.css',
  // etc.
];
The ./ means that the URL is relative, with the location of the service worker file used as the base. Your service worker file will be deployed under https://<user>.github.io/<repo>/, so that will end up being the correct base URL to use for the rest of your content as well.
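To see how the two styles of URL resolve, here is a quick illustrative snippet (the <user>/<repo> values are placeholders, as above):
// Illustrative only: how the URLs resolve inside a service worker
// that is served from https://<user>.github.io/<repo>/sw.js
new URL('./css/styles.css', self.location.href).href;
// -> 'https://<user>.github.io/<repo>/css/styles.css'   (what you want)
new URL('/css/styles.css', self.location.href).href;
// -> 'https://<user>.github.io/css/styles.css'          (outside the project site, so cache.addAll() fails)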

What to change to prevent double request from service worker?

Please do not mark as duplicate. This is not an exact duplicate of the other similar questions here on SO. It's more specific and fully reproducible.
Clone this repo.
yarn && yarn dev
Go to localhost:3000 and make sure that under (F12) -> Application -> Service Workers, the service worker is installed.
Go to the Network tab and refresh a few times (F5).
Observe how the network requests are doubled.
Example of what I see:
Or, if you want to do it manually, follow the instructions below:
yarn create-next-app app_name
cd app_name && yarn
In the public folder, create a file called service-worker.js and paste the following code:
addEventListener("install", (event) => {
self.skipWaiting();
console.log("Service worker installed!");
});
self.addEventListener("fetch", (event) => {
event.respondWith(
(async function () {
const promiseChain = fetch(event.request.clone()); // clone makes no difference
event.waitUntil(promiseChain); // makes no difference
return promiseChain;
})()
);
});
Open pages/index.js and, just below import Head from "next/head";, paste the following code:
if (typeof window !== "undefined" && "serviceWorker" in navigator) {
window.addEventListener("load", function () {
// there probably needs to be some check if sw is already registered
navigator.serviceWorker
.register("/service-worker.js", { scope: "/" })
.then(function (registration) {
console.log("SW registered: ", registration);
})
.catch(function (registrationError) {
console.log("SW registration failed: ", registrationError);
});
});
}
yarn dev
Go to localhost:3000 and make sure the service worker has been loaded under (F12) -> Application -> Service Workers.
Go to the Network tab and refresh the page a few times. See how the service worker sends two requests for each one.
What do I need to change in the service-worker.js code so that there are no double requests?
This is how Chrome DevTools shows requests, and it is expected.
There is a request for a resource from the client JavaScript to the Service Worker and a request from the Service Worker to the server. This will always happen unless the service worker has the response cached and does not need to check the server for an update.
This does not seem like the right way to initialize a service worker in Next.js. You may want to look into the next-pwa plugin to do it right. Here is a tutorial: PWA with Next.js.
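For reference, wiring up next-pwa typically looks something like the sketch below (a minimal example based on the plugin's documented usage at the time; option names have changed between versions, so check the docs for yours):
// next.config.js -- minimal next-pwa setup (sketch)
const withPWA = require('next-pwa');

module.exports = withPWA({
  pwa: {
    dest: 'public',                                   // where the generated service worker is emitted
    disable: process.env.NODE_ENV === 'development',  // avoid caching surprises during development
  },
});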
If anyone is looking for an answer to the original question 'What to change to prevent double request from service worker?', specifically for network requests:
I've found a way to prevent it. Use the following in the serviceworker.js (this also works for API calls etc.):
self.addEventListener('fetch', async function(event) {
  await new Promise(function(res) { setTimeout(function() { res("fetch request allowed") }, 9999) })
  return false
});

Workbox precache putting items inside -temp cache, and not using it later [duplicate]

To enable my app to run offline, during installation the service worker should:
fetch a list of URLs from an async API
reformat the response
add all URLs in the response to the precache
For this task I use Googles Workbox in combination with Webpack.
The problem: While the service worker successfully caches all the Webpack assets (which tells me that Workbox basically does what it should), it does not wait for the async API call to cache the additional remote assets. They are simply ignored and neither cached nor ever fetched over the network.
Here is my service worker code:
importScripts('https://storage.googleapis.com/workbox-cdn/releases/3.1.0/workbox-sw.js');
workbox.skipWaiting();
workbox.clientsClaim();
self.addEventListener('install', (event) => {
  const precacheController = new workbox.precaching.PrecacheController();
  const preInstallUrl = 'https://www.myapiurl/Assets';
  event.waitUntil(fetch(preInstallUrl)
    .then(response => response.json()
      .then((Assets) => {
        Object.keys(Assets.data.Assets).forEach((key) => {
          precacheController.addToCacheList([Assets.data.Assets[key]]);
        });
      })
    )
  );
});
self.__precacheManifest = [].concat(self.__precacheManifest || []);
workbox.precaching.suppressWarnings();
workbox.precaching.precacheAndRoute(self.__precacheManifest, {});
workbox.routing.registerRoute(
  /^.*\.(jpg|JPG|gif|GIF|png|PNG|eot|woff(2)?|ttf|svg)$/,
  workbox.strategies.cacheFirst({
    cacheName: 'image-cache',
    plugins: [
      new workbox.cacheableResponse.Plugin({ statuses: [0, 200] }),
      new workbox.expiration.Plugin({ maxEntries: 600 })
    ]
  }),
  'GET'
);
And this is my webpack configuration for the workbox:
new InjectManifest({
  swDest: 'sw.js',
  swSrc: './src/sw.js',
  globPatterns: ['dist/*.{js,png,html,css,gif,GIF,PNG,JPG,jpeg,woff,woff2,ttf,svg,eot}'],
  maximumFileSizeToCacheInBytes: 5 * 1024 * 1024,
})
It looks like you're creating your own PrecacheController instance and also using precacheAndRoute(), which aren't actually intended to be used together (this isn't super well explained in the docs; it's only mentioned in this one place).
The problem is the helper methods on workbox.precaching.* actually create their own PrecacheController instance under the hood. Since you're creating your own PrecacheController instance and also calling workbox.precaching.precacheAndRoute([...]), you'll end up with two PrecacheController instances that aren't working together.
From your code sample, it looks like you're creating a PrecacheController instance because you want to load your list of files to precache at runtime. That's fine, but if you're going to do that, there are a few things to be aware of:
Your SW might not update
Service worker updates are usually triggered when you call navigator.serviceWorker.register() and the browser detects that the service worker file has changed. That means if you change what /Assets returns but the contents of your service worker file haven't changed, your service worker won't update. This is why most people hard-code their precache list in their service worker (since any changes to those files will trigger a new service worker installation).
You'll have to manually add your own routes
I mentioned before that workbox.precaching.precacheAndRoute([...]) creates its own PrecacheController instance under the hood. It also adds its own fetch listener manually to respond to requests. That means if you're not using precacheAndRoute(), you'll have to create your own router and define your own routes. Here are the docs on how to create routes: https://developers.google.com/web/tools/workbox/modules/workbox-routing.
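As a very rough illustration of what that means in practice, here is a bare-bones stand-in (a plain fetch handler sketch, not the Workbox routing API, and without precacheAndRoute()'s revision handling) that serves whatever the controller has already put into a cache:
// Minimal stand-in for the routing that precacheAndRoute() would normally set up.
// Sketch only: it simply looks the request up in the caches and falls back to the network.
self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached || fetch(event.request))
  );
});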
I realised my mistake. I hope this helps others as well. The problem was that I did not call precacheController.install() manually. While this function will be executed automatically, it will not wait for additional precache files that are inserted asynchronously. This is why the function needs to be called after all the precache entries have been added. Here is the working code:
importScripts('https://storage.googleapis.com/workbox-cdn/releases/3.1.0/workbox-sw.js');
workbox.skipWaiting();
workbox.clientsClaim();
const precacheController = new workbox.precaching.PrecacheController();
// Hook into install event
self.addEventListener('install', (event) => {
  // Get API URL passed as query parameter to service worker
  const preInstallUrl = new URL(location).searchParams.get('preInstallUrl');
  // Fetch precaching URLs and attach them to the cache list
  const assetsLoaded = fetch(preInstallUrl)
    .then(response => response.json())
    .then((values) => {
      Object.keys(values.data.Assets).forEach((key) => {
        precacheController.addToCacheList([values.data.Assets[key]]);
      });
    })
    .then(() => {
      // After all assets are added install them
      precacheController.install();
    });
  event.waitUntil(assetsLoaded);
});
self.__precacheManifest = [].concat(self.__precacheManifest || []);
workbox.precaching.suppressWarnings();
workbox.precaching.precacheAndRoute(self.__precacheManifest, {});
workbox.routing.registerRoute(
  /^.*\.(jpg|JPG|gif|GIF|png|PNG|eot|woff(2)?|ttf|svg)$/,
  workbox.strategies.cacheFirst({
    cacheName: 'image-cache',
    plugins: [
      new workbox.cacheableResponse.Plugin({ statuses: [0, 200] }),
      new workbox.expiration.Plugin({ maxEntries: 600 })
    ]
  }),
  'GET'
);

pwa - Service worker does not successfully serve the manifest's start_url

I am trying to add PWA functionality to an existing website that is hosted on Azure and uses Cloudflare CDN.
I have run the lighthouse testing tool on the site and it passes everything in the PWA section (e.g. service worker installed, served over https, manifest installed etc.) except:
"Service worker does not successfully serve the manifest's start_url."
My manifest.json has '/' as the start URL and "/" as the scope.
The '/' is actually default.aspx which I have cached as well.
My service worker caches '/', e.g.
var cacheShellFiles = [
  '/',
  '/manifest.json',
  '/index.html',
  '/scripts/app.js',
  '/styles/inline.css'
  ...
];
// install - cache the app shell
self.addEventListener('install', function (event) {
  console.log('From SW install: ', event);
  // calling skipWaiting() means the SW will skip the waiting state and immediately
  // activate, even if other tabs are open that use the previous SW
  self.skipWaiting();
  event.waitUntil(
    caches.open(CACHE_NAME_SHELL)
      .then(function (cache) {
        console.log('Cache opened');
        return cache.addAll(cacheShellFiles);
      })
  );
});
When I view the Cache Storage files in dev tools however, the Content-Length of the / and the .css and .js files is 0:
Image of Chrome Developer tools showing cache storage with Content-Length=0
Is the Content-Length = 0 the reason that it is saying it can't serve the manifest's start URL ?
This is an issue with your service worker's scope (different from the scope option in manifest.json).
Your start_url is set to /, but most likely your service worker file is served from a deeper path, e.g. /some-path/service-worker.js. In this case, your service worker's scope is /some-path/, therefore it will not be able to handle requests to paths outside of it, such as the root path /.
To fix this, you need to make sure that your service worker's scope covers your start_url. I can think of two ways to do this:
In your case, serve the service worker file directly from the root path, e.g. /service-worker.js.
Use the Service-Worker-Allowed response header, which overrides the service worker's scope, so that it doesn't matter which path the service worker file is served from.
Choose the one that is more appropriate to your setup.
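If you go with the second option, whatever serves the service worker file needs to send that header, and the page can then register the worker with a wider scope. A rough sketch (the Express setup and the /some-path/ location are illustrative assumptions, not details from the question):
// Server side (Express, illustrative): send the header along with the service worker file
const path = require('path');
const express = require('express');
const app = express();

app.get('/some-path/service-worker.js', (req, res) => {
  res.set('Service-Worker-Allowed', '/'); // allow a scope above the file's own path
  res.sendFile(path.join(__dirname, 'some-path', 'service-worker.js'));
});

// Page side: explicitly ask for the root scope when registering
navigator.serviceWorker.register('/some-path/service-worker.js', { scope: '/' });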

Setting service worker to exclude certain urls only

I built an app using create react which by default includes a service worker. I want the app to be run anytime someone enters the given url except when they go to /blog/, which is serving a set of static content. I use react router in the app to catch different urls.
I have nginx set up to serve /blog/, and it works fine if someone visits /blog/ without visiting the react app first. However, because the service worker has a scope of ./, anytime someone visits any URL other than /blog/, the app loads the service worker. From that point on, the service worker bypasses the connection to the server and /blog/ loads the react app instead of the static content.
Is there a way to have the service worker load on all urls except /blog/?
So, considering you have not posted any code relevant to the service worker, you might consider adding a simple if conditional inside the code block for fetch.
This code block should already be there inside your service worker; just add the conditionals:
self.addEventListener('fetch', function (event) {
  if (event.request.url.match('^.*(\/blog\/).*$')) {
    return false;
  }
  // OR
  if (event.request.url.indexOf('/blog/') !== -1) {
    return false;
  }
  // **** rest of your service worker code ****
});
Note that you can use either the regex or the prototype method indexOf, per your whim.
The above would direct your service worker to simply do nothing when the URL matches /blog/, so those requests go to the network as usual.
Another way to blacklist URLs, i.e. exclude them from being served from the cache, when you're using Workbox is workbox.routing.registerNavigationRoute:
workbox.routing.registerNavigationRoute("/index.html", {
blacklist: [/^\/api/,/^\/admin/],
});
The example above demonstrates this for a SPA where all routes are cached and mapped into index.html except for any URL starting with /api or /admin.
Here's what's working for us in the latest CRA version:
// serviceWorker.js
window.addEventListener('load', () => {
  if (isAdminRoute()) {
    console.info('unregistering service worker for admin route')
    unregister()
    console.info('reloading')
    window.location.reload()
    return false
  }
  // ...the rest of the existing registration logic stays as-is
})
We exclude all routes under /admin from the service worker, since we are using a different app for our admin area. You can of course change it to anything you like; here's our function at the bottom of the file:
function isAdminRoute() {
  return window.location.pathname.startsWith('/admin')
}
Here's how you do it in 2021:
import {NavigationRoute, registerRoute} from 'workbox-routing';

const navigationRoute = new NavigationRoute(handler, {
  allowlist: [
    new RegExp('/blog/'),
  ],
  denylist: [
    new RegExp('/blog/restricted/'),
  ],
});

registerRoute(navigationRoute);
How to Register a Navigation Route
If you are using or willing to use customize-cra, the solution is quite straightforward.
Put this in your config-overrides.js:
const { adjustWorkbox, override } = require("customize-cra");

module.exports = override(
  adjustWorkbox(wb =>
    Object.assign(wb, {
      navigateFallbackWhitelist: [
        ...(wb.navigateFallbackWhitelist || []),
        /^\/blog(\/.*)?/,
      ],
    })
  )
);
Note that in the newest workbox documentation, the option is called navigateFallbackAllowlist instead of navigateFallbackWhitelist. So, depending on the version of CRA/workbox you use, you might need to change the option name.
The regexp /^\/blog(\/.*)?/ matches /blog, /blog/, /blog/abc123, etc.
Try using the sw-precache library to overwrite the current service-worker.js file that is running the cache strategy. The most important part is setting up the config file (I will paste the one I used with create-react-app below).
Install sw-precache (e.g. yarn add sw-precache)
Create and specify the config file, which indicates which URLs not to cache
Modify the build script command to make sure sw-precache runs and overwrites the default service-worker.js file in the build output directory
I named my config file sw-precache-config.js and specified it in the build script command in package.json. Contents of the file are below. The part to pay particular attention to is the runtimeCaching key/option.
"build": "NODE_ENV=development react-scripts build && sw-precache --config=sw-precache-config.js"
CONFIG FILE: sw-precache-config.js
module.exports = {
  staticFileGlobs: [
    'build/*.html',
    'build/manifest.json',
    'build/static/**/!(*map*)',
  ],
  staticFileGlobsIgnorePatterns: [/\.map$/, /asset-manifest\.json$/],
  swFilePath: './build/service-worker.js',
  stripPrefix: 'build/',
  runtimeCaching: [
    {
      urlPattern: /dont_cache_me1/,
      handler: 'networkOnly'
    },
    {
      urlPattern: /dont_cache_me2/,
      handler: 'networkOnly'
    }
  ]
}
Update (new working solution)
In the last major release of Create React App (version 4.x.x), you can easily implement your own custom service-worker.js without ejecting. See: customizing the service worker.
Starting with Create React App 4, you have full control over customizing the logic in this service worker, by creating your own src/service-worker.js file, or customizing the one added by the cra-template-pwa (or cra-template-pwa-typescript) template. You can use additional modules from the Workbox project, add in a push notification library, or remove some of the default caching logic.
You have to upgrade react-scripts to version 4 if you are currently using an older version.
Working solution for CRA v4
Add the following code to the file service-worker.js, inside the anonymous function passed to the registerRoute method.
// If this is a backend URL, skip
if (url.pathname.startsWith("/backend")) {
  return false;
}
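For context, in the CRA 4 (cra-template-pwa) service-worker.js that check sits alongside the template's existing conditions, inside the matcher passed to registerRoute. Roughly like this (an abbreviated sketch of the template; your generated file may differ slightly):
// src/service-worker.js (CRA 4 template, abbreviated sketch)
import { createHandlerBoundToURL, precacheAndRoute } from 'workbox-precaching';
import { registerRoute } from 'workbox-routing';

precacheAndRoute(self.__WB_MANIFEST);

const fileExtensionRegexp = new RegExp('/[^/?]+\\.[^/]+$');
registerRoute(
  ({ request, url }) => {
    if (request.mode !== 'navigate') {
      return false; // only handle page navigations
    }
    if (url.pathname.startsWith('/_')) {
      return false; // template default: skip URLs starting with /_
    }
    if (url.pathname.startsWith('/backend')) {
      return false; // the added check: let backend URLs go straight to the network
    }
    if (url.pathname.match(fileExtensionRegexp)) {
      return false; // template default: skip anything that looks like a file
    }
    return true;
  },
  createHandlerBoundToURL(process.env.PUBLIC_URL + '/index.html')
);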
To simplify things, we can add an array list of items to exclude, and add a search into the fetch event listener.
Include and Exclude methods below for completeness.
var offlineInclude = [
  '',              // index.html
  'sitecss.css',
  'js/sitejs.js'
];
var offlineExclude = [
  '/networkimages/bigimg.png',    // exclude a file
  '/networkimages/smallimg.png',
  '/admin/'                       // exclude a directory
];
self.addEventListener("install", function(event) {
console.log('WORKER: install event in progress.');
event.waitUntil(
caches
.open(version + 'fundamentals')
.then(function(cache) {
return cache.addAll(offlineInclude);
})
.then(function() {
console.log('WORKER: install completed');
})
);
});
self.addEventListener("fetch", function(event) {
console.log('WORKER: fetch event in progress.');
if (event.request.method !== 'GET') {
console.log('WORKER: fetch event ignored.', event.request.method, event.request.url);
return;
}
for (let i = 0; i < offlineExclude.length; i++)
{
if (event.request.url.indexOf(offlineExclude[i]) !== -1)
{
console.log('WORKER: fetch event ignored. URL in exclude list.', event.request.url);
return false;
}
}
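The listing above stops at the exclusion check; the part that actually serves the remaining requests would follow it inside the same fetch listener. A typical cache-first completion might look like this (a sketch, not part of the original answer):
// Inside the fetch listener, after the exclude loop:
event.respondWith(
  caches.match(event.request).then(function(cached) {
    // Serve from the cache when we have a match, otherwise go to the network
    return cached || fetch(event.request);
  })
);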

Manifest start_url is not cached by a Service Worker

I'm using Lighthouse to audit my webapp. I'm working through the failures, but I'm stuck on this one:
Failures: Manifest start_url is not cached by a Service Worker.
In my manifest.json I have
"start_url": "index.html",
In my worker.js I am caching the following:
let CACHE_NAME = 'my-site-cache-v1';
let urlsToCache = [
  '/',
  '/scripts/app.js',
  '/index.html'
];
Which lines up with what I see in the Application tab in Chrome Dev tools:
So... why is it telling me start_url is not cached?
Here is my full worker.js file:
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/worker.js').then(function(registration) {
      console.log('ServiceWorker registration successful with scope: ', registration.scope);
    }, function(err) {
      console.log('ServiceWorker registration failed: ', err);
    });
  });
}
let CACHE_NAME = 'my-site-cache-v1.1';
let urlsToCache = [
  '/',
  '/scripts/app.js',
  '/index.html'
];
self.addEventListener('install', function(event) {
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(function(cache) {
        console.log('Opened cache');
        return cache.addAll(urlsToCache);
      })
  );
});
Let's look at Lighthouse's source code
static assessOfflineStartUrl(artifacts, result) {
  const hasOfflineStartUrl = artifacts.StartUrl.statusCode === 200;
  if (!hasOfflineStartUrl) {
    result.failures.push('Manifest start_url is not cached by a service worker');
  }
}
We can notice that it's not checking your cache, but the response of the entry point. The reason for that must be that your service worker is not sending a proper Response on fetch.
You'll know that it's working if, in DevTools, the first request shows (from ServiceWorker) in the Size column:
There are two problems with the code you've provided:
The first one is that you're mixing service worker code with service worker registration code. The registration code should be executed on your webpage.
This code should be included on your page:
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/worker.js').then(function(registration) {
      console.log('ServiceWorker registration successful with scope: ', registration.scope);
    }, function(err) {
      console.log('ServiceWorker registration failed: ', err);
    });
  });
}
and the rest of what you've pasted should be your worker.js code. However, the service worker does get installed, because you have files in the cache, so I suspect you just pasted this incorrectly.
The second (real) problem is that the service worker is not returning these cached files. As I showed earlier, that error from Lighthouse means that the service worker is not returning the start_url entry file.
The most basic code to achieve that is:
self.addEventListener('fetch', function(event) {
  event.respondWith(caches.match(event.request));
});
A service worker is event-driven, so when your page wants to get some resource, the service worker reacts and serves the one from the cache. In the real world, you really don't want to use it like that, because you need some kind of fallback; a common fallback pattern is sketched below. I strongly recommend reading the section Serving files from the cache here.
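For illustration, a common cache-first pattern with a network fallback looks like this (a generic sketch, not code from the question):
self.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.match(event.request).then(function(cached) {
      // Serve from the cache when possible, otherwise go to the network
      return cached || fetch(event.request);
    })
  );
});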
Edit: I've created a pull request against the Lighthouse source code to clarify that error message.
It seems to be that Chrome lighthouse (chrome v62) performs a generic fetch(). See discussion on https://github.com/GoogleChrome/lighthouse/issues/2688#issuecomment-315394447
In my case, an offline.html is served inside an "if (event.request.mode === 'navigate'){" block.
Due to the use of Lighthouse's generic fetch(), Lighthouse will not get served this offline.html, and shows the "Manifest start_url is not cached by a Service Worker" error.
I solved this problem by replacing:
if (event.request.mode === 'navigate'){
with
if (event.request.method === 'GET' ){
