Service worker spreading

I have a service worker for caching images. It is only registered within the frontend template, but it still keeps spreading into my admin template.
This causes my forms to behave unpredictably, because the validation tokens are affected as well.
With some console.log statements I figured out that the install event is triggered before the requested page is reached, but I'm unable to determine the current/next URL there.
How can I prevent the service worker from spreading to the admin panel and interfering with its pages? I only want assets to be cached.
This is my service worker, as far as it is relevant:
const PRECACHE = 'precache-v1.0.0';
const RUNTIME = 'runtime';

// A list of local resources we always want to be cached.
const PRECACHE_URLS = [
  "public",
  "media",
  "unify",
];

importScripts('./cache-polyfill.js');

// The install handler takes care of precaching the resources we always need.
self.addEventListener('install', function(event) {
  console.log('installing resources');
  event.waitUntil(
    caches.open(PRECACHE)
      //.then(cache => cache.addAll(PRECACHE_URLS))
      // pass a callback so skipWaiting() runs after the cache is opened
      .then(() => self.skipWaiting())
  );
});
// The activate handler takes care of cleaning up old caches.
self.addEventListener('activate', function(event) {
  const currentCaches = [PRECACHE, RUNTIME];
  event.waitUntil(
    caches.keys().then(cacheNames => {
      return cacheNames.filter(cacheName => !currentCaches.includes(cacheName));
    }).then(cachesToDelete => {
      return Promise.all(cachesToDelete.map(cacheToDelete => {
        return caches.delete(cacheToDelete);
      }));
    }).then(() => self.clients.claim())
  );
});
// The fetch handler serves responses for same-origin resources from a cache.
// If no response is found, it populates the runtime cache with the response
// from the network before returning it to the page.
self.addEventListener('fetch', event => {
  // Only handle GET requests; skip POSTs and other methods.
  if (event.request.method === "GET") {
    // PRECACHE_URLS is an array, so test each entry against the request URL.
    if (PRECACHE_URLS.some(path => event.request.url.indexOf(path) > -1)) {
      console.log("fetching " + event.request.url + " by the service worker");
      event.respondWith(
        caches.match(event.request).then(cachedResponse => {
          if (cachedResponse) {
            return cachedResponse;
          }
          return caches.open(RUNTIME).then(cache => {
            return fetch(event.request).then(response => {
              // Put a copy of the response in the runtime cache.
              return cache.put(event.request, response.clone()).then(() => {
                console.log('cached: ' + event.request.url);
                return response;
              });
            });
          });
        })
      );
    } else {
      console.log("fetching " + event.request.url + " by service worker blocked, it's not a resource");
    }
  }
  // Requests we don't respond to fall through to the browser's default handling.
});

The problem is most likely that your admin pages lie inside the SW scope. This means that your SW controls, e.g., everything under /, while your admin pages are located in /admin/ or similar.
You can prevent the behaviour by checking the fetch requests your SW is intercepting. Something like:
if (event.request.url.match('^.*(\/admin\/).*$')) {
  return false;
}
This should be the first thing in the SW's fetch listener. It checks whether it received a request for something from the admin pages and then cancels out if it did. Otherwise, it continues normally.
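A slightly more robust variant of the same check (a sketch on my part; the /admin/ prefix is an assumption about your routing) parses the URL instead of regex-matching the whole string:

self.addEventListener('fetch', event => {
  // Assumption: admin pages live under /admin/ on the same origin.
  const url = new URL(event.request.url);
  if (url.pathname.startsWith('/admin/')) {
    return; // no respondWith(), so the browser handles the request normally
  }
  // ... existing caching logic continues here ...
});

If the frontend lives entirely under its own path, registering the worker with a narrower scope option would also keep /admin/ outside its control.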

Related

Background sync doesn't refresh page when back online

Recently I started learning about service workers and background sync. I implemented a service worker, and in the install step I cached some files I want to show when offline.
self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE)
      .then((cache) => {
        return cache.addAll([navigationIcon, offlineImage, offlineFallbackPage]);
      })
  );
});
I am listening to the fetch event to catch when there is no internet connection, so I can show the offline page then.
self.addEventListener('fetch', (event) => {
  if (event.request.mode === 'navigate' || (event.request.method === 'GET'
      && event.request.headers.get('accept').includes('text/html'))) {
    event.respondWith(
      fetch(event.request.url)
        .catch(() => {
          // Return the offline page
          return caches.match(offlineFallbackPage);
        })
    );
  } else {
    event.respondWith(caches.match(event.request)
      .then((response) => {
        return response || fetch(event.request);
      }));
  }
});
I also added background sync, so the page can go back online when there is an internet connection.
After registering the service worker I added:
.then(swRegistration => swRegistration.sync.register('backOnline'))
And I listen to the sync event in my service worker.
When I'm offline and go back online, nothing happens. BUT when I delete my fetch event (i.e. don't show the previously cached offline page), the page goes back online by itself, which is what I want to happen while keeping the fetch event.
Does anyone know what I should add so the page can go back online by itself?
You can use navigator.onLine. Include it in your main JS file that is cached, or in your service worker JS file; just ensure it's cached:
let onlineStatus = locate => {
  if (navigator.onLine) {
    location.replace(locate)
  }
}

// kindly edit isOfflinePage to return true if it's the offline page
let isOfflinePage = window.location.pathname == offlineFallbackPage;

if (isOfflinePage) {
  onlineStatus('index.html')
}
You can simply use location.reload() instead.
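A related option (my own suggestion, not part of the original answer) is to listen for the browser's online event from a script loaded by the cached offline page, so it reloads itself as soon as connectivity returns:

// When connectivity returns, reload so the service worker
// can fetch the live page from the network again.
window.addEventListener('online', () => {
  location.reload();
});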

What's the right way to implement offline fallback with workbox

I am implementing PWA in my project. I have set up serviceworker.js, and I am using Workbox for cache routing and strategies.
1- I add the offline page to the cache in the install event, when a user first visits the site:
/**
 * Add on install
 */
self.addEventListener('install', (event) => {
  const urls = ['/offline/'];
  const cacheName = workbox.core.cacheNames.runtime;
  event.waitUntil(caches.open(cacheName).then((cache) => cache.addAll(urls)));
});
2- Catch & cache pages with a specific regex, like these:
https://website.com/posts/the-first-post
https://website.com/posts/
https://website.com/articles/
workbox.routing.registerRoute(
  new RegExp('/posts|/articles'),
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'pages-cache'
  })
);
3- Catch errors and display the offline page when there's no internet connection.
/**
 * Handling Offline Page fallback
 */
this.addEventListener('fetch', event => {
  if (event.request.mode === 'navigate' || (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html'))) {
    event.respondWith(
      fetch(event.request.url).catch(error => {
        // Return the offline page
        return caches.match('/offline/');
      })
    );
  } else {
    // Respond with everything else if we can
    event.respondWith(caches.match(event.request)
      .then(function (response) {
        return response || fetch(event.request);
      })
    );
  }
});
Now this is working for me so far if I visit, for example, https://website.com/contact-us/. But if I visit any URL within the scope I defined earlier for "pages-cache", like https://website.com/articles/231/, the /offline page is not returned, since it's not in the user's cache, and I get a regular browser error.
There's an issue in how errors are handled when there's a specific caching route in Workbox.
Is this the best method to apply for an offline fallback? How can I catch errors from these paths ('/articles' & '/posts') and display an offline page?
Please refer as well to this answer, where there's a different approach to applying the fallback with Workbox. I tried it as well, with the same results. Not sure which is the accurate approach for this.
I found a way to do it right with Workbox. For each route I add a fallback method like this:
const offlinePage = '/offline/';

/**
 * Pages to cache
 */
workbox.routing.registerRoute(/\/posts.|\/articles/,
  async ({event}) => {
    try {
      return await workbox.strategies.staleWhileRevalidate({
        cacheName: 'cache-pages'
      }).handle({event});
    } catch (error) {
      return caches.match(offlinePage);
    }
  }
);
If you're using the network-first strategy, this is the method:
/**
 * Pages to cache (networkFirst)
 */
var networkFirst = workbox.strategies.networkFirst({
  cacheName: 'cache-pages'
});

const customHandler = async (args) => {
  try {
    const response = await networkFirst.handle(args);
    return response || await caches.match(offlinePage);
  } catch (error) {
    return await caches.match(offlinePage);
  }
};

workbox.routing.registerRoute(
  /\/posts.|\/articles/,
  customHandler
);
More details in the Workbox documentation: Provide a fallback response to a route.
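That documentation page also describes a global catch handler, which can be simpler than wrapping every route; a minimal sketch (assuming the same /offline/ page is already precached, and a Workbox version that exposes workbox.routing.setCatchHandler):

// Runs whenever a registered route's handler throws or rejects.
workbox.routing.setCatchHandler(({event}) => {
  // Fall back to the offline page for page navigations only.
  if (event.request.mode === 'navigate') {
    return caches.match('/offline/');
  }
  return Response.error();
});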

Service Worker caching not recognizing timeout as a function

I was watching Steve Sanderson's NDC presentation on up-and-coming web features, and saw his caching example as a prime candidate for an application I am developing. I couldn't find the code, so I have typed it up from the YouTube video as well as I could.
Unfortunately it doesn't work in Chrome (which is also what he is using in the demo). It fails with Uncaught TypeError: fetch(...).then(...).timeout is not a function at self.addEventListener.event.
I trawled through Steve's GitHub and found no trace of this, nor could I find anything on the NDC Conference page.
// inspiration:
// https://www.youtube.com/watch?v=MiLAE6HMr10
//self.importScripts('scripts/util.js');
console.log('Service Worker script running');

self.addEventListener('install', event => {
  console.log('WORKER: installing');
  const urlsToCache = ['/ServiceWorkerExperiment/', '/ServiceWorkerExperiment/scripts/page.js'];
  caches.delete('mycache');
  event.waitUntil(
    caches.open('mycache')
      .then(cache => cache.addAll(urlsToCache))
      .then(_ => self.skipWaiting())
  );
});

self.addEventListener('fetch', event => {
  console.log(`WORKER: Intercepted request for ${event.request.url}`);
  if (event.request.method !== 'GET') {
    return;
  }
  event.respondWith(
    fetch(event.request)
      .then(networkResponse => {
        console.log(`WORKER: Updating cached data for ${event.request.url}`);
        var responseClone = networkResponse.clone();
        caches.open('mycache').then(cache => cache.put(event.request, responseClone));
        return networkResponse;
      })
      // if network fails or is too slow, return cached data
      // reference for this code: https://youtu.be/MiLAE6HMr10?t=1003
      .timeout(200)
      .catch(_ => {
        console.log(`WORKER: Serving ${event.request.url} from CACHE`);
        return caches.match(event.request);
      })
  );
});
As far as I can tell from the fetch() documentation, there is no timeout function, so my assumption is that it is added in the util.js which is never shown in the presentation. Can anyone confirm this? And does anyone have an idea of how it is implemented?
Future:
It's coming.
According to Jake Archibald's comment on whatwg/fetch, the future syntax will be:
Using the abort syntax, you'll be able to do:
const controller = new AbortController();
const signal = controller.signal;
const fetchPromise = fetch(url, {signal});
// 5 second timeout:
const timeoutId = setTimeout(() => controller.abort(), 5000);
const response = await fetchPromise;
// …
If you only wanted to timeout the request, not the response, add:
clearTimeout(timeoutId);
// …
And from another comment:
Edge & Firefox are already implementing. Chrome will start shortly.
Now:
If you want a solution that works now, the most sensible way is to use this module.
It allows you to use syntax like:
return fetch('/path', {timeout: 500}).then(function() {
  // successful fetch
}).catch(function(error) {
  // network request failed / timeout
})
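If you'd rather avoid a dependency, the same effect can be approximated by racing the fetch against a timer (a sketch of the general technique, not the module's actual implementation; note the underlying request is not aborted when the timer wins):

function fetchWithTimeout(request, ms) {
  // Reject if the network doesn't answer within `ms` milliseconds.
  const timer = new Promise((_, reject) =>
    setTimeout(() => reject(new Error('fetch timeout')), ms)
  );
  return Promise.race([fetch(request), timer]);
}

// usage inside a fetch handler:
// event.respondWith(
//   fetchWithTimeout(event.request, 200)
//     .catch(() => caches.match(event.request))
// );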

ServiceWorker not receiving fetch requests

I am installing a service worker for the first time, following the tutorial at: https://developers.google.com/web/fundamentals/getting-started/primers/service-workers
My service worker behaves as expected when installing and updating, but fetch events are not triggered as expected.
var CACHE_NAME = 'test-cache-v1'
var urlsToCache = [
  '/',
  '/public/scripts/app.js'
]

self.addEventListener('install', function (event) {
  console.log('Installing new service worker', event)
  // Perform install steps
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(function (cache) {
        return cache.addAll(urlsToCache)
      })
      .catch(err => console.log('Error Caching', err))
  )
})
self.addEventListener('fetch', function (event) {
  console.log('Fetch req', event)
  event.respondWith(
    caches.match(event.request)
      .then(function (response) {
        console.log('Cache hit', response)
        // Cache hit - return response
        if (response) {
          return response
        }
        return fetch(event.request)
          .catch(e => console.log('Error matching cache', e))
      })
  )
})
I see 'Installing new service worker' output to the console when expected, but not 'Fetch req'. I am using Chrome DevTools and have used the "Inspect" option next to the service worker under the Application tab.
If you listen for the activate event and add a call to clients.claim() inside that handler, then your newly active service worker will take control of existing web pages in its scope, including the page that registered it. There's more information in this article on the service worker lifecycle. The following code is sufficient:
self.addEventListener('activate', () => self.clients.claim());
If you don't call clients.claim(), then the service worker will activate, but not control any of the currently open pages. It won't be until you navigate to the next page under its scope (or reload a current page) that the service worker will take control, and start intercepting network requests via its fetch handler.
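For completeness, a common pattern (a sketch, not something the answer above prescribes) pairs skipWaiting() in the install handler with clients.claim() in activate, so a new worker both activates immediately and takes control of open pages:

self.addEventListener('install', (event) => {
  // Activate the new worker as soon as installation finishes.
  event.waitUntil(self.skipWaiting());
});

self.addEventListener('activate', (event) => {
  // Take control of all open pages in scope without waiting for a reload.
  event.waitUntil(self.clients.claim());
});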
On dynamic websites, be careful!
If a service worker has scope example.com/weather/, it does not have scope example.com/weather.
This matters especially on Firebase, which by default removes the trailing slash.
In that case, the service worker will install, activate, and even cache files, but it will never receive fetch events! This is very hard to debug.
Add "trailingSlash": true to firebase.json under 'hosting'. This will solve the problem. Make sure to modify the rewrite from:
{
  "source": "/weather", "function": "weather"
}
to:
{
  "source": "/weather/", "function": "weather"
}
Update manifest.json as well.
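Putting it together, the relevant part of firebase.json might look like this (a sketch; your function name and routes will differ):

{
  "hosting": {
    "trailingSlash": true,
    "rewrites": [
      { "source": "/weather/", "function": "weather" }
    ]
  }
}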
I found that Jeff Posnick's clients.claim() in the activate event handler was useful, but it was not enough to cache resources the first time the JS app runs, because on the first run the service worker has not finished activating when the JS starts loading its resources.
The following function lets the main app register the SW and then wait for it to activate before continuing to load resources:
/**
 * Registers service worker and waits until it is activated or failed.
 * @param js URI of service worker JS
 * @param onReady function to call when service worker is activated or failed
 * @param maxWait maximum time to wait in milliseconds
 */
function registerServiceWorkerAndWaitForActivated(js, onReady, maxWait) {
  let bReady = false;
  function setReady() {
    if (!bReady) {
      bReady = true;
      onReady();
    }
  }
  if ('serviceWorker' in navigator) {
    setTimeout(setReady, maxWait || 1000);
    navigator.serviceWorker.register(js).then((reg) => {
      let serviceWorker = reg.installing || reg.waiting;
      if (serviceWorker) {
        serviceWorker.addEventListener("statechange", (e) => {
          if (serviceWorker.state == "activated")
            setReady();
        });
      } else {
        if (!reg.active)
          console.log("Unknown service worker state");
        setReady();
      }
    }, () => setReady());
  } else {
    let msg = "ServiceWorker not available. App will not run offline.";
    if (document.location.protocol != "https:")
      msg = "Please use HTTPS so app can run offline later.";
    console.warn(msg);
    alert(msg);
    setReady();
  }
}
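A hypothetical call site (the file name and bootstrap function are illustrative):

// Wait up to 2 seconds for sw.js to activate before booting the app.
registerServiceWorkerAndWaitForActivated('/sw.js', () => {
  loadAppResources(); // hypothetical function that starts loading the app's resources
}, 2000);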

Ignore ajax requests in service worker

I have an app with a basic 'shell' of HTML, CSS and JS. The main content of the page is loaded via multiple ajax calls to an API at a different URL from the one my app is running on. I have set up a service worker to cache the main 'shell' of the application:
var urlsToCache = [
  '/',
  'styles/main.css',
  'scripts/app.js',
  'scripts/apiService.js',
  'third_party/handlebars.min.js',
  'third_party/handlebars-intl.min.js'
];
and to respond with the cached version when requested. The problem I am having is that the responses of my ajax calls are also being cached. I'm pretty sure I need to add some code to the fetch event of the service worker so that these calls always go to the network rather than the cache.
Here is my fetch event:
self.addEventListener('fetch', function (event) {
  // ignore anything other than GET requests
  var request = event.request;
  if (request.method !== 'GET') {
    event.respondWith(fetch(request));
    return;
  }

  // handle other requests
  event.respondWith(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.match(event.request).then(function (response) {
        return response || fetch(event.request).then(function (response) {
          cache.put(event.request, response.clone());
          return response;
        });
      });
    })
  );
});
I'm not sure how I can ignore the requests to the API. I've tried doing this:
if (request.url.indexOf(myAPIUrl) !== -1) {
  event.respondWith(fetch(request));
  return;
}
but according to the network tab in Chrome DevTools, all of these responses are still coming from the service worker.
You do not have to use event.respondWith(fetch(request)) to handle requests that you want to ignore. If you return without calling event.respondWith, the browser will fetch the resource for you.
You can do something like:
if (request.method !== 'GET') { return; }
if (request.url.indexOf(myAPIUrl) !== -1) { return; }

// handle all other requests
event.respondWith(/* return promise here */);
In other words, as long as you can determine synchronously that you don't want to handle the request, you can simply return from the handler and let the default request processing take over. Check out this example.
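Applied to the worker above, the complete listener might look like this (a sketch; myAPIUrl and CACHE_NAME are the variables from the question):

self.addEventListener('fetch', function (event) {
  var request = event.request;

  // Let the browser handle non-GET requests and API calls natively.
  if (request.method !== 'GET') { return; }
  if (request.url.indexOf(myAPIUrl) !== -1) { return; }

  // Cache-first for everything else.
  event.respondWith(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.match(request).then(function (response) {
        return response || fetch(request).then(function (networkResponse) {
          cache.put(request, networkResponse.clone());
          return networkResponse;
        });
      });
    })
  );
});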
