I'm having trouble with my Service Worker. I have implemented it with the "cache then network" technique, where content is first served from the cache, while a network fetch is always performed and the result is cached on success. (Inspired by this solution on CSS-Tricks.)
When I make changes to my web app and hit refresh, the first time I of course get the old content. But on subsequent refreshes the content alternates between old and new: I can get the new or the old content five times in a row, or it can differ on each request.
I have been debugging the Service Worker for a while now and am not getting any wiser. Does anyone have an idea what's wrong with the implementation?
EDIT:
var version = 'v1::2';

self.addEventListener("install", function (event) {
  event.waitUntil(
    caches
      .open(version + 'fundamentals')
      .then(function (cache) {
        return cache.addAll([
          "/"
        ]);
      })
  );
});

self.addEventListener("fetch", function (event) {
  if (event.request.method !== 'GET') {
    return;
  }

  event.respondWith(
    caches
      .match(event.request)
      .then(function (cached) {
        var networked = fetch(event.request)
          .then(fetchedFromNetwork, unableToResolve)
          .catch(unableToResolve);

        return cached || networked;

        function fetchedFromNetwork(response) {
          var cacheCopy = response.clone();
          caches
            .open(version + 'pages')
            .then(function add(cache) {
              cache.put(event.request, cacheCopy);
            });
          return response;
        }

        function unableToResolve() {
          return new Response('<h1>Service Unavailable</h1>', {
            status: 503,
            statusText: 'Service Unavailable',
            headers: new Headers({
              'Content-Type': 'text/html'
            })
          });
        }
      })
  );
});

self.addEventListener("activate", function (event) {
  event.waitUntil(
    caches
      .keys()
      .then(function (keys) {
        return Promise.all(
          keys
            .filter(function (key) {
              return !key.startsWith(version);
            })
            .map(function (key) {
              return caches.delete(key);
            })
        );
      })
  );
});
I don't see how you are setting the version, but I presume multiple caches still exist (I can see you are trying to delete the previous caches, but still). caches.match() is a convenience method that searches across all caches, and the order in which they are queried is not guaranteed (at least Chrome seems to query the oldest one first). Chrome Developer Tools shows you the existing caches (Application / Cache / Cache Storage) and their contents. If you want to query a specific cache, you'll need to do:

caches.open(currentCacheName).then(function(cache) { ... });
as in the example in the Cache documentation.
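For instance, the lookup in the fetch handler above could be pinned to the cache that belongs to the current version. A minimal sketch, matching only the version + 'pages' runtime cache (the precached version + 'fundamentals' entries would need the same treatment):

self.addEventListener("fetch", function (event) {
  if (event.request.method !== 'GET') {
    return;
  }
  event.respondWith(
    // Query only the cache that belongs to the current version,
    // instead of letting caches.match() pick from all caches.
    caches
      .open(version + 'pages')
      .then(function (cache) {
        return cache.match(event.request);
      })
      .then(function (cached) {
        // Fall back to the network when the versioned cache has no entry.
        return cached || fetch(event.request);
      })
  );
});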
Related
I'm trying to build a web app that will work offline.
I found the JS Service Worker and have now implemented it in my app to store some static pages.
Now I'd like to build an HTML form where the user fills in stuff that is saved to the cache if the user is offline, but sent directly to MySQL when the user is online.
It would be like a list of queries that execute when the user comes online.
How can I save a query string to the cache, then check if the user is online and send it with Ajax to PHP and MySQL?
First off, how do I save a query string to the cache?
Second, how do I find out when the user is online and then fetch the query string from the cache?
This is what I've got to cache my pages:
importScripts('cache-polyfill.js');

self.addEventListener('install', function(e) {
  e.waitUntil(
    caches.open('offlineList').then(function(cache) {
      return cache.addAll([
        '/app/offline_content/'
      ]);
    })
  );
});

self.addEventListener('fetch', function(event) {
  console.log(event.request.url);
  event.respondWith(
    caches.match(event.request).then(function(response) {
      return response || fetch(event.request);
    })
  );
});
EDIT:
I'm now reading up on HTML/JS localStorage.
Well, I solved this by adding the data to localStorage:
//ADD DATA
localStorage.setItem(key, JSON.stringify(value));
Then I check for an internet connection:
// CHECK INTERNET CONNECTION
const checkOnlineStatus = async () => {
  try {
    const online = await fetch("/img.gif");
    return online.status >= 200 && online.status < 300; // either true or false
  } catch (err) {
    return false; // definitely offline
  }
};

const result = await checkOnlineStatus();
result ? updateMysql() : console.log('NO INTERNET..');
In the updateMysql() function I load everything from localStorage and send it with Ajax to PHP and MySQL.
var query = JSON.parse(localStorage.getItem(key));
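A minimal sketch of how updateMysql() could look; the /api/save.php endpoint and the pendingQueries key prefix are assumptions for illustration, not part of the original code (note that the top-level await above also has to run inside an async function or an ES module):

// Hypothetical sync routine: reads every queued entry from localStorage
// and POSTs it to the server, removing each entry the server accepts.
async function updateMysql() {
  for (const key of Object.keys(localStorage)) {
    if (!key.startsWith('pendingQueries:')) continue; // only our queued items
    const query = JSON.parse(localStorage.getItem(key));
    const response = await fetch('/api/save.php', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(query)
    });
    if (response.ok) {
      localStorage.removeItem(key); // clear the item once the server has it
    }
  }
}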
I am using the code below to purge Workbox-created caches, but it also deletes the precache, which is managed by Workbox itself.
Please let me know if a better way exists.
// Clean up caches in activate event to ensure no pages are using the old caches.
self.addEventListener('activate', (event) => {
  const promiseChain = caches.keys()
    .then((cacheNames) => {
      // Step through each cache name and delete it
      return Promise.all(
        cacheNames.map((cacheName) => caches.delete(cacheName))
      );
    });

  // Keep the service worker alive until all caches are deleted.
  event.waitUntil(promiseChain);
});
The piece of code below works fine to delete other caches while keeping the precache in a Workbox service worker.
// Clear old caches
var clearOldCaches = function (event) {
  event.waitUntil(
    caches.keys().then(function (cacheNames) {
      let validCacheSet = new Set(Object.values(workbox.core.cacheNames));
      return Promise.all(
        cacheNames
          .filter(function (cacheName) {
            return !validCacheSet.has(cacheName);
          })
          .map(function (cacheName) {
            return caches.delete(cacheName);
          })
      );
    })
  );
};

self.addEventListener("activate", function (event) {
  clearOldCaches(event);
});
I am implementing PWA in my project. I have set up serviceworker.js, and I am using workbox.js for cache routing and strategies.
1- I add the offline page to the cache in the install event, when a user first visits the site:
/**
 * Add on install
 */
self.addEventListener('install', (event) => {
  const urls = ['/offline/'];
  const cacheName = workbox.core.cacheNames.runtime;
  event.waitUntil(caches.open(cacheName).then((cache) => cache.addAll(urls)));
});
2- Catch & cache pages with a specific regex, like these:
https://website.com/posts/the-first-post
https://website.com/posts/
https://website.com/articles/
workbox.routing.registerRoute(
  new RegExp('/posts|/articles'),
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'pages-cache'
  })
);
3- Catch errors and display the offline page when there's no internet connection.
/**
 * Handling Offline Page fallback
 */
this.addEventListener('fetch', event => {
  if (event.request.mode === 'navigate' || (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html'))) {
    event.respondWith(
      fetch(event.request.url).catch(error => {
        // Return the offline page
        return caches.match('/offline/');
      })
    );
  } else {
    // Respond with everything else if we can
    event.respondWith(caches.match(event.request)
      .then(function (response) {
        return response || fetch(event.request);
      })
    );
  }
});
Now this is working for me so far if I visit, for example, https://website.com/contact-us/, but if I visit any URL within the scope I defined earlier for "pages-cache", like https://website.com/articles/231/, the /offline page is not returned, since that page is not in the user's cache, and I get a regular browser error.
There's an issue in how errors are handled when a specific caching route is registered with Workbox.
Is this the best method to apply for an offline fallback? How can I catch errors from the '/articles' and '/posts' paths and display an offline page?
Please refer as well to this answer, where there's a different approach to applying the fallback with Workbox; I tried it as well, with the same results. I'm not sure which is the accurate approach for this.
I found a way to do it right with Workbox.
For each route I add a fallback method like this:
const offlinePage = '/offline/';

/**
 * Pages to cache
 */
workbox.routing.registerRoute(/\/posts.|\/articles/,
  async ({event}) => {
    try {
      return await workbox.strategies.staleWhileRevalidate({
        cacheName: 'cache-pages'
      }).handle({event});
    } catch (error) {
      return caches.match(offlinePage);
    }
  }
);
In case of using the network-first strategy, this is the method:
/**
 * Pages to cache (networkFirst)
 */
var networkFirst = workbox.strategies.networkFirst({
  cacheName: 'cache-pages'
});

const customHandler = async (args) => {
  try {
    const response = await networkFirst.handle(args);
    return response || await caches.match(offlinePage);
  } catch (error) {
    return await caches.match(offlinePage);
  }
};

workbox.routing.registerRoute(
  /\/posts.|\/articles/,
  customHandler
);
More details in the Workbox documentation: Provide a fallback response to a route.
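That documentation also covers a global catch handler, which avoids wrapping every route individually. A minimal sketch, assuming the same precached /offline/ page (the navigation check is an assumption to keep non-page requests failing normally):

// A catch handler runs whenever a registered route's handler
// throws, so one fallback can cover all routes at once.
workbox.routing.setCatchHandler(({event}) => {
  if (event.request.mode === 'navigate') {
    return caches.match('/offline/');
  }
  return Response.error();
});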
I have a service worker for caching images. This service worker is only registered within the frontend template, but it still keeps spreading into my admin template.
This causes my forms to behave unpredictably, as the validation tokens are impacted by it.
With some console.log calls I figured out that the install event is triggered before getting to the requested page, but I'm unable to determine the current/next URL there.
How can I prevent the service worker from spreading to the admin panel and interfering with the pages? I just want assets to be cached.
This is my service worker, as far as that is relevant:
const PRECACHE = 'precache-v1.0.0';
const RUNTIME = 'runtime';

// A list of local resources we always want to be cached.
const PRECACHE_URLS = [
  "public",
  "media",
  "unify",
];

importScripts('./cache-polyfill.js');

// The install handler takes care of precaching the resources we always need.
self.addEventListener('install', function(event) {
  console.log('installing resources');
  event.waitUntil(
    caches.open(PRECACHE)
      //.then(cache => cache.addAll(PRECACHE_URLS))
      .then(self.skipWaiting())
  );
});

// The activate handler takes care of cleaning up old caches.
self.addEventListener('activate', function(event) {
  const currentCaches = [PRECACHE, RUNTIME];
  event.waitUntil(
    caches.keys().then(cacheNames => {
      return cacheNames.filter(cacheName => !currentCaches.includes(cacheName));
    }).then(cachesToDelete => {
      return Promise.all(cachesToDelete.map(cacheToDelete => {
        return caches.delete(cacheToDelete);
      }));
    }).then(() => self.clients.claim())
  );
});

// The fetch handler serves responses for same-origin resources from a cache.
// If no response is found, it populates the runtime cache with the response
// from the network before returning it to the page.
self.addEventListener('fetch', event => {
  // Skip cross-origin requests, like those for Google Analytics.
  if (event.request.method === "GET") {
    if (event.request.url.indexOf(PRECACHE_URLS) > -1) {
      console.log("fetching " + event.request.url + " by the service worker");
      event.respondWith(
        caches.match(event.request).then(cachedResponse => {
          if (cachedResponse) {
            return cachedResponse;
          }
          return caches.open(RUNTIME).then(cache => {
            return fetch(event.request).then(response => {
              // Put a copy of the response in the runtime cache.
              return cache.put(event.request, response.clone()).then(() => {
                console.log('cached: ' + event.request.url);
                return response;
              });
            });
          });
        })
      );
    } else {
      console.log("fetching " + event.request.url + " by service worker blocked, it's not a resource");
    }
  }
  return fetch(event.request);
});
The problem is most likely that your admin pages lie inside the SW's scope. This means that your SW controls, e.g., everything under /, while your admin pages are located under /admin/ or similar.
You can prevent this behaviour by checking the fetch requests your SW intercepts. Something like:

if (event.request.url.match('^.*(\/admin\/).*$')) {
  return;
}

This should be the first thing in the SW's fetch listener. It checks whether the request was for something from the admin pages, and bails out if it was. Otherwise, it continues normally.
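In context, the check could sit at the top of the fetch listener from the question. A minimal sketch (the regex is taken from above; the rest of the handler stays as it was):

self.addEventListener('fetch', event => {
  // Bail out before any caching logic so admin pages are never
  // intercepted; the browser then handles the request normally.
  if (event.request.url.match('^.*(\/admin\/).*$')) {
    return;
  }
  if (event.request.method === "GET") {
    // ... existing caching logic from the question ...
  }
});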
I have an app with a basic 'shell' of HTML, CSS and JS. The main content of the page is loaded via multiple Ajax calls to an API that is at a different URL from the one my app is running on. I have set up a service worker to cache the main 'shell' of the application:
var urlsToCache = [
  '/',
  'styles/main.css',
  'scripts/app.js',
  'scripts/apiService.js',
  'third_party/handlebars.min.js',
  'third_party/handlebars-intl.min.js'
];
and to respond with the cached version when requested. The problem I am having is that the responses to my Ajax calls are also being cached. I'm pretty sure that I need to add some code to the fetch event of the service worker so that it always gets them from the network rather than looking in the cache.
Here is my fetch event:
self.addEventListener('fetch', function (event) {
  // ignore anything other than GET requests
  var request = event.request;
  if (request.method !== 'GET') {
    event.respondWith(fetch(request));
    return;
  }

  // handle other requests
  event.respondWith(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.match(event.request).then(function (response) {
        return response || fetch(event.request).then(function (response) {
          cache.put(event.request, response.clone());
          return response;
        });
      });
    })
  );
});
I'm not sure how I can ignore the requests to the API. I've tried doing this:

if (request.url.indexOf(myAPIUrl) !== -1) {
  event.respondWith(fetch(request));
  return;
}

but according to the Network tab in Chrome DevTools, all of these responses are still coming from the service worker.
You do not have to use event.respondWith(fetch(request)) to handle requests that you want to ignore. If you return without calling event.respondWith, the browser will fetch the resource for you.
You can do something like:

if (request.method !== 'GET') { return; }
if (request.url.indexOf(myAPIUrl) !== -1) { return; }

// handle all other requests
event.respondWith(/* return promise here */);

In other words, as long as you can determine synchronously that you don't want to handle the request, you can just return from the handler and let the default request processing take over. Check out this example.
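Put together with the handler from the question, the whole listener could look like this. A minimal sketch, assuming myAPIUrl and CACHE_NAME are defined as in the original code:

self.addEventListener('fetch', function (event) {
  var request = event.request;

  // Returning early (without respondWith) hands the request back
  // to the browser's default network handling.
  if (request.method !== 'GET') { return; }
  if (request.url.indexOf(myAPIUrl) !== -1) { return; }

  // Everything else is served cache-first from CACHE_NAME.
  event.respondWith(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.match(request).then(function (response) {
        return response || fetch(request).then(function (networkResponse) {
          cache.put(request, networkResponse.clone());
          return networkResponse;
        });
      });
    })
  );
});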