Save API response in cache with Workbox (service worker)

I would like to cache some data that comes from an API so that, if I lose the connection, that data is still shown.
I have a list of Work Parts, so if I go offline I would like to keep seeing those parts. As it is now, when I enter the component again it calls the API and fetches them again, and since there is no connection the list is left blank. This is where I would like to serve them from the cache instead.
import {precacheAndRoute} from 'workbox-precaching';
import {clientsClaim, skipWaiting} from 'workbox-core';
import {registerRoute} from 'workbox-routing';
import {CacheFirst, NetworkFirst, NetworkOnly, StaleWhileRevalidate} from 'workbox-strategies';
import {CacheableResponsePlugin} from 'workbox-cacheable-response';
import {BackgroundSyncPlugin} from 'workbox-background-sync';

declare const self: ServiceWorkerGlobalScope;

skipWaiting();
clientsClaim();

const bgSyncPlugin = new BackgroundSyncPlugin('api-cola', {
  onSync: async ({queue}) => {
    let entry;
    while ((entry = await queue.shiftRequest())) {
      try {
        const clone = entry.request.clone();
        await fetch(clone);
        console.log('Replay successful for request', entry.request);
      } catch (error) {
        console.error('Replay failed for request', entry.request, error);
        // Put the entry back in the queue and re-throw the error:
        await queue.unshiftRequest(entry);
        throw error;
      }
    }
    console.log('Replay complete!');
  }
});

registerRoute(
  /\/api\/.*\/*.php/,
  new NetworkOnly({
    plugins: [bgSyncPlugin]
  }),
  'POST'
);

registerRoute(
  ({url}) => url.origin === 'https://xxx.xxx.com' &&
    url.pathname.startsWith('/api/'),
  new CacheFirst({
    cacheName: 'api-cache',
    plugins: [
      new CacheableResponsePlugin({
        statuses: [0, 200, 404],
      })
    ]
  })
);

registerRoute(
  /assets\/images\/icons\/icon-.+\.png$/,
  new CacheFirst({
    cacheName: 'icons'
  })
);

precacheAndRoute(self.__WB_MANIFEST);
When you go offline and the connection then returns, a sync is performed, and this works fine.

I noticed that you are using runtime caching, which means the app or the user would have to make the calls to the API at some point before going offline for those resources to be available in the cache. If you know the URLs beforehand, warming the strategy's cache might work for you.
I also noticed that you are caching responses even when they come back with a 404 status, so those would not be displayed correctly either.
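As a minimal sketch, assuming the workbox-recipes package is available and using a placeholder endpoint (/api/parts.php is hypothetical), warming the cache and dropping 404 from the cacheable statuses could look like this:

import {registerRoute} from 'workbox-routing';
import {StaleWhileRevalidate} from 'workbox-strategies';
import {CacheableResponsePlugin} from 'workbox-cacheable-response';
import {warmStrategyCache} from 'workbox-recipes';

const apiStrategy = new StaleWhileRevalidate({
  cacheName: 'api-cache',
  plugins: [
    // Cache only opaque (0) and successful (200) responses; a cached 404
    // would otherwise be replayed to the user while offline.
    new CacheableResponsePlugin({statuses: [0, 200]}),
  ],
});

// Fetch and cache the known endpoints during the service worker's
// install phase, so the data is available even if the user never
// opened the list while online.
warmStrategyCache({
  urls: ['https://xxx.xxx.com/api/parts.php'], // hypothetical endpoint
  strategy: apiStrategy,
});

registerRoute(
  ({url}) => url.origin === 'https://xxx.xxx.com' &&
    url.pathname.startsWith('/api/'),
  apiStrategy
);

A StaleWhileRevalidate strategy also fits this case better than CacheFirst: the cached copy is served immediately (including offline), while a background fetch keeps it fresh whenever the network is available.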

Related

Disable relayjs garbage collection

Is there a way to disable the relayjs garbage collection (version 5.0.0 or 6.0.0)?
We are still using Relay Classic, and it caches all data in a session. This makes loading previous pages fast while fetching new data. Relay 5.0.0 has a dataFrom prop on the QueryRenderer that can be set to "STORE_THEN_NETWORK", which will try the Relay cache store first and then fetch from the network, just like Relay Classic. Except that the newer versions of Relay use a garbage-collection feature to remove data that is not currently used, which makes almost all pages fetch data from the network.
I managed to get this working. The key thing here is environment.retain(operation.root), which retains the objects in the cache.
Then in the QueryRenderer use fetchPolicy="store-and-network".
See my full Relay Environment file below.
import {Environment, Network, RecordSource, Store} from 'relay-runtime';

function fetchQuery(operation, variables) {
  const environment = RelayEnvironment.getInstance();
  environment.retain(operation.root);
  return fetch(process.env.GRAPHQL_ENDPOINT, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json'
    },
    credentials: 'include',
    body: JSON.stringify({
      query: operation.text,
      variables
    })
  }).then(response => response.json());
}

const RelayEnvironment = (function() {
  let instance;

  function createInstance() {
    return new Environment({
      network: Network.create(fetchQuery),
      store: new Store(new RecordSource())
    });
  }

  return {
    getInstance: function() {
      if (!instance) {
        instance = createInstance();
      }
      return instance;
    }
  };
})();

export default RelayEnvironment;
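For completeness, a minimal sketch of the consuming side (the query and component names are placeholders, not from the original project), wiring the singleton environment into a QueryRenderer with fetchPolicy="store-and-network":

import React from 'react';
import {QueryRenderer, graphql} from 'react-relay';
import RelayEnvironment from './RelayEnvironment';

// Any query fetched through this environment is retained via
// environment.retain() in fetchQuery above, so its records survive
// Relay's garbage collection between page visits.
export default function WidgetList() {
  return (
    <QueryRenderer
      environment={RelayEnvironment.getInstance()}
      query={graphql`
        query WidgetListQuery {
          viewer {
            id
          }
        }
      `}
      variables={{}}
      fetchPolicy="store-and-network"
      render={({error, props}) => {
        if (error) return <div>{error.message}</div>;
        if (!props) return <div>Loading…</div>;
        return <div>Viewer: {props.viewer.id}</div>;
      }}
    />
  );
}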
Also got this from the Relay Slack Channel. Haven't tried it yet.
const store = new Store(new RecordSource());
(store as any).holdGC(); // Disable GC on the store.

What's the right way to implement offline fallback with workbox

I am adding PWA support to my project. I have set up serviceworker.js, and I am using Workbox for cache routing and strategies.
1- I add the offline page to the cache on the install event, when a user first visits the site:
/**
 * Add on install
 */
self.addEventListener('install', (event) => {
  const urls = ['/offline/'];
  const cacheName = workbox.core.cacheNames.runtime;
  event.waitUntil(caches.open(cacheName).then((cache) => cache.addAll(urls)));
});
2- Catch & cache pages with a specific regex, like these:
https://website.com/posts/the-first-post
https://website.com/posts/
https://website.com/articles/
workbox.routing.registerRoute(
  new RegExp('/posts|/articles'),
  workbox.strategies.staleWhileRevalidate({
    cacheName: 'pages-cache'
  })
);
3- Catch errors and display the offline page when there's no internet connection.
/**
 * Handling Offline Page fallback
 */
self.addEventListener('fetch', event => {
  if (event.request.mode === 'navigate' ||
      (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html'))) {
    event.respondWith(
      fetch(event.request.url).catch(error => {
        // Return the offline page
        return caches.match('/offline/');
      })
    );
  } else {
    // Respond with everything else if we can
    event.respondWith(
      caches.match(event.request)
        .then(function (response) {
          return response || fetch(event.request);
        })
    );
  }
});
Now this is working for me so far if I visit, for example, https://website.com/contact-us/. But if I visit any URL within the scope I defined earlier for "pages-cache", like https://website.com/articles/231/, it does not return the /offline page, since that URL is not in the user's cache, and I get a regular browser error.
There's an issue in how errors are handled when a specific caching route is registered with Workbox.
Is this the best method for an offline fallback? How can I catch errors from the '/articles' and '/posts' paths and display an offline page?
Please refer as well to this answer, where there's a different approach to applying the fallback with Workbox; I tried it as well, with the same results. I'm not sure which is the accurate approach for this.
I found a way to do it right with workbox.
For each route I would add a fallback method like this:
const offlinePage = '/offline/';

/**
 * Pages to cache
 */
workbox.routing.registerRoute(
  /\/posts.|\/articles/,
  async ({event}) => {
    try {
      return await workbox.strategies.staleWhileRevalidate({
        cacheName: 'cache-pages'
      }).handle({event});
    } catch (error) {
      return caches.match(offlinePage);
    }
  }
);
If you are using the network-first strategy, this is the method:
/**
 * Pages to cache (networkFirst)
 */
var networkFirst = workbox.strategies.networkFirst({
  cacheName: 'cache-pages'
});

const customHandler = async (args) => {
  try {
    const response = await networkFirst.handle(args);
    return response || await caches.match(offlinePage);
  } catch (error) {
    return await caches.match(offlinePage);
  }
};

workbox.routing.registerRoute(
  /\/posts.|\/articles/,
  customHandler
);
More details at workbox documentation here: Provide a fallback response to a route
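The same documentation also describes a global catch handler, which avoids wrapping every route by hand. A minimal sketch, assuming /offline/ was already cached during the install step above:

// Runs whenever any registered route's handler throws; serve the
// offline page for failed navigations, and a network error otherwise.
workbox.routing.setCatchHandler(({event}) => {
  if (event.request.mode === 'navigate') {
    return caches.match(offlinePage);
  }
  return Response.error();
});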

workbox offline response from IDB instead of cache

I am building a Vue.js application with a service worker. I decided to use Workbox with the injectManifest method so that I could add my own routes.
On fetch, when online:
1- answer with the network
2- write the body to IDB (through localforage)
3- send back the response
Here everything works perfectly: the sw intercepts the fetch and comes back with an appropriate response, and IDB contains the right details.
Response sent back to fetch when online:
Response {type: "cors", url: "http://localhost:3005/api/events", redirected: false, status: 200, ok: true, …}
The issue is when I go offline.
My intention is to connect to localforage, retrieve the content, and build a response.
The issue is that this response is not considered appropriate by fetch, which then rejects it. console.log confirms that the .catch in the sw is running, but it looks like the response it sends is rejected.
Here is the console.log of the response I am sending back to fetch when offline:
Response {type: "default", url: "", redirected: false, status: 200, ok: true, …}
I do not know if fetch is unhappy because the url of the response is not the same as on the request, but Workbox is supposed to allow responding with responses other than the ones coming from cache or fetch.
Here is the code:
importScripts('localforage.min.js')

localforage.config({
  name: 'Asso-corse'
})

workbox.skipWaiting()
workbox.clientsClaim()

workbox.routing.registerRoute(
  new RegExp('https://fonts.(?:googleapis|gstatic).com/(.*)'),
  workbox.strategies.cacheFirst({
    cacheName: 'googleapis',
    plugins: [
      new workbox.expiration.Plugin({
        maxEntries: 30
      })
    ]
  })
)

workbox.routing.registerRoute(
  new RegExp('http://localhost:3005/api/'),
  function (event) {
    fetch(event.url)
      .then((response) => {
        var cloneRes = response.clone()
        console.log(cloneRes)
        cloneRes.json()
          .then((body) => {
            localforage.setItem(event.url.pathname, body)
          })
        return response
      })
      .catch(function (error) {
        console.warn(`Constructing a fallback response, due to an error while fetching the real response:, ${error}`)
        localforage.getItem(event.url.pathname)
          .then((res) => {
            let payload = new Response(JSON.stringify(res), {
              "status": 200,
              "statusText": "MyCustomResponse!"
            })
            console.log(payload)
            return payload
          })
      })
  }
)

workbox.precaching.precacheAndRoute(self.__precacheManifest || [])
I am really stuck here, as all the documentation on Workbox relates to leveraging the cache. I am leveraging localforage because it supports promises, which is what is required to make offline capability work.
Thanks
Your catch() handler needs to return either a Response object, or a promise for a Response object.
Adjusting the formatting of your sample code a bit, you're currently doing:
.catch(function (error) {
  console.warn(`Constructing a fallback response, due to an error while fetching the real response:, ${error}`)
  localforage.getItem(event.url.pathname).then((res) => {
    let payload = new Response(JSON.stringify(res), { "status": 200, "statusText": "MyCustomResponse!" })
    console.log(payload)
    return payload
  })
})
Based on that formatting, I think it's clearer that you're not returning either a Response or a promise for a Response from within your catch() handler—you're not returning anything at all.
Adding in a return before your localforage.getItem(...) statement should take care of that:
.catch(function (error) {
  console.warn(`Constructing a fallback response, due to an error while fetching the real response:, ${error}`)
  return localforage.getItem(event.url.pathname).then((res) => {
    let payload = new Response(JSON.stringify(res), { "status": 200, "statusText": "MyCustomResponse!" })
    console.log(payload)
    return payload
  })
})
But, as mentioned in the comments to your original question, I don't think that using IndexedDB to store this type of URL-addressable data is necessary. You can just rely on the Cache Storage API, which Workbox will happily use by default, when storing and retrieving JSON data obtained from an HTTP API.
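A minimal sketch of that cache-based alternative, using the same Workbox v3-style API as the question (the cache name is arbitrary):

// Persist API responses in Cache Storage and fall back to the last
// cached copy when the network is unavailable -- no IndexedDB needed.
workbox.routing.registerRoute(
  new RegExp('http://localhost:3005/api/'),
  workbox.strategies.networkFirst({
    cacheName: 'api-events'
  })
)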

Service Worker caching not recognizing timeout as a function

I was watching Steve Sanderson's NDC presentation on up-and-coming web features and saw his caching example as a prime candidate for an application I am developing. I couldn't find the code, so I typed it up from the YouTube video as well as I could.
Unfortunately it doesn't work in Chrome (which is also what he is using in the demo). It fails with Uncaught TypeError: fetch(...).then(...).timeout is not a function at self.addEventListener.event.
I trawled through Steve's GitHub and found no trace of this, nor could I find anything on the NDC Conference page.
// Inspiration:
// https://www.youtube.com/watch?v=MiLAE6HMr10
//self.importScripts('scripts/util.js');

console.log('Service Worker script running');

self.addEventListener('install', event => {
  console.log('WORKER: installing');
  const urlsToCache = ['/ServiceWorkerExperiment/', '/ServiceWorkerExperiment/scripts/page.js'];
  caches.delete('mycache');
  event.waitUntil(
    caches.open('mycache')
      .then(cache => cache.addAll(urlsToCache))
      .then(_ => self.skipWaiting())
  );
});

self.addEventListener('fetch', event => {
  console.log(`WORKER: Intercepted request for ${event.request.url}`);
  if (event.request.method !== 'GET') {
    return;
  }
  event.respondWith(
    fetch(event.request)
      .then(networkResponse => {
        console.log(`WORKER: Updating cached data for ${event.request.url}`);
        var responseClone = networkResponse.clone();
        caches.open('mycache').then(cache => cache.put(event.request, responseClone));
        return networkResponse;
      })
      // If network fails or is too slow, return cached data.
      // Reference for this code: https://youtu.be/MiLAE6HMr10?t=1003
      .timeout(200)
      .catch(_ => {
        console.log(`WORKER: Serving ${event.request.url} from CACHE`);
        return caches.match(event.request);
      })
  );
});
As far as I can tell from the fetch() documentation, there is no timeout function, so my assumption is that the timeout function is added in the util.js, which is never shown in the presentation. Can anyone confirm this? And does anyone have an idea of how it is implemented?
Future:
It's coming.
According to Jake Archibald's comment on whatwg/fetch the future syntax will be:
Using the abort syntax, you'll be able to do:
const controller = new AbortController();
const signal = controller.signal;
const fetchPromise = fetch(url, {signal});
// 5 second timeout:
const timeoutId = setTimeout(() => controller.abort(), 5000);
const response = await fetchPromise;
// …
If you only wanted to timeout the request, not the response, add:
clearTimeout(timeoutId);
// …
And from another comment:
Edge & Firefox are already implementing. Chrome will start shortly.
Now:
If you want to try the solution that works now, the most sensible way is to use this module.
It allows you to use syntax like:
return fetch('/path', {timeout: 500}).then(function() {
  // successful fetch
}).catch(function(error) {
  // network request failed / timeout
})
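If you'd rather not pull in a dependency, a plain fetch can be raced against a timer. A rough sketch of how such a timeout helper can be implemented (this is an assumption about how util.js might work, not Steve Sanderson's actual code):

// Rejects if the fetch doesn't settle within `ms` milliseconds; note the
// underlying request keeps running, since plain promises can't be cancelled.
function fetchWithTimeout(request, ms) {
  const timer = new Promise((resolve, reject) => {
    setTimeout(() => reject(new Error('fetch timed out')), ms);
  });
  return Promise.race([fetch(request), timer]);
}

// Usage inside the fetch handler:
// event.respondWith(
//   fetchWithTimeout(event.request, 200)
//     .catch(() => caches.match(event.request))
// );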

dart - Write body of HttpClientRequest failing?

I was trying to send data to a local server using HttpClient. However, the data is never added to the request. I'm using this code:
new HttpClient().put('127.0.0.1', 4040, '/employees/1').then((request) {
  request.cookies.add(new Cookie('DARTSESSID', sessionId)..path = '/');
  request.headers.add(HttpHeaders.ACCEPT_ENCODING, "");
  request.headers.add(HttpHeaders.CONTENT_TYPE, "text/json");
  request.write('{"id": 1, "name": "luis"}');
  print(request.contentLength);
  return request.close();
}).then(expectAsync((HttpClientResponse response) {
  expect(response.statusCode, 200);
  UTF8.decodeStream(response).then(expectAsync((body) {
    expect(body, equals('"employee: 1"'));
  }));
}));
but it always prints that request.contentLength is -1. I have already seen these links, without luck:
https://code.google.com/p/dart/issues/detail?id=13293
dart - HttpClientRequest failing on adding data
https://code.google.com/p/dart/issues/detail?id=10026
A contentLength of -1 does not mean that there is no data; it means that the length of the content is unknown and that a streaming content mode is used. For HTTP/1.1 this will usually mean chunked transfer encoding.
I've tried to insert your code in a setup including a server, but without the unittest stuff:
import 'dart:convert';
import 'dart:io';

void main() {
  HttpServer.bind('127.0.0.1', 4040).then((server) {
    server.listen((request) {
      UTF8.decodeStream(request).then((body) {
        print(body);
        request.response.close();
      });
    });
    new HttpClient().put('127.0.0.1', 4040, '/employees/1').then((request) {
      request.cookies.add(new Cookie('DARTSESSID', "1")..path = '/');
      request.headers.add(HttpHeaders.ACCEPT_ENCODING, "");
      request.headers.add(HttpHeaders.CONTENT_TYPE, "text/json");
      request.write('{"id": 1, "name": "luis"}');
      print(request.contentLength);
      return request.close();
    }).then((HttpClientResponse response) {
      UTF8.decodeStream(response).then((body) {
        print(body);
      });
    });
  });
}
When I run the code, I get
-1
{"id": 1, "name": "luis"}
as expected. Perhaps the problem you are having is on the server?
Writing to the request is an asynchronous operation. Just because contentLength still reports -1 doesn't mean the data isn't added to the request before it is sent to the server.
Also: the content length is not supposed to update whenever you add new data; it is the value that is sent to the server, and -1 means the size isn't known yet.
I'm not sure whether the library updates it automatically when it does know the size, but it doesn't need to.
