Disable relayjs garbage collection

Is there a way to disable the relayjs garbage collection (version 5.0.0 or 6.0.0)?
We are still using Relay Classic, which caches all data for the session. This makes loading previously visited pages fast while fresh data is fetched. Relay 5.0.0 adds a dataFrom prop on the QueryRenderer that can be set to "STORE_THEN_NETWORK", which tries the Relay cache store first and then fetches from the network, just like Relay Classic. Except that the newer versions of Relay use a garbage collection feature to remove data that is not currently in use, so almost every page ends up fetching its data from the network anyway.

I managed to get this working. The key is environment.retain(operation.root), which retains the objects in the cache.
Then, in the QueryRenderer, use fetchPolicy="store-and-network".
See my full Relay Environment file below.
import {Environment, Network, RecordSource, Store} from 'relay-runtime';

function fetchQuery(operation, variables) {
  const environment = RelayEnvironment.getInstance();
  environment.retain(operation.root);
  return fetch(process.env.GRAPHQL_ENDPOINT, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json'
    },
    credentials: 'include',
    body: JSON.stringify({
      query: operation.text,
      variables
    })
  }).then(response => {
    return response.json();
  });
}

const RelayEnvironment = (function() {
  let instance;

  function createInstance() {
    return new Environment({
      network: Network.create(fetchQuery),
      store: new Store(new RecordSource())
    });
  }

  return {
    getInstance: function() {
      if (!instance) {
        instance = createInstance();
      }
      return instance;
    }
  };
})();

export default RelayEnvironment;
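The wrapper above is just the lazy-initialization singleton pattern. Stripped of the Relay specifics (the names below are illustrative, not part of any Relay API), the idea is:

```javascript
// Lazy singleton: `create` runs once, on the first getInstance() call,
// and every later call returns the same object.
function singleton(create) {
  let instance;
  return {
    getInstance() {
      if (instance === undefined) {
        instance = create();
      }
      return instance;
    },
  };
}

// The factory is only invoked once, no matter how often we ask.
let calls = 0;
const holder = singleton(() => ({id: ++calls}));
const a = holder.getInstance();
const b = holder.getInstance();
console.log(a === b, calls); // true 1
```

This matters here because fetchQuery calls RelayEnvironment.getInstance() while the environment is in use; without the singleton, each call would create a fresh store and the retained records would be lost.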
Also got this from the Relay Slack channel; I haven't tried it yet:
const store = new Store(new RecordSource());
(store as any).holdGC(); // Disable GC on the store.

Related

How to prepare and hydrate Relay state?

I want to fetch queries server-side and then hydrate the relayjs Store when the client-side application loads. The Relay documentation currently does not describe the building blocks required to build and hydrate its state.
We need three things:
create a Relay store
populate the Relay store with the desired state
serialize the Relay store and pass it to the client
A Relay store is managed by a Relay Environment. This means that in order to populate the Relay store with queries, we have to execute them in the context of a Relay Environment, e.g.
import { Environment, Network, RecordSource, Store } from 'relay-runtime';

export const createRelayEnvironment = (apiUrl: string): Environment => {
  const recordSource = new RecordSource();
  const store = new Store(recordSource);
  const network = Network.create((operation, variables) => {
    return fetch(apiUrl, {
      body: JSON.stringify({
        query: operation.text,
        variables,
      }),
      headers: {
        'Content-Type': 'application/json',
      },
      method: 'POST',
    }).then((response) => response.json());
  });

  return new Environment({
    handlerProvider: null,
    network,
    store,
  });
};
Then all you need to do is run a query and serialize the Relay Store, e.g.
const relay = createRelayEnvironment('https://contra.com/api/');

const appPreloadResponse = await fetchQuery<AppPreloadQuery>(
  relay,
  appPreloadQuery,
  {},
  {
    fetchPolicy: 'store-or-network',
  }
).toPromise();

const htmlHead = `
  <script>
    window.RELAY_RECORD_MAP = ${stringify(relay.getStore().getSource().toJSON())};
  </script>
`;
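One subtlety with embedding the record map in an inline `<script>` tag: plain JSON.stringify output can break out of the tag if any record contains the string `</script>` (which is presumably why a dedicated stringify helper is used above). A minimal, illustrative escaping helper looks like:

```javascript
// Serialize a record map for embedding inside an inline <script> tag.
// Escaping "<" as \u003c prevents a value like "</script>" from
// terminating the tag early; the escape is still valid inside a JS
// string literal, so the browser parses it back to "<".
function serializeRecordMap(recordMap) {
  return JSON.stringify(recordMap).replace(/</g, '\\u003c');
}

const html = `<script>window.RELAY_RECORD_MAP = ${serializeRecordMap({
  'client:root': {__id: 'client:root', __typename: '__Root'},
})};</script>`;
console.log(html.split('</script>').length); // 2: only the real closing tag remains
```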
On the client side, you simply need to create the Relay Environment using the earlier serialized state, e.g.
export const RelayEnvironment = new Environment({
  network: Network.create(fetchQuery, subscriptionHandler),
  store: new Store(
    new RecordSource(window.RELAY_RECORD_MAP),
  ),
});
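One small guard worth adding: window.RELAY_RECORD_MAP may be absent (e.g. when a page is rendered client-side only), so falling back to an empty record map avoids a crash. A sketch, with an illustrative helper name:

```javascript
// Pick up the server-serialized records if present, otherwise start
// with an empty record map; the result is what you would pass to
// `new RecordSource(...)`.
function getBootstrapRecords(globalObject) {
  return (globalObject && globalObject.RELAY_RECORD_MAP) || {};
}

console.log(getBootstrapRecords({RELAY_RECORD_MAP: {'client:root': {}}}));
console.log(getBootstrapRecords(undefined)); // {}
```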
And that's it!

Save API response in cache with Workbox

I would like to cache some data that comes from an API so that, if I lose the connection, that data is still shown.
I have a list of Work Parts. If I go offline I would like to keep seeing those parts, but when I re-enter the component it calls the API again and, since there is no connection, the list is left blank. The idea is to serve them from the cache instead.
import {precacheAndRoute} from 'workbox-precaching';
import {clientsClaim, skipWaiting} from 'workbox-core';
import {registerRoute} from 'workbox-routing';
import {CacheFirst, NetworkFirst, NetworkOnly, StaleWhileRevalidate} from 'workbox-strategies';
import {CacheableResponsePlugin} from 'workbox-cacheable-response';
import {BackgroundSyncPlugin} from 'workbox-background-sync';
import {Queue} from 'workbox-background-sync';

declare const self: ServiceWorkerGlobalScope;

skipWaiting();
clientsClaim();

const queue = new Queue('cola');

const bgSyncPlugin = new BackgroundSyncPlugin('api-cola', {
  onSync: async ({queue}) => {
    let entry;
    while ((entry = await queue.shiftRequest())) {
      try {
        let clone = entry.request.clone();
        await fetch(clone);
        console.log('Replay successful for request', entry.request);
      } catch (error) {
        console.error('Replay failed for request', entry.request, error);
        // Put the entry back in the queue and re-throw the error:
        await queue.unshiftRequest(entry);
        throw error;
      }
    }
    console.log('Replay complete!');
  }
});

registerRoute(
  /\/api\/.*\/*.php/,
  new NetworkOnly({
    plugins: [bgSyncPlugin]
  }),
  'POST'
);

registerRoute(
  ({url}) => url.origin === 'https://xxx.xxx.com' &&
    url.pathname.startsWith('/api/'),
  new CacheFirst({
    cacheName: 'api-cache',
    plugins: [
      new CacheableResponsePlugin({
        statuses: [0, 200, 404],
      })
    ]
  })
);

registerRoute(
  /assets\/images\/icons\/icon-.+\.png$/,
  new CacheFirst({
    cacheName: 'icons'
  })
);

precacheAndRoute(self.__WB_MANIFEST);
When you go offline and the connection comes back, a sync is performed, and this works fine.
I noticed that you are using runtime caching, which means the app or the user would have to call the API at some point before going offline for the resources to be available in the cache. If you know the URLs beforehand, warming the strategy's cache (e.g. warmStrategyCache from workbox-recipes) might work for you.
I also noticed that you are caching responses even when they return a 404 status code, so those would not be displayed correctly either.

How to store a list of queries and send them to MySQL when online?

I'm trying to build a webapp that will work offline.
I found the JS Service Worker API and have now implemented it in my app to cache some static pages.
Now I'd like to build an HTML form where the user fills in data that is saved to the cache if the user is offline, but sent directly to MySQL when the user is online.
It would be like a list of queries that execute when the user comes online.
How can I save a query string to the cache, and then check if online and send it with Ajax to PHP and MySQL?
First off, how do I save a query string to the cache?
Second, how do I find out when I'm online and then fetch the query string from the cache?
This is what I have to cache my pages:
importScripts('cache-polyfill.js');

self.addEventListener('install', function(e) {
  e.waitUntil(
    caches.open('offlineList').then(function(cache) {
      return cache.addAll([
        '/app/offline_content/'
      ]);
    })
  );
});

self.addEventListener('fetch', function(event) {
  console.log(event.request.url);
  event.respondWith(
    caches.match(event.request).then(function(response) {
      return response || fetch(event.request);
    })
  );
});
EDIT
I'm now reading up on HTML/JS localStorage.
Well, I solved this by adding the data to localStorage:
// ADD DATA
localStorage.setItem(key, JSON.stringify(value));
Then I check for an internet connection:
// CHECK INTERNET CONNECTION
const checkOnlineStatus = async () => {
  try {
    const online = await fetch('/img.gif');
    return online.status >= 200 && online.status < 300; // either true or false
  } catch (err) {
    return false; // definitely offline
  }
};

const result = await checkOnlineStatus();
result ? updateMysql() : console.log('NO INTERNET..');
In the updateMysql() function I load everything from localStorage and send it with Ajax to PHP and MySQL:
var query = JSON.parse(localStorage.getItem(key));
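The updateMysql() flow can be factored so the queue-flushing logic is testable outside the browser. In the real app, storage would be window.localStorage and send would be the Ajax/fetch POST to the PHP endpoint; both names here are illustrative. Removing each item only after a successful send keeps it queued for retry if the connection drops mid-flush:

```javascript
// Flush every queued query: read it, send it, and only then remove it.
// `storage` implements the localStorage interface (length, key, getItem,
// removeItem); `send` returns a promise that rejects when offline.
async function flushQueue(storage, send) {
  const keys = [];
  for (let i = 0; i < storage.length; i++) {
    keys.push(storage.key(i));
  }
  const sent = [];
  for (const key of keys) {
    const query = JSON.parse(storage.getItem(key));
    await send(key, query); // throws if the POST fails -> item stays queued
    storage.removeItem(key); // safe to drop only after a successful send
    sent.push(key);
  }
  return sent;
}
```

Collecting the keys up front also avoids the index-shifting problem that removeItem would otherwise cause while iterating storage.length.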

Falcor Router should return the value from external API

I am new to JavaScript frameworks and am currently trying to set up a Falcor router calling an external API (for now, consider it an Express API app + MongoDB, hosted on port 3000).
I am able to use the request package (commented-out lines) and successfully call the Express API app (which returns obj.rating = 4), but I am unable to send this value from the Falcor router instead of the hard-coded value "5".
Below is the falcor-router's server.js code:
app.use('/rating.json', falcorExpress.dataSourceRoute(function (req, res) {
  return new Router([
    {
      route: "rating",
      get: function() {
        var obj;
        // request('http://localhost:3000/rating/101', function (error, response, body) {
        //   obj = JSON.parse(body);
        //   console.log('rating:', obj.rating); // obj.rating = 4
        // });
        return {path: ["rating"], value: "5"};
      }
    }
  ]);
}));
Below is the code for index.html:
<script>
  function showRating() {
    var model = new falcor.Model({source: new falcor.HttpDataSource('http://localhost/rating.json')});
    model.
      get("rating").
      then(function(response) {
        document.getElementById('filmRating').innerText = JSON.stringify(response.json, null, 4);
      });
  }
</script>
I also tried global variable declarations, synchronizing the HTTP request calls, promises, then statements, etc., but nothing seemed to work. Clearly I am missing something here; I'm not sure what.
The router's get handler expects the return value to be a promise or an observable that resolves to a pathValue. The request call is asynchronous, so the value produced inside its callback is discarded. To get your request against the API to work, return a promise that resolves to a pathValue, e.g.
return new Router([
  {
    route: "rating",
    get: function() {
      return new Promise(function (resolve, reject) {
        request('http://localhost:3000/rating/101', function (error, response, body) {
          if (error) {
            return reject(error);
          }
          resolve({path: ["rating"], value: JSON.parse(body).rating});
        });
      });
    }
  }
]);
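The same wrapping generalizes to any Node-style callback API. Node's util.promisify covers the common single-result case; a hand-rolled variant that also keeps request's extra (response, body) arguments might look like this (fakeRequest below is an illustrative stand-in, not the real request package):

```javascript
// Wrap a function that takes a trailing (error, ...results) callback so
// it returns a promise instead. With multiple results (as with
// request's (error, response, body)) it resolves to an array.
function promisify(fn) {
  return (...args) =>
    new Promise((resolve, reject) => {
      fn(...args, (error, ...results) => {
        if (error) {
          reject(error);
        } else {
          resolve(results.length > 1 ? results : results[0]);
        }
      });
    });
}

// Illustrative stand-in for request():
const fakeRequest = (url, callback) =>
  callback(null, {statusCode: 200}, '{"rating": 4}');

promisify(fakeRequest)('http://localhost:3000/rating/101')
  .then(([response, body]) => console.log(JSON.parse(body).rating)); // 4
```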

Service Worker, double caching?

I'm having trouble with my Service Worker. I have implemented it with the cache-then-network technique, where content is first served from the cache while a network fetch is always performed and its result is cached on success (inspired by this solution on CSS-Tricks).
When I make changes to my web app and hit refresh, I of course get the old content the first time. But on subsequent refreshes the content alternates between old and new: I can get new or old content five times in a row, or it can differ on each request.
I have been debugging the Service Worker for a while now and am not getting any wiser. Does anyone have an idea what's wrong with the implementation?
EDIT:
var version = 'v1::2';

self.addEventListener("install", function (event) {
  event.waitUntil(
    caches
      .open(version + 'fundamentals')
      .then(function (cache) {
        return cache.addAll([
          "/"
        ]);
      })
  );
});

self.addEventListener("fetch", function (event) {
  if (event.request.method !== 'GET') {
    return;
  }
  event.respondWith(
    caches
      .match(event.request)
      .then(function (cached) {
        var networked = fetch(event.request)
          .then(fetchedFromNetwork, unableToResolve)
          .catch(unableToResolve);
        return cached || networked;

        function fetchedFromNetwork(response) {
          var cacheCopy = response.clone();
          caches
            .open(version + 'pages')
            .then(function add(cache) {
              cache.put(event.request, cacheCopy);
            });
          return response;
        }

        function unableToResolve() {
          return new Response('<h1>Service Unavailable</h1>', {
            status: 503,
            statusText: 'Service Unavailable',
            headers: new Headers({
              'Content-Type': 'text/html'
            })
          });
        }
      })
  );
});

self.addEventListener("activate", function (event) {
  event.waitUntil(
    caches
      .keys()
      .then(function (keys) {
        return Promise.all(
          keys
            .filter(function (key) {
              return !key.startsWith(version);
            })
            .map(function (key) {
              return caches.delete(key);
            })
        );
      })
  );
});
I don't see how you are setting the version, but I presume multiple caches still exist (I can see you are trying to delete the previous caches, but still). caches.match() is a convenience method that searches across all caches, and the order is not guaranteed to be what you expect (Chrome seems to query the oldest one first). Chrome Developer Tools shows you the existing caches (Application > Cache > Cache Storage) and their contents. If you want to query a specific cache, you'll need to do:
caches.open(currentCacheName).then(function(cache) { /* cache.match(...) */ });
as in the example in the Cache documentation.
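The activate handler's cleanup can also be factored into a pure helper, which makes the version-prefix matching easy to check in isolation (the function name is illustrative):

```javascript
// Given all existing cache names and the current version prefix, return
// the stale names the activate handler should delete.
function staleCacheKeys(keys, version) {
  return keys.filter((key) => !key.startsWith(version));
}

// In the service worker the result feeds caches.delete():
// event.waitUntil(caches.keys().then((keys) =>
//   Promise.all(staleCacheKeys(keys, version).map((key) => caches.delete(key)))));

console.log(staleCacheKeys(['v1::2pages', 'v1::1pages', 'v1::2fundamentals'], 'v1::2'));
// ['v1::1pages']
```

Checking this logic directly helps confirm whether old caches are really being deleted, or whether caches.match() is still finding stale entries from a previous version.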
