ServiceWorker does not intercept script request from dynamically loaded page - service-worker

I have implemented a service worker in a .NET Core web application. The app is a single page and loads other pages via jQuery's load function.
The main page has a cache containing all the resources the application needs to run completely offline.
So far so good; the worker's scope is set to the / level. Everything works until I notice that scripts on dynamically loaded pages (partial views) are not served by the worker, even though they are in the cache. The service worker does serve them on the main page.
This gives me the impression that whatever the scope of the service worker is, it will not serve the cache on dynamically loaded partial pages, only on the page where it was activated.
How can I make the main page serve the cache for those dynamically loaded partial views? How does a service worker decide when to serve a resource from the cache? If the scope is set to the root, why doesn't it serve the scripts?
I have read many documents about this, but nowhere have I read that regardless of its scope, a service worker will only serve the page it was activated on, even if you load pages dynamically without a real navigation.
EDIT:
Found it. Hopefully this saves someone some time and a headache: the partial page is loaded via jQuery's load function, which parses the returned HTML and requests its scripts synchronously, and the service worker's fetch event does not intercept synchronous requests. Now I need to find a workaround.
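One possible workaround is to stop relying on jQuery's script handling altogether: fetch the partial asynchronously, inject the markup, and re-append the external scripts yourself, so every request is an ordinary asynchronous fetch the service worker can intercept. This is a sketch under the assumption that the partials are plain HTML fragments containing external `<script src>` tags:

```javascript
// Pull the src attributes of external scripts out of an HTML fragment.
// A real implementation could use DOMParser; a regex keeps this sketch small.
function extractScriptSrcs(html) {
  const srcs = [];
  const re = /<script[^>]*\bsrc=["']([^"']+)["'][^>]*>\s*<\/script>/gi;
  let m;
  while ((m = re.exec(html)) !== null) srcs.push(m[1]);
  return srcs;
}

// Remove the external script tags so they are not executed twice.
function stripScriptTags(html) {
  return html.replace(/<script[^>]*\bsrc=["'][^"']+["'][^>]*>\s*<\/script>/gi, "");
}

// Load a partial view into a container (browser-only; assumes url returns HTML).
async function loadPartial(url, container) {
  const res = await fetch(url);          // asynchronous: goes through the worker
  const html = await res.text();
  container.innerHTML = stripScriptTags(html);
  for (const src of extractScriptSrcs(html)) {
    const s = document.createElement("script");
    s.src = src;                         // async script request: also intercepted
    container.appendChild(s);
  }
}
```

Instead of `$("#main").load(url)` you would call `loadPartial(url, document.getElementById("main"))`; the re-appended scripts execute in document order once they load.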


Using Service Worker - MVC view that uses fetched data when online

I am using the Service Worker example on this page. I have implemented the service worker (ASP.NET MVC application running within an SSL localhost site) and I receive confirmation (console message) that it is registered.
Within the URLs to cache I have added:
- an index page (MyController/Index) which performs a "fetch" (part of the Fetch API)
- the URL that is fetched (MyController/GetData), which gets the data from the database
All looks OK when I am online - i.e. all pages I have specified in the URLs to cache are all successfully retrieved (as per the Chrome|DevTools|Network tab).
When I go offline (via the Chrome|DevTools|Application|Service-Workers|Offline checkbox) and go to pages that I have not listed in the URLs to cache, I get the "Offline" page I have specified in the service worker, which is correct. However, when I navigate to the Index page (MyController/Index, mentioned above), which I have listed in the URLs to cache, the view appears, but the "fetch" (to MyController/GetData) on that page fails. I wondered what the expected result of this is.
I was under the assumption that the data (retrieved through MyController/GetData) would be cached, and that when I go offline the cached data would be substituted if the fetch failed (i.e. the NetworkFirst strategy).
Can anyone point me in the direction of what should occur, and ideally an MVC example?
Justin was right: my MyController/GetData method was actually using a POST, and my fetch handler always fetched non-GET requests from the network, hence the failure when offline. I changed the request to a GET and my service worker's fallback logic got the data from the cache.
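The pattern behind that fix can be sketched as a network-first handler that only applies the cache fallback to GET requests (the Cache API cannot store non-GET requests anyway). The cache name and the wiring are illustrative, not the asker's actual worker:

```javascript
const CACHE_NAME = "app-cache-v1"; // assumed name

// Only GET requests are eligible for caching.
function isCacheable(request) {
  return request.method === "GET";
}

// Network-first for GETs, network-only for everything else.
async function networkFirst(request, cache) {
  try {
    const response = await fetch(request);
    if (isCacheable(request)) cache.put(request, response.clone());
    return response;
  } catch (err) {
    if (isCacheable(request)) {
      const cached = await cache.match(request);
      if (cached) return cached; // offline fallback
    }
    throw err; // offline and nothing cached
  }
}

// Wire it up only when running inside a real service worker context.
if (typeof self !== "undefined" && typeof self.addEventListener === "function") {
  self.addEventListener("fetch", (event) => {
    event.respondWith(
      caches.open(CACHE_NAME).then((cache) => networkFirst(event.request, cache))
    );
  });
}
```

With a handler shaped like this, a POST simply fails offline by design; switching the data endpoint to GET is what makes the fallback branch reachable.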

How does Chrome load `vue.min.js` from ServiceWorker

When I visit https://vuejs.org/js/vue.min.js, I find it is loaded from a ServiceWorker. How does this work?
Screenshot of Chrome DevTools
A service worker is a piece of JavaScript code that your browser runs in the background. Service workers are given some special privileges over normal JavaScript code that runs in a browser; a commonly used privilege is the ability to intercept fetch events.
A fetch event fires any time the client requests a file. Many service workers store all requested files in a cache. The second time the same file is requested, the service worker steps in and returns the cached file without sending any HTTP request.
In this particular case, a service worker is storing the contents of the vue file into a cache the first time it is downloaded, removing the need for an expensive network request on future page loads.
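That behavior is usually implemented as a cache-first strategy. A minimal sketch of what such a worker does (the cache name is illustrative; this is not vuejs.org's actual worker):

```javascript
// Cache-first: serve from cache when possible, otherwise fetch and store.
// Dependencies are passed in so the logic is easy to exercise outside a worker.
async function cacheFirst(request, cache, fetchFn) {
  const cached = await cache.match(request);
  if (cached) return cached;               // repeat visit: no network request
  const response = await fetchFn(request); // first visit: hit the network
  await cache.put(request, response.clone());
  return response;
}

// Wire it up only when running inside a real service worker context.
if (typeof self !== "undefined" && typeof self.addEventListener === "function") {
  self.addEventListener("fetch", (event) => {
    event.respondWith(
      caches.open("static-v1").then((cache) =>
        cacheFirst(event.request, cache, fetch)
      )
    );
  });
}
```

On the second page load the `cache.match` branch wins, which is why DevTools shows `vue.min.js` coming "(from ServiceWorker)".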

How to properly serve a loading page while server processes the request

I have a web application that sometimes gets a bit heavy and takes a little while to load.
I would like to serve a loading page while the page that the user accessed is loaded on the server. Now, because this is not ajax, or a response to an event, I'm not really sure how to proceed here.
I came up with a rather ugly alternative that works like this:
1: The user accesses www.myapp.com/heavypage.
2: If the request comes from myapp.com/loading, serve myapp.com/heavypage; if it comes from anywhere else, redirect to myapp.com/loading.
The page myapp.com/loading is basically a blue screen with a loading gif that fires a redirect once it has loaded: onload="redirectToHeavyPage()".
3: While the server processes the redirect (which takes time), the user is seeing a pretty loading page.
This way, I was able to show some information to the user while the heavypage action is processed on the server.
It works, but I feel like it is totally the wrong way of doing this, even though it behaves exactly as I expected, especially on slow connections (like GPRS). Keep in mind that I can't put the loading gif anywhere on the heavypage itself, because it will only be served once the server is done processing everything.
What would be the proper way of doing this?
While your scheme works, a cleaner way to approach this is the following:
When the browser requests /heavypage, return only an HTML shell with a loading animation that requires no processing or database queries. Preferably this step is skipped completely by caching the HTML in the browser, a CDN, or a reverse-proxy cache.
In the HTML, asynchronously load the expensive HTML via JavaScript. You do not need to wait for the onload event but can trigger this directly through an inline script tag.
In the response callback render the received inner HTML to the target element, e.g. the body.
This scheme works irrespective of whether you are using a single-page-application framework like React or Angular or classic server-side rendering. The only difference is whether step 2 ships an HTML snippet or some JSON that is rendered client-side.
If you are using HTTP/2, there is a slightly more efficient variant using server push: when the request in step 1 is received on the server (or the CDN), the computationally expensive data can be shipped without waiting for the client request, using the new push functionality. This is only necessary, however, if the round-trip latency is greater than the time your /heavypage takes to render.
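A minimal sketch of steps 2 and 3 above (the endpoint name /heavypage/content and the target element are assumptions; the fetch dependency is injected so the logic is easy to test):

```javascript
// Runs from an inline <script> in the cheap shell page: pull in the expensive
// HTML asynchronously and swap it in place of the loading animation.
async function loadHeavyContent(fetchFn, target) {
  const res = await fetchFn("/heavypage/content"); // expensive render happens here
  target.innerHTML = await res.text();             // replace the loading animation
}

// In the shell: loadHeavyContent(fetch, document.getElementById("content"));
```

Because the shell ships (or is served from cache) immediately, the user sees the loading animation with no server-side redirect dance.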

How to load external data into the view asynchronously

In a Rails 3.2 app I have a view that is pulling in information from an external API. On slow connections, this severely reduces the page load time and affects user experience.
How can I move this into an asynchronous process so that the rest of the page loads first, and the external information is rendered later, once it has been fetched and is available?
The external data is large and complex and I don't think is suitable to cache in the database or in a variable.
I'm aware of delayedjob and similar gems, but these seem more suited to queuing database methods rather than in the view.
What other options are available to me?
It seems like a large data set is perfectly suitable for caching on your local server.
Keep in mind that a long request will tie up your Rails process/thread, which can't serve any other requests while waiting for your API call to finish.
That said, you can always trigger an Ajax request to occur once the rest of the page loads.
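A sketch of that suggestion: render the page immediately, then request the data from a server endpoint that proxies the external API once the page has loaded. The endpoint name /external_data and the response shape are assumptions:

```javascript
// Render whatever shape the proxy endpoint returns into the placeholder element.
function renderExternalData(data, target) {
  target.textContent = data.summary;
}

// Fetch the proxied API data and render it; fetchFn is injected for testability.
async function fetchAndRender(fetchFn, target) {
  const res = await fetchFn("/external_data"); // Rails action that calls the API
  renderExternalData(await res.json(), target);
}

// In the view:
// document.addEventListener("DOMContentLoaded", () =>
//   fetchAndRender(fetch, document.getElementById("external")));
```

The slow API call then happens in a separate request, so the initial page render is no longer blocked by it.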

Authenticated user and multiple requests (IIS7 MVC3)

This is one of those questions that maybe should go to Server Fault, but then maybe there is a code-level solution.
Anyway, here is the question. I have a regular MVC3 application which requires user login to access (uses the Authorize tag on most of the actions). I also have a Silverlight object within the application that makes HTTP GET calls to a controller action which returns an image (in fact this is a map tile). This particular controller action has no authorize tag, and is therefore public.
The Silverlight component runs slowly or simply blocks, because the MVC application can apparently process only ONE request at a time, as confirmed with Firebug. This means the map tiles can only be served one after the other. Moreover, regular (non-map-related) requests are enqueued too, and everything times out after a while.
So to run a test, I set up another website with the same document root and instructed the Silverlight component to read tiles from there. Now the tiles ARE requested concurrently and it runs smoothly.
So, is there any way to resolve this situation and use one site only?
If you are using Session in the server action, that would explain why requests are queued. Because Session state is not thread-safe, ASP.NET serializes all requests from the same session and executes them sequentially. In MVC 3 you can opt a controller out of this by decorating it with [SessionState(SessionStateBehavior.ReadOnly)] (or Disabled), which lets its requests run concurrently.
