I would like to implement the "on network response" caching strategy. But once the cache is populated, it never updates again, so I was wondering whether I have to clear the cache manually for this strategy to work.
The documentation says this strategy is ideal for frequently updating resources such as a user's inbox or article contents, and also useful for non-essential content such as avatars, though care is needed.
My content is never updated unless I clear the cache. Is something wrong with my understanding?
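For reference, here is roughly what I have, a minimal sketch of the strategy as I understand it (the cache name 'runtime-v1' is just a placeholder I chose):

// sw.js
self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.open('runtime-v1').then((cache) =>
      cache.match(event.request).then((cached) => {
        // Once a response is in the cache it is always served from there,
        // so it never gets refreshed from the network again.
        if (cached) return cached;
        // Cache misses go to the network and the response is stored for next time.
        return fetch(event.request).then((response) => {
          cache.put(event.request, response.clone());
          return response;
        });
      })
    )
  );
});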
I'm learning Workbox and I want to add some article URLs to the cache for X days, and I don't know how to do it.
I can handle URLs that I know using precacheAndRoute.
Example:
precacheAndRoute([
{url: '/index.html', revision: '...'},
{url: '/contact.html', revision: '...'},
])
Now, I want to cache on demand some URLs whose paths I don't know in advance. This is because my project is a blog and each post has its own path.
My proposed scenario is:
A user opens an article, and that article is cached for 30 days, so they can view it offline later.
What you're after is called runtime caching. It works as you describe: content is cached as the user navigates through the website. Afterwards the content is available for offline viewing.
Runtime caching may be implemented with different strategies. They can, for example, serve data only from the cache, from the cache or the network depending on which responds first, or from the cache first while updating it in the background, and so on. There are multiple strategies, which can also be manually configured to fit your needs.
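For your 30-day article scenario, a minimal sketch using Workbox's runtime caching could look like this (the /posts/ URL pattern and the cache name are assumptions about your blog, so adjust them to your routes):

import {registerRoute} from 'workbox-routing';
import {CacheFirst} from 'workbox-strategies';
import {ExpirationPlugin} from 'workbox-expiration';

// Cache article pages as the user visits them and keep them for up to 30 days,
// so previously opened posts remain available offline.
registerRoute(
  ({url}) => url.pathname.startsWith('/posts/'), // assumption: posts live under /posts/
  new CacheFirst({
    cacheName: 'articles',
    plugins: [
      new ExpirationPlugin({
        maxEntries: 50,                    // keep at most 50 articles
        maxAgeSeconds: 30 * 24 * 60 * 60,  // expire entries after 30 days
      }),
    ],
  })
);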
Reading: https://developers.google.com/web/tools/workbox/modules/workbox-strategies#what_are_workbox_strategies, https://developers.google.com/web/fundamentals/instant-and-offline/offline-cookbook, https://web.dev/runtime-caching-with-workbox/
Advice: before implementing anything, read a lot. That way you can grasp the concepts before you try anything, and you might also find something you never thought about in the beginning.
Here is the scenario:
You have a site that is currently cached via a service worker (SW). You deploy a new version that includes an updated SW with a cache-busting version. The company then announces the new features. People visit the site; however, even though the SW busts the old cache, it still serves up the previously cached content while updating its cache in the background. So visitors who come for the new features don't see them.
Is this the expected experience with ServiceWorkers? What are the recommended strategies to get around this?
It's the expected behavior whenever you serve resources with a cache-first strategy, yes.
There are two options:
Don't use a cache-first strategy. Unfortunately, you lose out on most of the performance benefits of service workers if you use a network-first strategy. I wouldn't recommend going network-first if you can help it.
Adopt the UX pattern of displaying a "Reload for the latest updates" toast message on the screen, letting the user know that the cached content has been refreshed and allowing them to take action to see the latest content. This is, I think, the best approach. If you're using a service worker which gets updated whenever your cached content changes (e.g., one generated by sw-precache), then you can detect these updates by listening for specific service worker lifecycle events and use those to trigger the message; a rough sketch follows below.
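Here's a sketch of that second option, assuming your service worker responds to a SKIP_WAITING message (the toast UI itself is left out; only the update-detection wiring is shown):

// In the page: watch for a new service worker being installed while an old
// one is still controlling the page, then show the "reload" toast.
navigator.serviceWorker.register('/sw.js').then((registration) => {
  registration.addEventListener('updatefound', () => {
    const newWorker = registration.installing;
    newWorker.addEventListener('statechange', () => {
      if (newWorker.state === 'installed' && navigator.serviceWorker.controller) {
        // Fresh content has been cached; show the toast here. When the user
        // accepts, ask the waiting worker to activate immediately:
        // registration.waiting.postMessage({type: 'SKIP_WAITING'});
      }
    });
  });
});

// Reload once the new worker takes control so the user sees the new content.
navigator.serviceWorker.addEventListener('controllerchange', () => {
  window.location.reload();
});

// In sw.js, the corresponding handler would be something like:
// self.addEventListener('message', (event) => {
//   if (event.data && event.data.type === 'SKIP_WAITING') self.skipWaiting();
// });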
I have an MVC4 website and I'm using the OutputCache to cache the result of a view that displays multiple pages of ranked results. The cached output varies on the Page parameter. The rankings are a function of time, so the results for any given page can be out of sync depending on when they're cached, which is exacerbated by the fact that I'm using an infinite-scroll mechanism where duplicate results can be shown if a result gets pushed to the next page.
The ideal solution to this problem would be to cache some reasonable number of pages all at once. This would require being able to check if the cached output is expired, re-generate cached results if they are expired and then return the cached response. Is this possible?
I should also note that I'm using OutputCaching with the Azure Output caching provider, and I have a dedicated caching role (Note: not the shared caching service).
Any help would be greatly appreciated.
This would require being able to check if the cached output is expired, re-generate cached results if they are expired and then return the cached response. Is this possible?
This is exactly how OutputCaching works: request a page; if it exists in the cache and isn't expired, it's retrieved from the cache; otherwise, the page is rendered and the cache is updated.
If the data really is this dynamic, you are probably causing more work/problems by caching the output without realizing any gains in performance (KISS applies here: don't create a solution for a problem if you can avoid the problem in the first place).
However, a solution like the one you describe (if it's really required) could be built with an Azure Queue and a Worker Role. Have your ratings engine put a value in the queue when a rating is added or updated. Then have the Worker Role poll the queue every second (for example) for values. If a value is found, have the Worker Role make a web request against the cached page. This will update the output cache if it has expired. However, you are still limited by the cache expiration, unless you do something like the approach from this SO post:
HttpResponse.RemoveOutputCacheItem() is probably the method you want to use. If you can figure out what name the actions are cached under, you can remove just the specific action (try setting a breakpoint or dumping all of the names of cached items to the screen).
I'm displaying a lot of data on a website that won't change often. Because of this, I'm caching the data in HttpRuntime.Cache, which I understand to cache data for all users of the website.
However, I also want to offer the ability to force a refresh in case the cached data becomes stale. Since the data is cached for all users, this means that if a few people are using the site at once, it'll affect everyone. Is this a common pattern? It seems like strange behavior for a site to exhibit, especially since one user could slow everyone down by constantly forcing cache refreshes. It still doesn't make sense to do client-side caching, since the data will be the same for all users.
Caching data visible to all users is extremely common and is a good practice. However, giving users the ability to refresh the cache is pretty rare. The better path would be to expire your cache when data is saved that would change the contents of a cached page.
Smart cache invalidation means that your users always see the freshest data, but they also get the benefits of your caching efforts. Ideally, you're only expiring the pages affected by a change - not your entire cache.
I think it would be careless of you to allow a normal user to invoke a "clear cache" operation. Your cacheable data should have some sort of dependency defined. See: Cache Expiration Policies.
I am reading a book about MVC2, and in the OutputCache section it states:
Warning: In the earlier section "How Authorization Filters Interact with Output Caching," I explained that [Authorize] has special behavior to ensure that unauthorized visitors can't obtain sensitive information just because it's already cached. However, unless you specifically prevent it, it's still possible that cached output could be delivered to a different authorized user than the one for whom it was originally generated. One way to prevent that would be to implement your access control for a particular content item as an authorization filter (derived from AuthorizeAttribute) instead of simply enforcing authorization logic inline in an action method, because AuthorizeAttribute knows how to avoid being bypassed by output caching. Test carefully to ensure that authorization and output caching are interacting in the way you expect.
Is this still true in MVC3?
If so, what is the way to prevent that from happening? (The explanation in the book is too vague.)
Regards.
I think it is.
When you are using OutputCache to cache data, the data is cached globally. As long as a user is authorized, that user will get the cached data.
Yes, we have the "VaryByParam" option for OutputCache, but that just creates a new cache entry for every different parameter passed, which means it's still global.
So if you want to cache different data per user, OutputCache may not be the right way to do it. If the data is user-specific, session state is the right choice; that's what session is for.