This may be a silly question, but my PWA has the index page cached for 6 months, yet after 1 month it did a full refresh (I know this because my PWA uses logged-in elements which aren't available after the install, so it throws an error when you use it).
This makes me think that cache.addAll has a default cache time of 1 month. Is that correct? Can it be extended, or is it just a bug in my code?
By default, items are neither updated nor deleted; there is no automatic cache-deletion logic anywhere. One thing to note, though, is that the browser itself may purge items from the cache if it so desires. This could happen, e.g., to free up space on the device.
So either you have a bug somewhere or the browser purged your cache.
More: https://developer.mozilla.org/en-US/docs/Web/API/Cache
Edit: what abraham is referring to in the comments is this: https://webkit.org/blog/10218/full-third-party-cookie-blocking-and-more/
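To be clear about the "6 months" part: entries stored via the Cache API have no TTL at all. Cache-Control max-age is not honored there; entries live until you delete them or the browser evicts the origin's storage. One thing you can do is ask for persistent storage, which makes eviction less likely. A minimal sketch (behavior varies by browser; Safari applies its own heuristics):

```javascript
// Ask the browser to treat this origin's storage (including the Cache API)
// as persistent, so it is less likely to be evicted under storage pressure.
async function requestPersistentCache() {
  if (typeof navigator !== "undefined" && navigator.storage && navigator.storage.persist) {
    // Resolves true if the origin's storage is now persistent.
    return navigator.storage.persist();
  }
  return false; // API unavailable (older browser, or non-browser environment)
}
```

Even when persistence is granted, the user can still clear site data manually, so the service worker should handle a cold cache gracefully.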
I have a page which renders a lot of partials.
I fragment-cache them all, which makes it very fast. Hooray!
The thing is, because of the number of partials, the first run, which writes the cache, takes so long that the request times out (subsequent requests are really fast).
I also use sidekiq (but the question is relevant to any background processor)
Is there a way to render those partials in a background process, so users who miss the cache (due to expiration) won't hit a timeout? I would go over all the partials, and re-cache those whose cache has expired (or is about to expire).
I only know of the preheat gem, but I don't think it's sophisticated enough for my needs. Plus, it hasn't been maintained in ages.
I was working on a project and had a similar problem. In my case it was a problem with only one page, and with loading it right after clearing the cache. I solved it another way (I didn't have anything like Sidekiq, so it may not be the right solution for you, but it might be helpful).
What I did, right after clearing the cache, was call the open() method with the problematic URL as the parameter:
open('http://my-website/some-url')
So after clearing the cache, that URL was requested, which recreated the cache automatically. We solved the problem quickly that way. I know background workers would be a better solution, but this worked for me.
Just to note: our cache was cleared by cron, not manually.
UPDATE
Or, if you want to clear the cache manually, you could call open('http://my-website/some-url') afterwards via Sidekiq (I haven't tried this; it's only an idea).
Of course, my problem involved only one page; if you want to warm the whole website, things get more complicated.
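A more robust version of this idea is to warm the cache from a background job instead of an inline open() call. A minimal Ruby sketch (CacheWarmer and the URL list are hypothetical; the HTTP fetcher is injectable so it can be stubbed in tests):

```ruby
require "net/http"
require "uri"

# Sketch of a cache warmer: after the cache is cleared (or shortly before
# fragments expire), hit each page so the server re-renders and re-caches
# the partials before a real user does.
class CacheWarmer
  def initialize(urls, fetcher: ->(url) { Net::HTTP.get_response(URI(url)).code })
    @urls = urls
    @fetcher = fetcher
  end

  # Visit every URL; returns { url => status } so failures can be logged.
  def warm
    @urls.map { |url| [url, @fetcher.call(url)] }.to_h
  end
end
```

In a Sidekiq worker, warm would be the body of perform, scheduled (e.g. with sidekiq-cron) to run right after the job that clears the cache, or shortly before the fragments are due to expire.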
One of my programs uses InternetOpenURL without the INTERNET_FLAG_RELOAD specified. I see that the call will read the web page from the cache (if it exists). I changed the web page 5 days ago and I still get the cached version (unless I specify the flag above.) Does anyone know when or if the cache will ever be updated? I'd expect it to eventually be refreshed (possibly overly optimistic!)
Chances are the original resource had an expiration date more than 5 days out; that is why the cached version keeps being used if you do not tell InternetOpenURL() to re-check the server once in a while. INTERNET_FLAG_RELOAD forces it to re-download the resource whether or not it has actually been modified, which might be a little too brute-force for your needs. Try INTERNET_FLAG_RESYNCHRONIZE instead: it allows InternetOpenURL() to check whether the resource has been modified and re-download it only if needed; otherwise it keeps using the cached copy.
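For reference, a sketch of the call with that flag (Windows-only, link against wininet.lib; the URL is a placeholder and error handling is minimal):

```cpp
#include <windows.h>
#include <wininet.h>
#include <stdio.h>

int main() {
    HINTERNET hSession = InternetOpenA("MyApp/1.0",
        INTERNET_OPEN_TYPE_PRECONFIG, NULL, NULL, 0);
    if (!hSession) return 1;

    // INTERNET_FLAG_RESYNCHRONIZE makes WinINet revalidate the cached copy
    // with the server and re-download only if the resource has changed.
    HINTERNET hUrl = InternetOpenUrlA(hSession,
        "http://example.com/page.html",   // placeholder URL
        NULL, 0, INTERNET_FLAG_RESYNCHRONIZE, 0);
    if (hUrl) {
        char buf[4096];
        DWORD bytesRead = 0;
        while (InternetReadFile(hUrl, buf, sizeof(buf), &bytesRead) && bytesRead > 0) {
            fwrite(buf, 1, bytesRead, stdout);
        }
        InternetCloseHandle(hUrl);
    }
    InternetCloseHandle(hSession);
    return 0;
}
```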
We've got a strange problem that is very hard to troubleshoot, and we're looking for some advice on methods that might help us track it down. We use memcache and Thinking Sphinx. Recently we moved to a new server, and suddenly elements on the pages are going missing.
For instance, our home page has news items and the latest files added. In one case I see we are missing the last 2 news items. My developer checks and sees they're there. 10 minutes later he checks and sees all the news items missing. He checks again 15 minutes later and 3 items are missing.
We noticed that in the server move memcache had been set to 2 MB, so we raised it to 1 GB. That seemed to fix everything. However, now we are seeing similar inconsistencies when people search. Users report problems, I can reproduce them, I send them to my developer, and he sees different results. We both refresh and see something else.
We know this is somehow related to memcache and/or Thinking Sphinx, because when we clear and rebuild, everything behaves normally.
My only theory is that at some point we run out of memory in memcache, but it makes no sense that only certain data would fail to show.
Can anyone give any advice?
Thanks,
Will
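One way to test the memory theory is to look at memcached's own counters rather than guessing. A sketch, assuming memcached is listening on the default localhost:11211 and nc is installed; a steadily climbing evictions value means items are being pushed out before their expiry, which would explain data disappearing at random:

```shell
# Ask memcached for its stats and pull out the eviction/memory counters.
# STAT evictions N      - items removed before expiry to make room
# STAT limit_maxbytes N - configured memory limit
# STAT bytes N          - memory currently in use
printf 'stats\r\nquit\r\n' | nc -w 2 localhost 11211 \
  | grep -E '^STAT (evictions|limit_maxbytes|bytes) '
```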
I have been neglecting learning about caching for quite some time now, and although I've used caching here and there in the past it's not something I'm familiar with.
I found a great tutorial about what caching is and what kinds of cache there are (I already knew what caching is), but...
How does one decide what and when to cache? Are there things that should always be cached? In what situations should you never use caching?
The first rule is: don't cache until you need to; doing so earlier is premature optimization (Google "premature optimization" for more info).
The biggest problem with caching is cache invalidation: what happens when the data you have cached is updated? You need to make sure your cache is updated (or invalidated) as well, and if this isn't done correctly it often becomes a mess.
I would:
1. Build the application without caching and make sure the functionality works as intended.
2. Do some performance testing, and apply caching where needed.
3. After applying caching, do performance testing again to check that you are getting the expected speed increase.
I think the easiest way is to ask yourself a bunch of questions:
Is this result ever going to change?
No? Then cache it permanently.
Yes? When is it going to change? When a user updates something.
Is it going to impact only the particular user who changed the value, or all users? This should give you an indication of when to clear the particular cache.
You can keep going, but after a while you will end up with different profiles,
UserCache and GlobalCache being just two examples.
These profiles should tell you what to cache, and each has its own update criteria (when to refresh the cache).
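The question-and-profile approach above can be sketched in code. A minimal, illustrative Python example (the Cache class, keys, and TTLs are hypothetical, not any particular framework's API):

```python
import time

class Cache:
    """Minimal in-memory cache with optional per-entry TTL (a sketch only)."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if expires_at is not None and time.monotonic() >= expires_at:
            del self._store[key]  # expired: drop it and report a miss
            return None
        return value

    def set(self, key, value, ttl=None):
        # ttl=None means "cache permanently" (the "never changes" answer).
        expires_at = None if ttl is None else time.monotonic() + ttl
        self._store[key] = (value, expires_at)

    def invalidate(self, key):
        self._store.pop(key, None)

# Two profiles: global data cached permanently, per-user data refreshed on update.
global_cache = Cache()
user_cache = Cache()

global_cache.set("country_list", ["DE", "US"])                 # never changes
user_cache.set("user:42:profile", {"name": "Ada"}, ttl=300)    # per-user, 5 min

def update_user_profile(user_id, profile):
    # On write, refresh the cache for just that user; global data is untouched.
    user_cache.set(f"user:{user_id}:profile", profile, ttl=300)
```

The point of the two profile objects is that each answers the questions differently: the global profile never invalidates, while the user profile is refreshed exactly when (and only for whom) the underlying data changes.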
I've been using System.Web.Cache for a while for testing purposes. It's quite a nice cache store and speeds up my web pages a lot. But in some cases, after I browse a few more pages and then come back to a cached page, it runs its queries again (I checked using a profiler).
Does System.Web.Cache store items in RAM or in some type of memory that drops the cache once in a while when memory is low? Or is it a mistake somewhere in my code? Is System.Web.Cache good to use in production?
Best wishes to you :)
The cache will automatically start removing items when your system runs low on memory. Which items it evicts can be controlled to some degree by the priority you give them when you insert them into the cache: use the CacheItemPriority enum to specify the priority in the Cache.Add() method. Yes, the cache is fine for production; whether it is good for your specific implementation, only you can tell.
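For illustration, a sketch of such a Cache.Add() call with an explicit priority (the key, value, and expiration here are placeholders, not a recommendation):

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class CacheHelper
{
    // Insert an item with an explicit priority so it is among the last
    // to be evicted when ASP.NET scavenges the cache under memory pressure.
    public static void CacheReport(object report)
    {
        HttpRuntime.Cache.Add(
            "report",                     // cache key (placeholder)
            report,                       // the expensive-to-build value
            null,                         // no CacheDependency
            DateTime.Now.AddHours(1),     // absolute expiration
            Cache.NoSlidingExpiration,    // no sliding expiration
            CacheItemPriority.High,       // evicted later than Normal/Low items
            null);                        // no removal callback
    }
}
```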
The other issue to watch for is when the IIS application pool gets recycled.
Yes, ASP.NET cache is perfectly fine for production use (however, consider Velocity if you have a web farm). And yes, it does automatically remove items based on memory, item priority, and other metrics.