Is there a good way to disable caching for specific domains? For example, any time I start building a new website, can I block just that domain from caching? I would prefer the rest of the internet to remain cacheable.
I am currently using the Firefox Web Developer Toolbar add-on to disable the cache. Are there any better plugins?
The built-in Firefox Developer Tools have a feature to disable the cache for tabs where the toolbox is open.
Disable cache: disable the browser cache to simulate first-load performance. From Firefox 33 onwards this setting persists, meaning that if it is set, caching will be disabled whenever you reopen the devtools. Caching is re-enabled when the devtools are closed.
https://developer.mozilla.org/en-US/docs/Tools/Tools_Toolbox
Unfortunately it's not per-domain, but it may still be better than the Web Developer Toolbar.
You can send specific headers from your web application to prevent the browser from caching. You might send these headers only to your own IP address, or only to browsers where a certain cookie is set.
Return these headers to prevent a browser from caching your content:
Cache-Control: no-cache, must-revalidate
Expires: Sat, 26 Jul 1997 05:00:00 GMT
Expires should be a date in the past.
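A minimal PHP sketch of that idea (the IP address and the cookie name are illustrative placeholders, not anything from the original setup):
<?php
// Send the no-cache headers only to the developer's own requests.
// The IP address and the 'dev_mode' cookie name are hypothetical placeholders.
$devIp = '203.0.113.42';
if ($_SERVER['REMOTE_ADDR'] === $devIp || isset($_COOKIE['dev_mode'])) {
    header('Cache-Control: no-cache, must-revalidate');
    header('Expires: Sat, 26 Jul 1997 05:00:00 GMT'); // any date in the past works
}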
The Charles Web Debugging Proxy is a good way to disable cache for specific domains. Just go to the Tools menu, then select No Caching. A window will open that lets you specify which locations to prevent caching on.
Charles is a proxy so you can use it to control the caching in all of your web browsers - Firefox, Chrome, IE, whatever you use!
I usually use a rewrite rule allowing /static/${NUMBERS}/directory/file.js to be served from /static/directory/file.js. Large files (mp4, zip, ...) are treated separately. With PHP I set ${NUMBERS} to the Unix timestamp in development and to VERSION_NUMBER in production. That way files are always re-downloaded in development, but cdnjs almost never is.
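A rough PHP sketch of that approach, assuming a rewrite rule on the server maps /static/<number>/... back to /static/... on disk (VERSION_NUMBER, IS_DEV, and static_url() are made-up names for illustration):
<?php
// Hypothetical helper: build a versioned URL for a static asset.
// VERSION_NUMBER and IS_DEV are placeholder constants for this sketch.
define('VERSION_NUMBER', '42');
define('IS_DEV', true);

function static_url(string $path): string {
    // Development: current timestamp, so the URL (and thus the download)
    // changes on every request. Production: release number, so the URL
    // only changes when a new version is deployed.
    $version = IS_DEV ? time() : VERSION_NUMBER;
    return '/static/' . $version . '/' . ltrim($path, '/');
}

echo static_url('directory/file.js'); // e.g. /static/1700000000/directory/file.js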
Related
I have a simple MVC application hosted internally by IIS 10 on a VM (Windows Server 2016). The initial page loads in 10 seconds in the latest Chrome or Edge version (which I think is too slow) but takes less than 1 second in Firefox and IE. We have an HTTPS redirect and SSL enabled.
When you first click the web link in Chrome, it will mostly say "Establishing secure connection", but sometimes it says nothing. While it's loading, you see the info icon in the browser (where the lock usually is). When you click on the (i) in Chrome, for example, it says "Your connection to this site is not secure"; once it loads, this message goes away and you see the lock and that the certificate is valid.
Why would this handshake take so much longer in Chromium-based browsers compared to other browsers? Is there a setting? I am just a developer, not very savvy with IIS settings (this is handled by a server admin here), nor am I familiar with certificate issues. If someone could point me in the right direction it would be appreciated. I have spent hours trying various things: changing settings in Chrome based on my research, such as disabling TLS or trying earlier versions of TLS in the experimental features; disabling/enabling the HTTPS redirect in the web config; adding a registry entry from my Global.asax file; adding the site to a whitelist; etc. The list goes on.
Thanks for the info in advance,
Hydee V.
Should any PWA-related resources be served with cache headers from the server, or should we get classic HTTP caching out of our way by turning it off completely?
Namely, what should the HTTP cache headers be for:
manifest file
Related to this: how do new versions of the manifest file (a changed favicon, for example) get to the client?
service worker js file
(this one is a bit tricky, because browsers check for new versions every 24 hours, so some caching might be good?)
index.html (the entry point for the SPA)
My understanding was that it should be turned off completely and all caching should be handled by the service worker, but there seems to be conflicting information out there, and it's hard to extract best practices.
There's some guidance at https://web.dev/reliable/http-cache, along with a number of other resources on the web.
In general, building a PWA and introducing a service worker doesn't change the best practices that you should follow for your HTTP caching.
For assets that include versioning information in their URL (like /v1.0.0/app.js, or /app.1234abcd.js), where you know that the contents of a given URL won't ever change, you should use Cache-Control: max-age=31536000.
For assets that don't include versioning information in their URL (like most HTML documents, and also /manifest.json, if you don't include a hash there), you should set Cache-Control: no-cache along with ETag or Last-Modified, to ensure that a previously cached response is revalidated before being used.
For your service worker file itself, modern browsers will ignore the Cache-Control header value you set by default, so it doesn't really matter. But it's still a best practice to use Cache-Control: no-cache so that older browsers will revalidate it before using it.
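As a concrete sketch, assuming a PHP front controller is the thing sending those headers (the regex and file extensions are only illustrative):
<?php
// Sketch: choose Cache-Control based on whether the requested asset has a
// content hash in its filename. The regex and extensions are illustrative.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (preg_match('#\.[0-9a-f]{8,}\.(js|css|png|woff2)$#', $path)) {
    // Versioned asset such as /app.1234abcd.js: cache it for a year.
    header('Cache-Control: max-age=31536000');
} else {
    // index.html, manifest.json, the service worker file: always revalidate.
    // Pair this with an ETag or Last-Modified so revalidation can return 304.
    header('Cache-Control: no-cache');
}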
We're having a unique issue that is affecting a small handful of users from around the world. Nothing connects them aside from the fact they are all using Chrome for iOS.
Intermittently, users will login to our application (https://www.mousehuntgame.com) and appear to be "someone else". This issue cropped up recently during a period when no new code had been pushed to the site.
Of course the first thing we checked was that our authentication was not bugged or that the user's hash (stored in either cookies or a PHP session) was not crossing connections somewhere. The issue is not in the authentication system, and it only affects users using Chrome for iOS. The same users using Safari no longer see the issue.
We have the following PHP headers being sent to prevent caching:
header("Cache-Control: no-cache, no-store, max-age=0, must-revalidate, private");
header("Pragma: no-cache");
The "target users" that these users "turn into" are not yet confirmed to also be using Chrome. Simply having the affected users stop using the browser is not a solution, as others who continue to use Chrome can still gain access to these accounts.
Can Chrome be somehow caching cookies and "sharing" them across users? Could this be a DNS issue where it sees a mobile user agent and in order to save loading time it retrieves cached information and hands it off without further checking who the user is? This is a stretch, I know, but it's been a strange issue and we're grasping at straws now.
I work on the Chrome Data Compression proxy.
I'd be very surprised if the Chrome proxy were at fault here, since we respect standard caching headers. That said, there could be a bug. If you can try to reproduce with and without the proxy that would be helpful. Another way to test is to open the page in an Incognito tab (which does not use the proxy).
I looked at some of the headers we are seeing from your site, and they include things like
Cache-Control: max-age=2592000
which means these responses are publicly cacheable for 30 days. I see a wide range of caching headers from many different URLs on the site, suggesting that your caching rules aren't being applied as widely as you thought; but of course I don't know the structure of the site and whether that would lead to the problem you are describing.
Feel free to reach out (email is fine too) and I'm happy to help debug if you still think this is a problem on our end.
Note: Please correct me if any of my assumptions are wrong. I'm not very sure of any of this...
I have been playing around with HTTP caching on Heroku and trying to work out a nice way to differentiate between mobile and desktop requests when caching with Varnish on Heroku.
My first idea was that I could set a Vary header so the cache is varied on If-None-Match. As Rails automatically sends back ETags generated from a hash of the content, the ETag would differ between desktop and mobile requests (different templates), so it would eventually cache two versions (not fact, just my original thinking). I have been playing around with this, but I don't think it works.
Firstly, I can't wrap my head around when, if ever, anything gets cached, as surely requests with If-None-Match will be conditional GETs anyway? Secondly, in practice, fresh requests (ones without If-None-Match) sometimes receive the mobile site. Is this because the cache doesn't know whether to serve the mobile or desktop cached version when the If-None-Match header isn't there?
As it probably sounds, I am rather confused. Will this approach work in any way, or am I being silly? Also, is there any way to achieve separate cached versions if I can't touch the Varnish config at all (since I am on Heroku)?
The exact code I am using in Rails to set the cache headers is:
response.headers['Cache-Control'] = 'public, max-age=86400'
response.headers['Vary'] = 'If-None-Match'
Edit: I am aware I can use Vary: User-Agent, but I am trying to avoid it if possible because it has a high miss rate (many, many user agents).
You could try Vary: User-Agent. However, you'll have many cached versions of a single page (one for each user agent).
Another solution may be to detect mobile browsers directly in the reverse proxy: set an X-Is-Mobile-Browser client header before the reverse proxy attempts to find a cached page, set Vary: X-Is-Mobile-Browser on the backend server (so that the reverse proxy will only cache two versions of the same page), and replace that header with Vary: User-Agent before sending the response to the client.
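On the backend, that could look roughly like this in PHP (the X-Is-Mobile-Browser name comes from the suggestion above; the rendering helpers are hypothetical stubs, and the proxy side is assumed to be configured separately):
<?php
// Backend sketch: vary cached copies on the proxy-supplied flag instead of
// the full User-Agent. X-Is-Mobile-Browser is assumed to be set (or stripped)
// by the reverse proxy.
function render_mobile_page(): string { return 'mobile page'; }   // hypothetical stub
function render_desktop_page(): string { return 'desktop page'; } // hypothetical stub

$isMobile = isset($_SERVER['HTTP_X_IS_MOBILE_BROWSER'])
    && $_SERVER['HTTP_X_IS_MOBILE_BROWSER'] === '1';

header('Cache-Control: public, max-age=86400');
header('Vary: X-Is-Mobile-Browser'); // the proxy caches at most two variants per URL

echo $isMobile ? render_mobile_page() : render_desktop_page();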
If you cannot change your Varnish configuration, you have to use different URLs for mobile and desktop pages. You can add a URL parameter (?mobile=true), add a segment to your path (yourdomain.com/mobile/news), or use a different host (like m.yourdomain.com).
This makes a lot of sense because (I've seen this many times, in both CMSs and applications) at some point you will want to differentiate content and structure for mobile devices. People just do different things, or are looking for different information, on mobile devices...
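If you go the separate-host route, the entry point only needs a rough user-agent check; a PHP sketch (the UA pattern and the m.yourdomain.com host are placeholders):
<?php
// Very rough mobile detection at the entry point. The UA regex and the
// m.yourdomain.com host are placeholders, not a complete solution.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (preg_match('/Mobile|Android|iPhone|iPad/i', $ua)) {
    header('Vary: User-Agent'); // tell caches that the redirect depends on the UA
    header('Location: https://m.yourdomain.com' . $_SERVER['REQUEST_URI'], true, 302);
    exit;
}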
I just ran YSlow against my website and I have a question about Expires headers. YSlow gave me a Grade F on "Add Expires headers": there are 20 static components without a far-future expiration date, all of them CSS or JS files.
Right now, when I go to IIS (6.0) and open the HTTP Headers tab, Enable Content Expiration is NOT checked. From reading this, it seems like that is the right thing to do, as the browser will then cache the content, so I am confused about why YSlow is complaining. Also, it sounds like browsers will cache this data by modified date anyway, so is this whole thing meaningless?
And if setting this is a no-brainer, why isn't it the default behavior?
Can someone please clarify?
There's no contradiction here. What you need to do is set content expiration on the folders that contain static content, such as your image, CSS, and script folders. You can set content expiration on a per-folder basis in IIS and other web servers.
A browser has no idea what content is 'static' or not; it literally has no way of knowing, and YSlow is most likely only guessing. It's probably guessing correctly, but having incorrect Expires values set by default in the web server could cause browsers to cache dynamic content that you do not want them caching at all.
That's why it's not set like that by default.
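Once content expiration is enabled on those static folders, the browser-visible effect is simply a far-future freshness lifetime on their responses, along the lines of (the value is illustrative; 604800 seconds is one week):
Cache-Control: max-age=604800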