Proper caching is very important for us as we aim to reduce traffic.
I've configured communication for resources as follows:
This is the request URL:
http://www.mydomain.com/resources/product/assets/swf/modules/loader.swf?ver=1
These are the response headers:
Accept-Ranges:bytes
Connection:Keep-Alive
Content-Length:14622
Content-Type:application/x-shockwave-flash
Date:Tue, 22 May 2012 09:16:41 GMT
ETag:"7804f-391e-4c08e046d3ec0"
Keep-Alive:timeout=15, max=96
Last-Modified:Mon, 21 May 2012 16:01:39 GMT
P3P:CP="HONK"
Server:Apache
I need to force browsers to cache the resources I send back. As far as I know, these headers should be sufficient for caching. But...
For some resources a strange parameter, called gsCacheBusterID, comes in:
Request URL:
http://www.mydomain.com/resources/product/assets/images/image1.png
?gsCacheBusterID=1337684498911&purpose=audit&ver=1
This changes the URL on every request, so the resource is never cached.
As you can see, we've implemented functionality in the client (which is Flash in this case) to pass a ver parameter along with the request to make sure the current version is downloaded. This way we aim to maintain versioning of resources and force browsers to cache resources with the same version.
I have no idea why this cacheBuster appears and couldn't find any info. It happens in all browsers.
It turns out it is LoaderMax behavior, since we use the Greensock loading library in our Flash client. It can be disabled by setting the noCache property to false, and disabling the audits removes the purpose=audit parameter as well.
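For reference, a minimal AS3 sketch of that fix (assuming the LoaderMax API, where auditSize and noCache are the relevant loader vars; the URL is the one from the question):

import com.greensock.loading.ImageLoader;
import com.greensock.loading.LoaderMax;

// auditSize:false skips the pre-flight request that appends purpose=audit,
// and noCache:false keeps gsCacheBusterID off the URL so the browser can cache it.
var queue:LoaderMax = new LoaderMax({auditSize: false});
queue.append(new ImageLoader("http://www.mydomain.com/resources/product/assets/images/image1.png?ver=1", {noCache: false}));
queue.load();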
I am configuring gzip compression for a server via Jetty, and there are PUT/POST endpoints whose response payloads I would like to compress. The default GzipHandler configuration for Jetty specifically only includes GET; that this is the default is documented, but I'm unable to find documentation as to why this is the default. Is there a downside to applying gzip when the method is non-GET?
The reason comes down to the fact that responses to PUT and POST are, generally speaking, not suitable for caching.
GET was selected as the default back when gzip compression was first introduced (before Jetty moved to Eclipse, before Servlet 2.0, back when it was called the GzipFilter in Jetty), and in that era, if the content couldn't be cached, it wasn't compressed.
Why?
Well, back then, using system resources to compress the same content over and over again was seen as a negative; it was more important to serve many requests than a few optimized ones.
The GzipHandler can be configured to use any method, even nonsensical ones like HEAD.
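For example, a minimal sketch for embedded Jetty (assuming Jetty 9.4+, where GzipHandler lives in org.eclipse.jetty.server.handler.gzip; DefaultHandler is just a stand-in for your real application handler):

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.handler.DefaultHandler;
import org.eclipse.jetty.server.handler.gzip.GzipHandler;

public class GzipAllMethods {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);

        // Compress responses to POST and PUT in addition to the default GET.
        GzipHandler gzip = new GzipHandler();
        gzip.setIncludedMethods("GET", "POST", "PUT");
        gzip.setHandler(new DefaultHandler()); // stand-in for your handler

        server.setHandler(gzip);
        server.start();
        server.join();
    }
}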
Don't let a default value that exists for historical reasons prevent you from using the GzipHandler: use it, configure it, and be happy.
Feel free to file an issue requesting the defaults be changed at https://github.com/eclipse/jetty.project/issues
Should PWA-related resources be served with any kind of cache headers from the server, or should we move classic HTTP caching out of our way by turning it off completely?
Namely, what should the HTTP cache headers be for:
manifest file
(Related to this: how do new versions of the manifest file, e.g. a changed favicon, get to the client?)
service worker js file
(this one is a bit tricky because browsers check for new versions every 24 hours, so some caching might be good?)
index.html (entry point for the SPA)
My understanding was that it should be turned off completely and all caching should be handled by the service worker, but there is conflicting information out there and it is hard to extract the best practices.
There's some guidance at https://web.dev/reliable/http-cache, along with a number of other resources on the web.
In general, building a PWA and introducing a service worker doesn't change the best practices that you should follow for your HTTP caching.
For assets that include versioning information in their URL (like /v1.0.0/app.js or /app.1234abcd.js), where you know the contents of a given URL will never change, you should use Cache-Control: max-age=31536000.
For assets that don't include versioning information in their URL (like most HTML documents, and also /manifest.json, if you don't include a hash there), you should set Cache-Control: no-cache along with ETag or Last-Modified, to ensure that a previously cached response is revalidated before being used.
For your service worker file itself, modern browsers will ignore the Cache-Control header value you set by default, so it doesn't really matter. But it's still a best practice to use Cache-Control: no-cache so that older browsers will revalidate it before using it.
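Putting that together, hypothetical responses could look like this (file names, hashes, and ETag values are made up):

GET /app.1234abcd.js
Cache-Control: max-age=31536000

GET /index.html
Cache-Control: no-cache
ETag: "abc123"

GET /manifest.json
Cache-Control: no-cache
ETag: "def456"

GET /service-worker.js
Cache-Control: no-cache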
I am working with ASP.NET MVC 4 output caching and for some reason I can't get it working. What I am looking to do is standard, but the output caching never kicks in. I think it may be due to cookies being changed and therefore not caching. I want output caching on the client. My whole site is served over HTTPS, and all requests are under HTTPS.
Some questions I have:
how can I set up output caching to ignore any changes in cookies?
what should I look for in the response headers to understand if output caching is working or not?
setting the location to ServerAndClient - am I correct in saying this results in caching on the server and in the client's browser?
Thanks.
Further info:
What I have noticed via Fiddler is that the HTTP request contains Cache-Control: max-age=0, but I have no idea how or why it's being set to this.
By default the caching attribute ignores cookies!
You can set breakpoints in your controllers to see whether output caching kicks in, and it seems you already know where to look in the headers ("... contains Cache-Control: max-age=0 but ...").
Yes, that is correct: ServerAndClient caches on the server and in the client's browser.
I also suggest setting a Duration on the caching filter.
Look in the web.config file to see whether the cacheControlMode attribute is set to something strange! And if it is s-maxage=0, that comes from a shared (proxy) cache directive.
Set the attribute like this on the controllers that you want to cache:
[OutputCache( Duration = 360 )]
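A slightly fuller sketch (hypothetical controller and action; OutputCacheLocation lives in System.Web.UI):

using System.Web.Mvc;
using System.Web.UI;

public class HomeController : Controller
{
    // Cache on the server and in the client's browser for 6 minutes,
    // without varying the cache entry by query-string parameters.
    [OutputCache(Duration = 360, Location = OutputCacheLocation.ServerAndClient, VaryByParam = "none")]
    public ActionResult Index()
    {
        return View();
    }
}

If the caching kicks in, the response should carry a non-zero lifetime, e.g. Cache-Control: private, max-age=360, instead of the max-age=0 you are seeing.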
Note: Please correct me if any of my assumptions are wrong. I'm not very sure of any of this...
I have been playing around with HTTP caching on Heroku, trying to work out a nice way to differentiate between mobile and desktop requests when caching with Varnish on Heroku.
My first idea was that I could set a Vary header so the cache varies on If-None-Match. As Rails automatically sends back ETags generated from a hash of the content, the ETag would differ between desktop and mobile requests (different templates), so it would eventually cache two versions (not fact, just my original thoughts). I have been playing around with this but I don't think it works.
Firstly, I can't wrap my head around when/if anything gets cached, as surely requests with If-None-Match will be conditional GETs anyway? Secondly, in practice, fresh requests (ones without If-None-Match) sometimes receive the mobile site. Is this because the cache doesn't know whether to serve up the mobile or desktop cached version, as the If-None-Match header isn't there?
As it probably sounds, I am rather confused. Will this approach work in any way, or am I being silly? Also, is there any way to achieve separate cached versions if I am unable to touch the Varnish config at all (as I am on Heroku)?
The exact code I am using in Rails to set the cache headers is:
response.headers['Cache-Control'] = 'public, max-age=86400'
response.headers['Vary'] = 'If-None-Match'
Edit: I am aware I can use Vary: User-Agent, but I am trying to avoid it if possible due to its high miss rate (many, many user agents).
You could try Vary: User-Agent. However you'll have many cached versions of a single page (one for each user agent).
Another solution may be to detect mobile browsers directly in the reverse proxy: set an X-Is-Mobile-Browser request header before the reverse proxy attempts to find a cached page, set Vary: X-Is-Mobile-Browser on the backend server (so that the reverse proxy will only cache two versions of the same page), and replace that header with Vary: User-Agent before sending the response to the client.
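You can't apply this on Heroku, since the Varnish config isn't accessible there, but for completeness, a rough sketch of that idea in VCL (the mobile regex is a naive assumption):

sub vcl_recv {
    # Collapse the many User-Agent values into two buckets before
    # the cache lookup, so only two variants of each page get cached.
    if (req.http.User-Agent ~ "(?i)mobile|android|iphone|ipad") {
        set req.http.X-Is-Mobile-Browser = "true";
    } else {
        set req.http.X-Is-Mobile-Browser = "false";
    }
}

sub vcl_deliver {
    # The backend sends Vary: X-Is-Mobile-Browser; rewrite it so
    # downstream clients and caches just see Vary: User-Agent.
    if (resp.http.Vary ~ "X-Is-Mobile-Browser") {
        set resp.http.Vary = "User-Agent";
    }
}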
If you cannot change your Varnish configuration, you have to use different URLs for mobile and desktop pages. You can add a URL parameter (?mobile=true), add a segment to your path (yourdomain.com/mobile/news), or use a different host (like m.yourdomain.com).
This makes a lot of sense, because at some point you will want to differentiate content and structure for mobile devices (I've seen this many times, both in CMSs and applications). People just do different things, or are looking for different information, on mobile devices...
Is there a good way to disable the cache for specific domains? For example, any time I start building a new website, can I block just that domain from caching? I would prefer the rest of the internet to remain cacheable.
I am currently using the Firefox Web Developer Toolbar add-on to disable the cache; are there any better plugins?
The built-in Firefox Developer Tools have a feature to disable the cache for tabs where the toolbox is open.
Disable cache: disable the browser cache to simulate first-load performance. From Firefox 33 onwards this setting persists, meaning that if it is set, caching will be disabled whenever you reopen the devtools. Caching is re-enabled when the devtools are closed.
https://developer.mozilla.org/en-US/docs/Tools/Tools_Toolbox
Unfortunately it's not per domain, but maybe this is better than the Web Developer Toolbar.
You can send specific headers from your web application to prevent the browser from caching. You might send these headers only to your own IP, or to browsers where a certain cookie is set.
Return these headers to prevent a browser from caching your content:
Cache-Control: no-cache, must-revalidate
Expires: Sat, 26 Jul 1997 05:00:00 GMT
Expires should be a date in the past.
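For example, a hypothetical PHP sketch of that idea (the IP address and cookie name are made up):

<?php
// Send no-cache headers only for the developer's IP, or when a
// "dev_nocache" cookie is present; everyone else gets normal caching.
if ($_SERVER['REMOTE_ADDR'] === '203.0.113.7' || isset($_COOKIE['dev_nocache'])) {
    header('Cache-Control: no-cache, must-revalidate');
    header('Expires: Sat, 26 Jul 1997 05:00:00 GMT');
}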
The Charles Web Debugging Proxy is a good way to disable cache for specific domains. Just go to the Tools menu, then select No Caching. A window will open that lets you specify which locations to prevent caching on.
Charles is a proxy so you can use it to control the caching in all of your web browsers - Firefox, Chrome, IE, whatever you use!
I usually use a rewrite rule allowing /static/${NUMBERS}/directory/file.js to be served from /static/directory/file.js. Large files (mp4, zip, ...) are treated separately. With PHP I set ${NUMBERS} to the Unix timestamp in development and to the version number in production. That way files are always re-downloaded in development but almost never in production.
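A minimal sketch of that rewrite rule for Apache mod_rewrite (assuming Apache; the path layout is the one described above):

# Serve /static/<numbers>/directory/file.js from /static/directory/file.js,
# so the version segment only busts browser caches and never has to exist on disk.
RewriteEngine On
RewriteRule ^/?static/\d+/(.+)$ /static/$1 [L]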