I am working on an MVC 4 app designed to run on iOS. I have encountered a problem where the app crashes when the local cache exceeds 5 MB (due to the very high number of images on the site).
I am trying to disable local caching. I have tried the meta tags suggested in other posts, but they do not work. I have also tried decorating controller actions with
[OutputCache(Duration = 1, Location = OutputCacheLocation.None)]
This doesn't work because we use partial views, and I get an exception saying that the location parameter is not supported on partial views.
Any advice?
Have you tried defining response headers? More specifically, the following header:
Cache-Control - this header must be present in the server's response to control HTTP caching by the client. Its value may include information such as max-age (how long to cache a response), whether the response may be cached with public or private access, or no-cache (not at all). See the Cache-Control section of RFC 2616 for full details.
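For example, to tell clients not to reuse a cached copy of a response at all, the server can send headers like the following (illustrative values; Pragma and Expires are belt-and-braces additions for older clients):

```
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
```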
We ended up fixing this bug by using CSS background images rather than <img> tags.
Related
I am configuring gzip compression for a server via Jetty, and there are PUT/POST endpoints whose response payloads I would like to compress. The default GzipHandler configuration for Jetty specifically only includes GET; that this is the default is documented, but I'm unable to find documentation as to why this is the default. Is there a downside to applying gzip when the method is non-GET?
The reason comes down to the fact that responses to PUT and POST are, in a general sense, not suitable for caching.
GET was selected as the default back when gzip compression was first introduced (before Jetty moved to Eclipse, before Servlet 2.0, back when it was called the GzipFilter in Jetty), and in that era, if the content couldn't be cached, it wasn't compressed.
Why?
Well, back then, using system resources to compress the same content over and over again was seen as a negative; it was more important to serve many requests than a few optimized ones.
The GzipHandler can be configured to use any method, even nonsensical ones like HEAD.
Don't let a default value with historical reasons prevent you from using the GzipHandler: use it, configure it, and be happy.
Feel free to file an issue requesting the defaults be changed at https://github.com/eclipse/jetty.project/issues
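The behavior being configured here — compress the response body only when the request method is included and the body is big enough to be worth it — can be sketched like this (a hypothetical helper, loosely mirroring GzipHandler's includedMethods and minGzipSize settings, not actual Jetty API):

```python
import gzip

def maybe_gzip(body: bytes, method: str,
               included_methods=("GET", "POST", "PUT"),
               min_size: int = 1024):
    """Compress a response body when the request method is in the included
    set and the body is large enough for compression to pay off.
    Returns the (possibly compressed) body and any extra response headers."""
    if method not in included_methods or len(body) < min_size:
        return body, {}
    return gzip.compress(body), {"Content-Encoding": "gzip"}
```

With `included_methods` widened beyond GET, POST and PUT responses are compressed exactly like GET responses; there is nothing method-specific about the compression itself.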
Should PWA-related resources be served with any kind of cache headers from the server, or should we move classic HTTP caching out of the way by turning it off completely?
Namely, what should the HTTP cache headers be for:
manifest file
Related to that: how do new versions of the manifest file (e.g. a changed favicon) get to the client?
service worker js file
(this one is a bit tricky, because browsers check for new versions every 24 hours, so some caching might be good?)
index.html (entry point for spa)
My understanding was that it should be turned off completely and all caching handled by the service worker, but there seems to be conflicting information out there, and it is hard to extract best practices.
There's some guidance at https://web.dev/reliable/http-cache, along with a number of other resources on the web.
In general, building a PWA and introducing a service worker doesn't change the best practices that you should follow for your HTTP caching.
For assets that include versioning information in their URL (like /v1.0.0/app.js or /app.1234abcd.js), where you know that the contents of a given URL won't ever change, you should use Cache-Control: max-age=31536000.
For assets that don't include versioning information in their URL (like most HTML documents, and also /manifest.json, if you don't include a hash there), you should set Cache-Control: no-cache along with ETag or Last-Modified, to ensure that a previously cached response is revalidated before being used.
For your service worker file itself, modern browsers will ignore the Cache-Control header value you set by default, so it doesn't really matter. But it's still a best practice to use Cache-Control: no-cache so that older browsers will revalidate it before using it.
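The policy above could be sketched as a small helper that picks headers based on the URL (a hypothetical example; the patterns used to recognize "versioned" URLs are assumptions for illustration):

```python
import re

def cache_headers(path: str) -> dict:
    """Pick a Cache-Control value following the policy described above:
    long-lived caching for versioned/hashed URLs, revalidation for the rest."""
    versioned = (
        re.match(r"/v\d+\.\d+\.\d+/", path)      # e.g. /v1.0.0/app.js
        or re.search(r"\.[0-9a-f]{8}\.", path)   # e.g. /app.1234abcd.js
    )
    if versioned:
        return {"Cache-Control": "max-age=31536000"}
    # Unversioned assets (HTML, manifest.json, the service worker file):
    # force revalidation; pair this with ETag or Last-Modified.
    return {"Cache-Control": "no-cache"}
```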
I am working with ASP.NET MVC 4 output caching and for some reason I can't get it working. What I am trying to do is standard, but the output cache never kicks in. I think it may be due to cookies being changed, and therefore nothing is cached. I want output caching on the client. My whole site is served over HTTPS and all requests are made over HTTPS.
Some questions I have:
how can I set up output caching to ignore any changes in cookies?
what should I look for in the response headers to understand whether output caching is working or not?
setting the location to ServerAndClient - am I correct in saying this results in caching on the server and in the client's browser?
Thanks.
Further info:
What I have noticed via Fiddler is that the HTTP request headers contain Cache-Control: max-age=0, but I have no idea how or why it's being set to this.
By default, the caching attribute ignores cookies!
You can set breakpoints in your controllers to see whether output caching works or not; if a cached response is served, the breakpoint won't be hit. And it seems you already know where to look in the headers ("... contains Cache-Control: max-age=0 but ...").
Yes, that is correct.
And I suggest you set Duration on the caching filter.
Look in the web.config file to see whether the cacheControlMode attribute is set to something strange! And if it is s-maxage=0, it is because of a shared proxy.
Set the attribute like this on the controllers that you want to cache:
[OutputCache( Duration = 360 )]
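On the "what should I look for in the response headers" question: inspect the Cache-Control value of the response. As a language-neutral sketch (a hypothetical helper, not part of ASP.NET), a working output cache with Duration = 360 should produce a header you could pick apart like this:

```python
def parse_cache_control(header: str) -> dict:
    """Parse a Cache-Control header value into a dict, e.g.
    "public, max-age=360" -> {"public": True, "max-age": 360}.
    Illustrative only, not a full RFC-grade parser."""
    result = {}
    for part in header.split(","):
        part = part.strip()
        if not part:
            continue
        if "=" in part:
            key, _, value = part.partition("=")
            result[key.strip()] = int(value) if value.isdigit() else value
        else:
            result[part] = True
    return result
```

A max-age of 0 (as seen in Fiddler above) means the response is not being cached; with caching working, you would expect to see something like public, max-age=360 instead.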
I am optimizing my web page by implementing caching, so when I want the browser not to take data from the cache, I append a dynamic number as a query value.
eg: google.com?val=823746
But sometimes, when I do want the browser to take data from the cache for the URL below, it makes a new HTTP request to the server instead of using the cache. Is that because of the question mark in the URL?
eg: http://google.com?
Please provide a link to some reference documentation.
Thanks in advance.
Regards,
Navin
Use appropriate HTTP headers.
Search for Pragma: no-cache and Expires.
Browsers may not cache URLs that contain a query string (the part after ?) unless the headers indicate the expiry time explicitly.
Cache policy is not the same across all browsers. If you don't specify appropriate headers, the results may be even more unpredictable.
Since query strings are typically used with dynamically generated pages, the browser may take that as a hint and fire a new request even if the query string is the same.
For example, a desktop browser may err on side of caution and fire a new request. On the other hand a mobile browser with aggressive cache policy may pull the page from cache.
Note: Please correct me if any of my assumptions are wrong. I'm not very sure of any of this...
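The cache-busting technique from the question can be sketched like this (bust_cache is a hypothetical helper; the val parameter name matches the question's example):

```python
import random
from urllib.parse import urlparse

def bust_cache(url: str) -> str:
    """Append a random query value so the browser treats each request as a
    distinct URL and bypasses its cache."""
    sep = "&" if urlparse(url).query else "?"
    return f"{url}{sep}val={random.randint(0, 999999)}"
```

The reverse case — reliably getting a cached response — is controlled by the response headers, not by removing the query string, which is the point of the answer above.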
I have been playing around with HTTP caching on Heroku and trying to work out a nice way to differentiate between mobile and desktop requests when caching with Varnish on Heroku.
My first idea was that I could set a Vary header so the cache is varied on If-None-Match. As Rails automatically sends back ETags generated from a hash of the content, the ETag would differ between desktop and mobile requests (different templates), and so the cache would eventually hold two versions (not fact, just my original thinking). I have been playing around with this, but I don't think it works.
Firstly, I can't wrap my head around when/if anything gets cached, as surely requests with If-None-Match will be conditional GETs anyway? Secondly, in practice, fresh requests (ones without If-None-Match) sometimes receive the mobile site. Is this because the cache doesn't know whether to serve the mobile or desktop cached version when the If-None-Match header isn't there?
As it probably sounds, I am rather confused. Will this approach work in any way, or am I being silly? Also, is there any way to achieve separate cached versions if I can't touch the Varnish config at all (as I am on Heroku)?
The exact code I am using in Rails to set the cache headers is:
response.headers['Cache-Control'] = 'public, max-age=86400'
response.headers['Vary'] = 'If-None-Match'
Edit: I am aware I can use Vary: User-Agent, but I am trying to avoid it if possible, because it has a high miss rate (many, many user agents).
You could try Vary: User-Agent. However you'll have many cached versions of a single page (one for each user agent).
Another solution may be to detect mobile browsers directly in the reverse proxy: set an X-Is-Mobile-Browser client header before the reverse proxy attempts to find a cached page, set Vary: X-Is-Mobile-Browser on the backend server (so that the reverse proxy will only cache two versions of the same page), and replace that header with Vary: User-Agent before sending the response to the client.
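The normalization step in that approach — collapsing the many-valued User-Agent into a two-valued header the cache can vary on — could look roughly like this (a hypothetical sketch; real mobile detection is considerably more involved than a substring check):

```python
# Substring hints are an assumption for illustration, not a robust UA parser.
MOBILE_HINTS = ("mobile", "iphone", "ipad", "android")

def mobile_flag(user_agent: str) -> str:
    """Map a User-Agent string to the value of a two-valued
    X-Is-Mobile-Browser header, so a cache varying on that header stores
    at most two variants per URL instead of one per user agent."""
    ua = user_agent.lower()
    return "1" if any(hint in ua for hint in MOBILE_HINTS) else "0"
```

This is what keeps the miss rate low compared with Vary: User-Agent: every user agent collapses to either "1" or "0" before the cache lookup happens.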
If you cannot change your Varnish configuration, you have to use different URLs for mobile and desktop pages. You can add a URL parameter (?mobile=true), add a segment to the path (yourdomain.com/mobile/news), or use a different host (like m.yourdomain.com).
This makes a lot of sense, because at some point you will want to differentiate content and structure for mobile devices (I've seen this many times, both in CMSs and in applications). People simply do different things, or look for different information, on mobile devices...