Some questions with ASP.NET MVC output caching

I am working with ASP.NET MVC 4 output caching and for some reason I can't get it working. What I am looking to do is standard, but the output caching never kicks in. I think it may be due to cookies being changed and therefore nothing being cached. I want the output cached on the client. My whole site runs over HTTPS and all requests are made over HTTPS.
Some questions I have:
How can I set up output caching to ignore any changes in cookies?
What should I look for in the response headers to understand whether output caching is working or not?
Setting the location to ServerAndClient - am I correct in saying this results in caching on the server and in the client's browser?
Thanks.
Further info:
What I have noticed via Fiddler is that the HTTP request headers contain Cache-Control: max-age=0, but I have no idea how or why it's being set to this.

By default the caching attribute ignores cookies!
You can set breakpoints in your controllers to see whether output caching works or not, and it seems you already know where to look in the headers: "... contains Cache-Control: max-age=0 but ..."
Yes, that is correct.
I also suggest you set Duration on the caching filter.
Look in the web.config file to see whether the cacheControlMode attribute is set to something strange! And if it is s-maxage=0, it is because of a shared proxy.
Set the attribute like this on the controllers that you want to cache:
[OutputCache( Duration = 360 )]
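As a minimal sketch (the controller and action names here are illustrative, not taken from the question), caching on both the server and the client's browser while keeping a single cached copy could look like this:

using System.Web.Mvc;
using System.Web.UI;

public class ProductsController : Controller
{
    // Cache the rendered output for 6 minutes on the server and in the
    // client's browser. VaryByParam = "none" keeps one cached copy
    // regardless of query-string values.
    [OutputCache(Duration = 360, Location = OutputCacheLocation.ServerAndClient, VaryByParam = "none")]
    public ActionResult Index()
    {
        return View();
    }
}

One way to confirm the server side is working, as noted above, is to put a breakpoint in the action: while a cached copy is still valid, repeat requests should not hit the breakpoint.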

Related

SPA - Should the server's HTTP cache be turned off for all PWA-related resources?

Should any of the PWA-related resources be served with any kind of cache headers from the server, or should we move classic HTTP caching out of our way by turning it off completely?
Namely, what should the HTTP cache headers be for:
manifest file
Related to this, how do new versions of the manifest file (a changed favicon, for example) get to the client?
service worker js file
(this one is a bit tricky because browsers check for new versions every 24 hours so some caching might be good?)
index.html (entry point for the SPA)
My understanding was that it should be turned off completely and all caching should be handled by the service worker, but there is conflicting information out there and it is hard to extract best practices.
There's some guidance at https://web.dev/reliable/http-cache, along with a number of other resources on the web.
In general, building a PWA and introducing a service worker doesn't change the best practices that you should follow for your HTTP caching.
For assets that include versioning information in their URL (like /v1.0.0/app.js or /app.1234abcd.js), where you know that the contents of a given URL will never change, you should use Cache-Control: max-age=31536000.
For assets that don't include versioning information in their URL (like most HTML documents, and also /manifest.json, if you don't include a hash there), you should set Cache-Control: no-cache along with ETag or Last-Modified, to ensure that a previously cached response is revalidated before being used.
For your service worker file itself, modern browsers will ignore the Cache-Control header value you set by default, so it doesn't really matter. But it's still a best practice to use Cache-Control: no-cache so that older browsers will revalidate it before using it.
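As a rough illustration only (framed in ASP.NET MVC because that is what the other questions on this page use; the controller, action names, and file paths are hypothetical), those policies might be emitted like this:

using System;
using System.Web;
using System.Web.Mvc;

public class AssetsController : Controller
{
    // Hypothetical action serving a fingerprinted asset such as /app.1234abcd.js:
    // the URL changes whenever the content changes, so a one-year max-age is safe.
    public ActionResult AppBundle()
    {
        Response.Cache.SetCacheability(HttpCacheability.Public);
        Response.Cache.SetMaxAge(TimeSpan.FromSeconds(31536000));
        return File(Server.MapPath("~/dist/app.1234abcd.js"), "application/javascript");
    }

    // Hypothetical action serving manifest.json (the same idea applies to index.html):
    // send Cache-Control: no-cache so the browser revalidates before reusing it,
    // and pair it with an ETag or Last-Modified validator.
    public ActionResult Manifest()
    {
        Response.Cache.SetCacheability(HttpCacheability.NoCache);
        return File(Server.MapPath("~/manifest.json"), "application/json");
    }
}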

MVC 4 Disable Client Image Caching

I am working on an MVC 4 app designed to run on iOS. I have encountered a problem where the app crashes when the local cache exceeds 5MB (due to a very high number of images on the site).
I am trying to disable local caching. I have tried the meta tags suggested in other posts and they do not work. I have also tried decorating controller actions with
[OutputCache(Duration = 1, Location = OutputCacheLocation.None)]
This doesn't work because we use partial views and I get an exception saying that the location parameter is not supported on partial views.
Any advice?
Have you tried defining response headers? More specifically, the following header:
Cache-Control - This header must be present in the response from the server to enable HTTP caching by a client. The value of this header may include information like its max-age (how long to cache a response), and whether the response may be cached with public or private access, or no-cache (not at all). See the Cache-Control section of RFC 2616 for full details.
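As a hedged sketch only (the attribute name below is made up, not an existing API), you could set those headers yourself from an action filter instead of relying on [OutputCache], which avoids the restriction on partial views because it simply modifies the outgoing response:

using System;
using System.Web;
using System.Web.Mvc;

public class NoClientCacheAttribute : ActionFilterAttribute
{
    public override void OnResultExecuting(ResultExecutingContext filterContext)
    {
        // Tell the client (and any proxies) not to cache or store this response.
        var cache = filterContext.HttpContext.Response.Cache;
        cache.SetCacheability(HttpCacheability.NoCache); // Cache-Control: no-cache
        cache.SetNoStore();                              // adds no-store
        cache.SetExpires(DateTime.UtcNow.AddYears(-1));  // an already-expired Expires date
        base.OnResultExecuting(filterContext);
    }
}

// Usage (illustrative): decorate the actions that serve the images.
// [NoClientCache]
// public ActionResult Thumbnail(int id) { ... }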
We ended up fixing this bug by using CSS background images rather than <img> tags.

Possible to disable rack-cache on a per-request basis in Rails 3.2?

I have been attempting to get streaming working in Rails 3.2 on Heroku (see my SO post here: Rails 3.2 streaming).
I am coming to the conclusion that rack-cache is causing the problem. Disabling it using config.middleware.delete(Rack::Cache) in production.rb seems to fix it. This, obviously, disables it for my entire app.
I only want it disabled for the one streaming request (which is on the admin side and will be used infrequently). Is this possible? It'd be a major bummer to lose caching for the sake of one small (but required) admin feature.
Thanks very much!!!
Edit: I have attempted setting the headers to not cache the action in question, but Rack::Cache is still causing the streaming to fail. Totally disabling it is the only solution I have found so far.
I ended up not needing to disable Rack::Cache. I just needed to add this to my response:
self.response.headers['Last-Modified'] = Time.now.ctime.to_s
While you can't disable it, you might not need to; you may just need to bypass the caching mechanism.
Per the source here and here, if the Cache-Control: no-cache or Pragma: no-cache header is set, Rack::Cache won't attempt to pull the request from the cache. That doesn't disable it, but it does let you ensure that a request that shouldn't be cached doesn't end up returning a cached response.
Additionally, you can ensure that Rack::Cache never caches a response for a given action with something like:
response.headers['Cache-Control'] = 'private,max-age=0,must-revalidate,no-store'
in your controller action. This will ensure that Rack::Cache (and any other upstream proxies) don't cache the response, resulting in an always-fresh hit to your backend.
If this fails, then you're likely having issues due to the forward method in context.rb. There doesn't seem to be a way to bypass it, so you'd probably want to patch Rack::Cache to just invoke #call if a certain header is set.

'Vary: If-None-Match' to cache mobile and desktop requests separately

Note: Please correct me if any of my assumptions are wrong. I'm not very sure of any of this...
I have been playing around with HTTP caching on Heroku and trying to work out a nice way to differentiate between mobile and desktop requests when caching with Varnish on Heroku.
My first idea was that I could set a Vary header so the cache varies on If-None-Match. As Rails automatically sends back ETags generated from a hash of the content, the ETag would differ between desktop and mobile requests (different templates), and so it would eventually cache two versions (not fact, just my original thinking). I have been playing around with this but I don't think it works.
Firstly, I can't wrap my head around when/if anything gets cached, as surely requests with If-None-Match will be conditional GETs anyway? Secondly, in practice fresh requests (ones without If-None-Match) sometimes receive the mobile site. Is this because the cache doesn't know whether to serve up the mobile or desktop cached version when the If-None-Match header isn't there?
As it probably sounds, I am rather confused. Will this approach work in any way or am I being silly? Also, is there any way to achieve separate cached versions if I am unable to reach the Varnish config at all (as I am on Heroku)?
The exact code I am using in Rails to set the cache headers is:
response.headers['Cache-Control'] = 'public, max-age=86400'
response.headers['Vary'] = 'If-None-Match'
Edit: I am aware I can use Vary: User-Agent but I am trying to avoid it if possible, due to it having a high miss rate (many, many user agents).
You could try Vary: User-Agent. However you'll have many cached versions of a single page (one for each user agent).
Another solution may be to detect mobile browsers directly in the reverse proxy, set an X-Is-Mobile-Browser client header before the reverse proxy attempts to find a cached page, set Vary: X-Is-Mobile-Browser on the backend server (so that the reverse proxy will only cache two versions of the same page), and replace that header with Vary: User-Agent before sending the response to the client.
If you cannot change your Varnish configuration, you have to use different URLs for mobile and desktop pages. You can add a URL parameter (?mobile=true), add a segment to your path (yourdomain.com/mobile/news), or use a different host (like m.yourdomain.com).
This makes a lot of sense because (I've seen this many times, both in CMSs and applications) at some point in time you want to differentiate content and structure for mobile devices. People just do different things or are looking for different information on mobile devices...

YSlow recommendations: in IIS, why wouldn't Enable Content Expiration be checked by default?

I just ran YSlow against my website and I had a question about expiry headers. YSlow gave me a Grade F on "Add Expires headers": there are 20 static components without a far-future expiration date. These are all CSS or JS files.
Right now when I go to IIS (6.0) and open the HTTP Headers tab, Enable Content Expiration is NOT checked. From reading this, it seems like this is the right thing to do, as the browser will then cache the content. So I am confused why YSlow is complaining. Also, it sounds like browsers will cache this data by modified date anyway, so is this whole thing meaningless?
So if setting this is a no-brainer, why isn't it the default behavior?
Can someone please clarify?
There's no contradiction here. What you need to do is set content expiration on the folders that contain static content, such as your image, CSS, and script folders. You can set content expiration on a per-folder basis in IIS and other web servers.
A browser has no idea whether content is 'static' or not; it literally has no way of knowing, and YSlow is most likely only guessing. It's probably guessing correctly... but having incorrect Expires values set by default in the web server could cause browsers to cache dynamic content that you do not want them caching at all.
That's why it's not set like that by default.
