MVC's HTTP Caching - Last-Modified response header always equals Date - asp.net-mvc

I'm not fully understanding how .NET MVC's HTTP caching works because it doesn't seem like it's actually retrieving cached resource files. I'm thinking I need to add some additional code somewhere...
First, let's take a look at how I've set up HTTP caching on static content (i.e. images). In my web.config, I have the following:
<system.webServer>
<staticContent>
<clientCache cacheControlMode="UseExpires" httpExpires="Tue, 19 Jan 2038 03:14:07 GMT" />
</staticContent>
</system.webServer>
This results in the images in my application appearing to cache properly. When I look at the response headers for an image, I see this (unnecessary headers removed):
Date:Thu, 27 Feb 2014 16:27:48 GMT
ETag:"086f8d199a4ce1:0"
Expires:Tue, 19 Jan 2038 03:14:07 GMT
Last-Modified:Thu, 29 Aug 2013 09:26:20 GMT
I'm seeing an ETag value, which is good, and my Expires is what it should be. Additionally, the Last-Modified date is in the past. I understand the Last-Modified date to be the date the file was last changed on the server.
Now let's look at the response headers for a javascript file that has been optimized by MVC. As a reminder, this article states that "Bundles set the HTTP Expires Header one year from when the bundle is created."
Cache-Control:public
Date:Thu, 27 Feb 2014 16:44:16 GMT
Expires:Fri, 27 Feb 2015 16:44:16 GMT
Last-Modified:Thu, 27 Feb 2014 16:44:16 GMT
Vary:User-Agent
The response headers for the MVC-bundled file are missing an ETag, for one. There is a Cache-Control value of "public" which wasn't present on the static content's response headers. Lastly, the Expires is one year after the Last-Modified date, which is correct, but the Last-Modified date is always the same as the Date value. These response headers look to me like what they'd be when a resource is requested from the server for the first time and cached, not when it's been subsequently requested and retrieved from cache.
Thanks in advance for any insight.
UPDATE: It actually seems to be caching in IE. The Last-Modified date on subsequent requests remains a value in the past. I'm not seeing this in FF or Chrome, though. I confirmed that in both of those browsers, I haven't disabled caching. What gives?

First of all, system.webServer/staticContent/clientCache is for static resources (files) only. So it works if you directly access pictures.
A bundle is a dynamic resource (a handler is generating the content). That is why the configuration directive does not apply.
Secondly, ETag, Expires, and Last-Modified are three different caching techniques. You should not combine them carelessly, as they work in different ways.
Expires tells the browser to keep the file in cache until a specified date. The browser will never call the server until that date.
ETag is a dynamic cache mechanism. The browser will always call the server but the server may not respond with the content if it has not changed.
Last-Modified is an older dynamic cache mechanism. It works the same way as ETag but has a problematic requirement: exposing the correct modification date, which is not an easy thing to determine when creating dynamic content.
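To make the difference between Expires and the two validators concrete, here is a minimal Python sketch of how a validating server answers conditional requests. The resource body, ETag value, and date are made up for illustration:

```python
from email.utils import parsedate_to_datetime

# Hypothetical in-memory resource: content plus its two validators.
RESOURCE = {
    "body": b"console.log('hello');",
    "etag": '"abc123"',
    "last_modified": "Thu, 29 Aug 2013 09:26:20 GMT",
}

def handle_get(request_headers):
    """Return (status, body) the way a validating server would."""
    # ETag check: If-None-Match takes precedence over If-Modified-Since.
    if request_headers.get("If-None-Match") == RESOURCE["etag"]:
        return 304, b""
    ims = request_headers.get("If-Modified-Since")
    if ims and parsedate_to_datetime(ims) >= parsedate_to_datetime(RESOURCE["last_modified"]):
        return 304, b""
    return 200, RESOURCE["body"]
```

With Expires, by contrast, none of these requests would be made at all until the expiry date passes; the browser serves straight from its cache.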
I think that combining multiple techniques should be reserved for well-thought-out cases only. See this answer for details.
You may want to read the basics: HTTP caching
And then, for your issue: I am not sure why so many headers are exposed by your app. The presence of Last-Modified on a bundle is hard to explain. A bundle is a virtual thing that exposes various files combined as one, so it has no real date of last modification. Are you using extra caching code/modules?

Related

Rails default behavior: Why is Rails not generating different Etags by default or returning 304 Not Modified?

I am inspecting a Rails response. No http headers have been purposefully set to be returned from the server.
The response includes Cache-Control: max-age=0, private, must-revalidate and Etag. The response is a 200 no matter how many times I make the same request or within how short a period of time.
My questions are:
How is the Etag being generated if I didn't call stale? or fresh_when in the response? The Etag is always the same for this endpoint.
Regardless of how the Etag is being generated, if it is always the same, why isn't the server noticing this and responding with a 304 rather than a 200?
Thank you
One of the reasons might be the presence of mini_profiler, as by default that gem overrides some cache-related headers (see here for why and how to fix it).
Another is that you have config.action_controller.perform_caching = false.
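As for where the ETag comes from at all: Rails' default ETag is produced by Rack's ETag middleware, which derives it from a digest of the response body, so an unchanged body always yields the same tag. A rough Python sketch of that idea (MD5 is what older Rack versions used; the function names here are mine):

```python
import hashlib

def body_etag(body: bytes) -> str:
    # Weak ETag derived from the response body, in the spirit of Rack::ETag.
    return 'W/"%s"' % hashlib.md5(body).hexdigest()

def conditional_status(body: bytes, if_none_match: str) -> int:
    # A server that revalidates answers 304 when the client's tag still matches.
    return 304 if body_etag(body) == if_none_match else 200
```

This also explains the second question: a 304 only happens when the client sends the tag back in If-None-Match and something in the stack compares it; middleware that rewrites cache headers can break that handshake.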

Is there a way to get Indy to accept expired cookies?

Using Delphi XE2 Update 4 and Indy components 10.5.8.0 as installed by Delphi.
A website I am trying to work with is deliberately sending me a cookie with an already-expired date, for security reasons. It is deliberate on their part, I have to live with it, and other developers working with them have managed to do so.
The server response to my IdHTTP.Get() looks like this:
Recv 8/30/2012 3:06:15 PM: HTTP/1.1 200 OK<EOL>
Date: Thu, 30 Aug 2012 19:06:35 GMT<EOL>
Server: Apache/2.2.3 (CentOS)<EOL>
X-Powered-By: PHP/5.1.6<EOL>
Set-Cookie: dati=eJxLtDKxqi62MrdSykyJLy1OLVKyLrYytVIyNjcxtISyc%2FLTM%2FNAbKAqt4CgYJfUMhDPEqinOB4omZ6aAuIbWilFgmkDkGGpeSWZaZkQ8wyNrZTSchKLMwryM%2FNK4ouKlaxrAYdvJUQ%3D;
expires=Thu, 01-Jan-1970 00:03:00 GMT<EOL>
Content-Length: 16<EOL>
Connection: close<EOL>
Content-Type: text/plain;; charset=ISO-8859-1<EOL><EOL>
0 24.141.251.145
Note the 1970 expiry! I have been using Indy for all of 3 days now, and as near as I can tell this attempt to create a cookie is simply ignored by TIdHTTP and the associated CookieManager.
Assuming I need to receive and use the cookie anyway what would be the proper approach to capture it, or should I just run away screaming now? I have a lot of Delphi experience but this is my first foray into the wonderful world of internet connectivity and this expired-on-arrival cookie business is leaving me scratching my head!
Sending an expired cookie is how a webserver deletes a cookie on the client end. Indy's cookie manager supports that. When it receives a cookie, it will delete any existing cookie of the same name/domain/path trio and will then discard the new cookie if it is already expired. This is hard-coded behavior.
If, for whatever reason, you have to keep the expired cookie (which does not make sense as no webclient should ever do that), you will have to use the TIdCookieManager.OnNewCookie event to alter the expiration of the cookie after it is parsed by TIdCookie but before it is processed by TIdCookieManager. You can set the TIdCookie.Expires property to a future date, or to 0.0 to disable its expiration. You will have to do this anyway, as TIdCookieManager implements cleanup of expired cookies (which is triggered whenever TIdHTTP asks for cookies to send in an HTTP request), so if you don't alter the expiration then the cookie will just get discarded anyway at a later time.
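The same rescue can be sketched in Python with the standard library's cookie parser (the cookie value and dates below are illustrative, and the function name is mine). The idea mirrors bumping TIdCookie.Expires inside OnNewCookie before the manager gets a chance to discard the cookie:

```python
from http.cookies import SimpleCookie

def rescue_expired_cookie(set_cookie_header,
                          new_expires="Tue, 19 Jan 2038 03:14:07 GMT"):
    # Parse the Set-Cookie header, then overwrite the already-past expiry
    # before handing the cookie to a jar -- otherwise an RFC-conforming
    # client treats the expired cookie as a deletion and drops it.
    jar = SimpleCookie()
    jar.load(set_cookie_header)
    for morsel in jar.values():
        morsel["expires"] = new_expires
    return jar

jar = rescue_expired_cookie(
    "dati=abc123; expires=Thu, 01-Jan-1970 00:03:00 GMT; path=/")
```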

Unable to completely force browsers to cache resources - gsCacheBuster comes in

Proper caching is very important for us as we aim to reduce traffic.
I've configured communication for resources as follows:
This is the request URL:
http://www.mydomain.com/resources/product/assets/swf/modules/loader.swf?ver=1
These are the response headers:
Accept-Ranges:bytes
Connection:Keep-Alive
Content-Length:14622
Content-Type:application/x-shockwave-flash
Date:Tue, 22 May 2012 09:16:41 GMT
ETag:"7804f-391e-4c08e046d3ec0"
Keep-Alive:timeout=15, max=96
Last-Modified:Mon, 21 May 2012 16:01:39 GMT
P3P:CP="HONK"
Server:Apache
I need to force the browsers to cache the resources I pass back. As far as I know these headers should be sufficient to cache resources. But...
For some resources a strange thing, called gsCacheBuster, comes in:
Request URL:
http://www.mydomain.com/resources/product/assets/images/image1.png
?gsCacheBusterID=1337684498911&purpose=audit&ver=1
The extra gsCacheBusterID parameter changes the URL on every request, so the resource is never served from cache.
As you can see we've implemented functionality in the client (which is Flash in the case) to pass a ver parameter along the request to make sure the current version is being downloaded. This way we aim to maintain versioning of resources and force browsers to cache resources with the same version.
I have no idea why this cacheBuster appears and couldn't find any info. It happens in all browsers.
It turns out it is a LoaderMax property, since we use the GreenSock library in our Flash client. It can be disabled by setting the noCache property to false, and the audit parameter can be removed as well.
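The versioning scheme the question describes can be sketched language-agnostically. Unlike the random gsCacheBusterID, a stable ver parameter keeps the URL (and therefore the cache key) constant until the version is deliberately changed (function name is mine):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def versioned_url(url, version):
    # Append a stable "ver" parameter so the cache key only changes
    # when the resource version changes.
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["ver"] = str(version)
    return urlunparse(parts._replace(query=urlencode(query)))
```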

Why/how do browsers know to cache content (html,css,js,etc) when not explicitly instructed to do so

I was looking at Chirpy for css/js minifying,compression, etc.
I noticed it doesn't support caching. It doesn't have any logic for sending expires headers, etags, etc.
The absence of this feature made me question whether caching content is as much of a concern as I thought; YSlow! grades it, so I'm a little confused. Now I'm researching caching and cannot explain why this CSS file, SuperFish.css, is being retrieved from cache.
Visit http://www.weirdlover.com (developer of Chirpy)
Look at initial network track. Notice, there is no expiration header for SuperFish.css.
Revisit the page and inspect the network trace again. Now SuperFish.css is retrieved from cache.
Why is the SuperFish.css retrieved from cache upon revisiting the page? This happens even when I close all instances of chrome and then revisit the page.
This seems to fall within the HTTP specification.
13.4 Response Cacheability
Unless specifically constrained by a cache-control (section 14.9) directive, a caching system MAY always store a successful response (see section 13.8) as a cache entry, MAY return it without validation if it is fresh
13.2.2 Heuristic Expiration
Since origin servers do not always provide explicit expiration times, HTTP caches typically assign heuristic expiration times, employing algorithms that use other header values (such as the Last-Modified time) to estimate a plausible expiration time.
It would seem that by not providing a Cache-Control header and leaving out the Expires header, the client is free to use a heuristic to generate an expiry date and then cache the response based upon that.
The presence of an ETag has no effect on this, since the ETag is used to re-validate an expired cache entry, and in this case Chrome considers the cached entry to be fresh (the same applies to Last-Modified), so it hasn't yet expired.
The general principle being if the origin server is concerned with freshness it should explicitly state it.
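The heuristic the spec mentions is commonly implemented as a fraction of the interval between Date and Last-Modified (RFC 7234 suggests 10% as a typical value). A sketch in Python, with illustrative header values:

```python
from datetime import datetime, timezone

def heuristic_freshness(date_header, last_modified_header):
    """Common heuristic: cache for 10% of the time between
    Last-Modified and Date, returned in seconds."""
    fmt = "%a, %d %b %Y %H:%M:%S GMT"
    date = datetime.strptime(date_header, fmt).replace(tzinfo=timezone.utc)
    lm = datetime.strptime(last_modified_header, fmt).replace(tzinfo=timezone.utc)
    return (date - lm).total_seconds() * 0.10
```

So a file last modified long ago is heuristically considered fresh for a long time, which is exactly why a header-less CSS file can keep coming from cache across visits.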
In this case (when the server doesn't return an Expires header), the browser should make the HTTP request with an If-Modified-Since header, and if the server returns HTTP 304 Not Modified then the browser gets the data from the cache.
But, as I see it, nowadays browsers don't make any request at all when the data is in the cache. I think they behave this way for better response times.

MVC Framework Browser Caching Issue with RC1

In my latest project which is in RC1 I have noticed that I have this browser caching issue that I just can't shake. This is what my header looks like
HTTP/1.1 200 OK
Date: Tue, 03 Mar 2009 15:11:34 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
X-AspNet-Version: 2.0.50727
X-AspNetMvc-Version: 1.0
Cache-Control: private
Expires: Tue, 03 Mar 2009 15:11:34 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 4614
Now, technically, if this is private it shouldn't have an expiration date on the content, right? I've tried no-cache as well, with the same results. Does anybody have experience with this particular issue?
Cache-Control: private only specifies that the response is only intended for a single user and should not be stored in a shared cache (say, in a proxy) and used to serve requests for other users. I don't see anything in the protocol documentation that would preclude the use of an Expires header with a value. In fact, it seems a perfectly reasonable thing to say "use this for subsequent requests for this user only, but not after this time." There are other values for Cache-Control where Expires may not make sense, but I believe that the protocol has a means for disambiguating between conflicting headers (See section 4 of the protocol docs).
Quoting from Section 16.2 of the HTTP 1.1 protocol docs:
private
Indicates that all or part of the response message is intended for
a single user and MUST NOT be cached by a shared cache. This
allows an origin server to state that the specified parts of the
response are intended for only one user and are not a valid
response for requests by other users. A private (non-shared)
cache MAY cache the response.
Note: This usage of the word private only controls where the
response may be cached, and cannot ensure the privacy of the
message content.
There's no reason why private content can't be cached; it's just that it should only be cached by the browser in the current user's context. It should not be cached server-side or by other caches such as a proxy server.
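That storage rule can be summarized in a few lines of Python. This is a simplified sketch of the quoted spec text, not a full RFC implementation, and the function name is mine:

```python
def may_store(headers, shared_cache):
    """Decide whether a cache may store a response: a private response
    may live in the browser's cache but never in a shared/proxy cache."""
    cc = [d.strip().lower()
          for d in headers.get("Cache-Control", "").split(",") if d.strip()]
    if "no-store" in cc:
        return False  # never storable anywhere
    if "private" in cc and shared_cache:
        return False  # browser cache only
    return True
```

Under this rule, "Cache-Control: private" plus an Expires header is coherent: the browser may store the response and reuse it until the given date; a proxy may not store it at all.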
