Say I have a service worker that populates the cache with the following working code when it is installed:
async function install() {
  console.debug("SW: Installing ...");
  const cache = await caches.open(CACHE_VERSION);
  await cache.addAll(CACHE_ASSETS);
  console.log("SW: Installed");
}

async function handleInstall(event) {
  event.waitUntil(install());
}

self.addEventListener("install", handleInstall);
When the browser performs cache.addAll(), will it use its own internal HTTP cache, or will it always download the content from the site? This is important because, if one creates a new service worker release with new static assets, the service worker may end up caching the old versions.
If not, then I guess one still has to use hashed/versioned names for static assets, something I was hoping service workers would make unnecessary.
cache.addAll()'s behavior is described in the service worker specification, but here's a more concise summary:
1. For each item in the parameter array, if it's a string and not a Request, construct a new Request using that string as input.
2. Perform fetch() on each request and get a response.
3. As long as the response has an ok status, call cache.put() to add the response to the cache, using the request as the key.
To answer your question, the most relevant step is 1., as that determines what kind of Request is passed to fetch(). If you just pass in a string, then there are a lot of defaults that will be used when implicitly constructing the Request. If you want more control over what's fetch()ed, then you should explicitly create a Request yourself and pass that to cache.addAll() instead of passing in strings.
For instance, this is how you'd explicitly set the cache mode on all the requests to 'reload', which always skips the browser's normal HTTP cache and goes to the network for a response:
// Define your list of URLs somewhere...
const URLS = ['/one.css', '/two.js', '/three.js', '...'];
// Later...
const requests = URLS.map((url) => new Request(url, {cache: 'reload'}));
await cache.addAll(requests);
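For completeness, here's a minimal sketch (not from the original answer) of how that could slot into the install() function from the question, reusing its CACHE_VERSION and CACHE_ASSETS constants:
async function install() {
  console.debug("SW: Installing ...");
  const cache = await caches.open(CACHE_VERSION);
  // Wrap each asset URL in a Request that bypasses the browser's HTTP cache.
  const requests = CACHE_ASSETS.map((url) => new Request(url, { cache: "reload" }));
  await cache.addAll(requests);
  console.log("SW: Installed");
}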
When a browser requests an image from the server, the call gets picked up by an API controller in the back end. There, an authorization check must be performed before returning the image, to determine whether the request is allowed.
So I need to add the authorization header, and while searching for the best solution I found this article: https://www.twelve21.io/how-to-access-images-securely-with-oauth-2-0/ I was mostly interested in solution number 4, which uses a Service Worker.
I made my own implementation, I registered a serviceWorker:
if ('serviceWorker' in navigator) {
  console.log("serviceWorker active");
  window.addEventListener('load', onLoad);
}
else {
  console.log("serviceWorker not active");
}

function onLoad() {
  console.log("onLoad is called");
  var scope = {
    scope: '/api/imagesgateway/'
  };
  navigator.serviceWorker.register('/Scripts/ServiceWorker/imageInterceptor.js', scope)
    .then(registration => console.log("ServiceWorker registration successful with scope: ", registration.scope))
    .catch(error => console.error("ServiceWorker registration failed: ", error));
}
and this is in my imageInterceptor:
self.addEventListener('fetch', event => {
  console.log("fetch event triggered");
  event.respondWith(
    fetch(event.request, {
      mode: 'cors',
      credentials: 'include',
      header: {
        'Authorization': 'Bearer ...'
      }
    })
  )
});
When I run my application, I see in my console that the registration seems to be successfully executed, as I see the console.logs printed (serviceWorker active, onLoad is called, and successful registration with the correct scope: https://localhost:44332/api/imagesgateway/).
But when I load an image (https://localhost:44332/api/imagesgateway/...) via the gateway, I still get a 400, and when I put a breakpoint on the back end I see that the authorization header is still null. Also, I don't see the "fetch event triggered" message in my console. In another article it is stated that I can see the registered service workers via chrome://inspect/#service-workers, but I don't see my worker there either.
My question is: Why isn't the authorization header added? Is it because, although the registration seems to go successfully, this isn't actually the case and therefore I don't see the worker in inspect#service-workers either?
You're not seeing "fetch event triggered" in the browser console because your Service Worker isn't allowed to intercept the image requests: your Service Worker script is located in a directory outside the scope of the requests you're interested in.
In order to intercept requests that handle resources at
/api/imagesgateway/
the SW script needs to be located in either
/, /api/, or /api/imagesgateway/. It cannot be located in /some/other/directory/service-worker.js.
This is the reason that your Service Worker registers successfully! There is no problem in registering the SW; the problem lies in what it can do.
More info: Understanding Service Worker scope
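As an illustrative sketch (the Service-Worker-Allowed header is not part of the original answer, and this assumes you can configure the server's response headers), you could either move the script or widen its allowed scope:
// Option 1: serve the worker script from the site root (or /api/), so its
// default scope covers the image requests, and keep the scope option as-is.
navigator.serviceWorker.register('/imageInterceptor.js', {
  scope: '/api/imagesgateway/'
});

// Option 2: keep the script at /Scripts/ServiceWorker/imageInterceptor.js, but
// have the server send "Service-Worker-Allowed: /" on the response for that
// script file, which permits registering the wider scope below.
navigator.serviceWorker.register('/Scripts/ServiceWorker/imageInterceptor.js', {
  scope: '/api/imagesgateway/'
});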
I'm working on testing an OpenID Connect service, using Code and Implicit Flow. I would really like to be able to access the messages I get back from the service, especially the 303 See Other message which has the ID Token.
If someone can advise on how to get to the response messages, I would really appreciate it. Since the service exposes an HTML login page, what happens is a
cy.get("#loginButton").click()
so I don't send a cy.request(), because I want to test the login using the front end.
You should leverage cy.route(). Here's how it works:
Before cy.visit(), you need to add cy.server(); it allows Cypress to intercept every request.
Add an alias to the login request:
cy.route({
  method: "POST",
  url: '/auth/token' // this is just an example; replace it with a part of the real URL called to log in the user
}).as("route_login"); // that's the alias; we'll use it soon
right after the cy.get("#loginButton").click() command, you can wait for the login request to happen
cy.wait("#route_login").then(xhr => {
// you can read the full response from `xhr.response.body`
cy.log(JSON.stringify(xhr.response.body));
});
your final test should be something like
it("Test description", () => {
cy.server();
cy.visit("YOUR_PAGE_URL");
cy.route({
method: "POST",
url: '/auth/token'
}).as("route_login");
cy.get("#loginButton").click();
cy.wait("#route_login").then(xhr => {
// you can read the full response from `xhr.response.body`
cy.log(JSON.stringify(xhr.response.body));
});
});
Let me know if you need more help 😉
cy.server() and cy.route() are deprecated in Cypress 6.x
Use cy.intercept() instead:
cy.intercept('POST', '/organization', (req) => {
  expect(req.body).to.include('Acme Company')
})
Your tests can intercept, modify and wait on any type of HTTP request originating from your app.
Docs: https://docs.cypress.io/api/commands/intercept.html (with examples)
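For reference, a rough sketch of the earlier test rewritten with cy.intercept(); the /auth/token URL is still just the placeholder from the example above:
it("Test description", () => {
  cy.intercept("POST", "/auth/token").as("route_login");
  cy.visit("YOUR_PAGE_URL");
  cy.get("#loginButton").click();
  cy.wait("@route_login").then((interception) => {
    // with cy.intercept() the response body lives on `interception.response.body`
    cy.log(JSON.stringify(interception.response.body));
  });
});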
I am new to Service Workers, and have had a look through the various bits of documentation (Google, Mozilla, serviceworke.rs, Github, StackOverflow questions). The most helpful is the ServiceWorkers cookbook.
Most of the documentation seems to point to caching entire pages so that the app works completely offline, or redirecting the user to an offline page until the browser can redirect to the internet.
What I want to do, however, is store my form data locally so my web app can upload it to the server when the user's connection is restored. Which "recipe" should I use? I think it is Request Deferrer. Do I need anything else to ensure that Request Deferrer will work (apart from the service worker detector script in my web page)? Any hints and tips much appreciated.
Console errors
The Request Deferrer recipe and code doesn't seem to work on its own as it doesn't include file caching. I have added some caching for the service worker library files, but I am still getting this error when I submit the form while offline:
Console: {"lineNumber":0,"message":
"The FetchEvent for [the form URL] resulted in a network error response:
the promise was rejected.","message_level":2,"sourceIdentifier":1,"sourceURL":""}
My Service Worker
/* eslint-env es6 */
/* eslint no-unused-vars: 0 */
/* global importScripts, ServiceWorkerWare, localforage */
importScripts('/js/lib/ServiceWorkerWare.js');
importScripts('/js/lib/localforage.js');

// Determine the root for the routes. I.e., if the Service Worker URL is http://example.com/path/to/sw.js, then the root is http://example.com/path/to/
var root = (function() {
  var tokens = (self.location + '').split('/');
  tokens[tokens.length - 1] = '';
  return tokens.join('/');
})();

// By using Mozilla's ServiceWorkerWare we can quickly set up some routes for a virtual server. It is convenient to review the virtual server recipe before reading this.
var worker = new ServiceWorkerWare();

// So here is the idea. We check whether we are online or not. If we are not online, enqueue the request and provide a fake response.
// Else, flush the queue and let the new request reach the network.
// This function factory does exactly that.
function tryOrFallback(fakeResponse) {
  // Return a handler that…
  return function(req, res) {
    // If offline, enqueue and answer with the fake response.
    if (!navigator.onLine) {
      console.log('No network availability, enqueuing');
      return enqueue(req).then(function() {
        // As the fake response will be reused but Response objects are one use only, we need to clone it each time we use it.
        return fakeResponse.clone();
      });
    }
    // If online, flush the queue and answer from the network.
    console.log('Network available! Flushing queue.');
    return flushQueue().then(function() {
      return fetch(req);
    });
  };
}
// A fake response for when there is no connection. A real implementation could have cached the last collection of updates and kept a local model. For simplicity, that is not implemented here.
worker.get(root + 'api/updates?*', tryOrFallback(new Response(
  JSON.stringify([{
    text: 'You are offline.',
    author: 'Oxford Brookes University',
    id: 1,
    isSticky: true
  }]),
  { headers: { 'Content-Type': 'application/json' } }
)));

// For deletion, let's simulate that all went OK. Notice we are omitting the body of the response. Trying to add a body with a 204 (No Content) status throws an error.
worker.delete(root + 'api/updates/:id?*', tryOrFallback(new Response(null, {
  status: 204
})));

// Creation is another story. We cannot reach the server, so we cannot get the id for the new updates.
// No problem: just say we accept the creation and we will process it later, as soon as we recover connectivity.
worker.post(root + 'api/updates?*', tryOrFallback(new Response(null, {
  status: 202
})));

// Start the service worker.
worker.init();
// By using Mozilla's localforage db wrapper, we can count on a fast setup for a versatile key-value database. We use it to store the queue of deferred requests.

// Enqueue consists of adding a request to the list. Due to the limitations of IndexedDB, Request and Response objects cannot be saved, so we need an alternative representation.
// This is why we call serialize().
function enqueue(request) {
  return serialize(request).then(function(serialized) {
    return localforage.getItem('queue').then(function(queue) {
      /* eslint no-param-reassign: 0 */
      queue = queue || [];
      queue.push(serialized);
      return localforage.setItem('queue', queue).then(function() {
        console.log(serialized.method, serialized.url, 'enqueued!');
      });
    });
  });
}

// Flush is a little more complicated. It consists of getting the elements of the queue in order and sending each one, keeping track of the requests not yet sent.
// Before sending a request we need to recreate it from the alternative representation stored in IndexedDB.
function flushQueue() {
  // Get the queue
  return localforage.getItem('queue').then(function(queue) {
    /* eslint no-param-reassign: 0 */
    queue = queue || [];

    // If empty, nothing to do!
    if (!queue.length) {
      return Promise.resolve();
    }

    // Else, send the requests in order…
    console.log('Sending ', queue.length, ' requests...');
    return sendInOrder(queue).then(function() {
      // Requires error handling. Actually, this assumes all the requests in the queue succeed when reaching the network.
      // It should really empty the queue step by step, only popping a request from the queue once it completes successfully.
      return localforage.setItem('queue', []);
    });
  });
}

// Send the requests inside the queue in order, waiting for the current one before sending the next.
function sendInOrder(requests) {
  // The reduce() chains one promise per serialized request, not allowing progress to the next one until the current one completes.
  var sending = requests.reduce(function(prevPromise, serialized) {
    console.log('Sending', serialized.method, serialized.url);
    return prevPromise.then(function() {
      return deserialize(serialized).then(function(request) {
        return fetch(request);
      });
    });
  }, Promise.resolve());
  return sending;
}
// Serialize is a little bit convoluted because headers is not a simple object.
function serialize(request) {
  var headers = {};
  // for(... of ...) is ES6 notation, but the current browsers that support SW also support this notation, and it is the only way of retrieving all the headers.
  for (var entry of request.headers.entries()) {
    headers[entry[0]] = entry[1];
  }
  var serialized = {
    url: request.url,
    headers: headers,
    method: request.method,
    mode: request.mode,
    credentials: request.credentials,
    cache: request.cache,
    redirect: request.redirect,
    referrer: request.referrer
  };

  // Only if the method is not GET or HEAD is the request allowed to have a body.
  if (request.method !== 'GET' && request.method !== 'HEAD') {
    return request.clone().text().then(function(body) {
      serialized.body = body;
      return Promise.resolve(serialized);
    });
  }
  return Promise.resolve(serialized);
}

// Compared to serialize, deserialize is pretty simple.
function deserialize(data) {
  return Promise.resolve(new Request(data.url, data));
}
var CACHE = 'cache-only';

// On install, cache some resources.
self.addEventListener('install', function(evt) {
  console.log('The service worker is being installed.');
  // Ask the service worker to keep installing until the returned promise resolves.
  evt.waitUntil(precache());
});

// On fetch, use a cache-only strategy.
self.addEventListener('fetch', function(evt) {
  console.log('The service worker is serving the asset.');
  evt.respondWith(fromCache(evt.request));
});

// Open a cache and use `addAll()` with an array of assets to add all of them
// to the cache. Return a promise resolving when all the assets are added.
function precache() {
  return caches.open(CACHE).then(function (cache) {
    return cache.addAll([
      '/js/lib/ServiceWorkerWare.js',
      '/js/lib/localforage.js',
      '/js/settings.js'
    ]);
  });
}

// Open the cache where the assets were stored and search for the requested
// resource. Notice that in case of no match, the promise still resolves,
// but with `undefined` as the value.
function fromCache(request) {
  return caches.open(CACHE).then(function (cache) {
    return cache.match(request).then(function (matching) {
      return matching || Promise.reject('no-match');
    });
  });
}
Here is the error message I am getting in Chrome when I go offline:
(A similar error occurred in Firefox - it falls over at line 409 of ServiceWorkerWare.js)
ServiceWorkerWare.prototype.executeMiddleware = function (middleware, request) {
  var response = this.runMiddleware(middleware, 0, request, null);
  response.catch(function (error) { console.error(error); });
  return response;
};
This is a little more advanced than beginner level, but you will need to detect when you are offline or in a Lie-Fi state (connected, but effectively unusable). Instead of POSTing data to an API or endpoint, you need to queue that data to be synced when you are back online.
This is what the Background Sync API should help with. However, it is not supported across the board just yet. Plus Safari...
So maybe a good strategy is to persist your data in IndexedDB, and when you can connect (Background Sync fires an event for this) you would then POST the data. It gets a little more complex for browsers that don't support service workers (Safari) or don't yet have Background Sync (that will level out very soon).
As always design your code to be a progressive enhancement, which can be tricky, but worth it in the end.
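Here is a minimal sketch of that strategy (the 'send-form-data' tag, the saveToOutbox()/readOutbox()/trySendNow() helpers, and the /api/submit endpoint are hypothetical placeholders, not part of the original answer):
// In page code: save the form data locally, then request a one-off background sync.
async function queueFormData(data) {
  await saveToOutbox(data); // hypothetical IndexedDB write (e.g. via localforage)
  const registration = await navigator.serviceWorker.ready;
  if ('sync' in registration) {
    await registration.sync.register('send-form-data');
  } else {
    await trySendNow(data); // no Background Sync (e.g. Safari): fall back to sending immediately
  }
}

// In the service worker: replay the queued submissions when the sync event fires.
self.addEventListener('sync', (event) => {
  if (event.tag === 'send-form-data') {
    event.waitUntil(
      readOutbox().then((items) =>
        Promise.all(items.map((item) =>
          fetch('/api/submit', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(item)
          })
        ))
      )
    );
  }
});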
Service Workers tend to cache the static HTML, CSS, JavaScript, and image files.
I need to use PouchDB and sync it with CouchDB
Why CouchDB?
CouchDB is a NoSQL database consisting of a number of documents created with JSON.
It has versioning (each document has a _rev property with the last modified date).
It can be synchronised with PouchDB, a local JavaScript application that stores data in the browser using IndexedDB. This allows us to create offline applications (see the sketch below).
The two databases are both "master" copies of the data.
PouchDB is a local JavaScript implementation of CouchDB.
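A rough sketch of that sync (the database names and the CouchDB URL below are placeholders, not from my notes):
var localDB = new PouchDB('form_submissions');
var remoteDB = new PouchDB('https://couch.example.com:5984/form_submissions');

// Writes go to the local database, even while offline...
localDB.put({ _id: new Date().toISOString(), name: 'Alice', message: 'Hello' });

// ...and live two-way replication pushes them to CouchDB whenever a
// connection is available, retrying automatically after failures.
localDB.sync(remoteDB, { live: true, retry: true })
  .on('error', function (err) { console.error('Sync error', err); });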
I still need a better answer than my partial notes towards a solution!
Yes, this type of service worker is the correct one to use for saving form data offline.
I have now edited it and understood it better. It caches the form data, and loads it on the page for the user to see what they have entered.
It is worth noting that the paths to the library files will need editing to reflect your local directory structure, e.g. in my setup:
importScripts('/js/lib/ServiceWorkerWare.js');
importScripts('/js/lib/localforage.js');
The script is still failing when offline, however, as it isn't caching the library files. (Update to follow when I figure out caching)
Just discovered an extra debugging tool for service workers (apart from the console): chrome://serviceworker-internals/. In this, you can start or stop service workers, view console messages, and the resources used by the service worker.
Since the upgrade to iOS 6, we are seeing Safari's web view take the liberty of caching $.ajax calls. This is in the context of a PhoneGap application so it is using the Safari WebView. Our $.ajax calls are POST methods and we have cache set to false {cache:false}, but still this is happening. We tried manually adding a TimeStamp to the headers but it did not help.
We did more research and found that Safari is only returning cached results for web services that have a function signature that is static and does not change from call to call. For instance, imagine a function called something like:
getNewRecordID(intRecordType)
This function receives the same input parameters over and over again, but the data it returns should be different every time.
It must be that, in Apple's haste to make iOS 6 zip along impressively, they got too happy with the cache settings. Has anyone else seen this behavior on iOS 6? If so, what exactly is causing it?
The workaround that we found was to modify the function signature to be something like this:
getNewRecordID(intRecordType, strTimestamp)
and then always pass in a TimeStamp parameter as well, and just discard that value on the server side. This works around the issue.
After a bit of investigation, turns out that Safari on iOS6 will cache POSTs that have either no Cache-Control headers or even "Cache-Control: max-age=0".
The only way I've found of preventing this caching from happening at a global level rather than having to hack random querystrings onto the end of service calls is to set "Cache-Control: no-cache".
So:
No Cache-Control or Expires headers = iOS6 Safari will cache
Cache-Control max-age=0 and an immediate Expires = iOS6 Safari will cache
Cache-Control: no-cache = iOS6 Safari will NOT cache
I suspect that Apple is taking advantage of this from the HTTP spec in section 9.5 about POST:
Responses to this method are not cacheable, unless the response
includes appropriate Cache-Control or Expires header fields. However,
the 303 (See Other) response can be used to direct the user agent to
retrieve a cacheable resource.
So in theory you can cache POST responses...who knew. But no other browser maker has ever thought it would be a good idea until now. But that does NOT account for the caching when no Cache-Control or Expires headers are set, only when there are some set. So it must be a bug.
Below is what I use in the relevant part of my Apache config to target the whole of my API, because as it happens I don't actually want to cache anything, even GETs. What I don't know is how to set this just for POSTs.
Header set Cache-Control "no-cache"
Update: Just noticed that I didn't point out that it is only when the POST is the same, so change any of the POST data or URL and you're fine. So you can as mentioned elsewhere just add some random data to the URL or a bit of POST data.
Update: You can limit the "no-cache" just to POSTs if you wish like this in Apache:
SetEnvIf Request_Method "POST" IS_POST
Header set Cache-Control "no-cache" env=IS_POST
I hope this can be of use to other developers banging their head against the wall on this one. I found that any of the following prevents Safari on iOS 6 from caching the POST response:
adding [cache-control: no-cache] in the request headers
adding a variable URL parameter such as the current time
adding [pragma: no-cache] in the response headers
adding [cache-control: no-cache] in the response headers
My solution was the following in my JavaScript (all my AJAX requests are POST).
$.ajaxSetup({
  type: 'POST',
  headers: { "cache-control": "no-cache" }
});
I also add the [pragma: no-cache] header to many of my server responses.
If you use the above solution be aware that any $.ajax() calls you make that are set to global: false will NOT use the settings specified in $.ajaxSetup(), so you will need to add the headers in again.
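For example, a hypothetical call with global: false would need the header repeated on the request itself (the /api/save endpoint is just a placeholder):
$.ajax({
  url: '/api/save',
  type: 'POST',
  global: false,
  headers: { 'cache-control': 'no-cache' },
  data: { value: 42 }
});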
Simple solution for all your web service requests, assuming you're using jQuery:
$.ajaxPrefilter(function (options, originalOptions, jqXHR) {
  // you can use originalOptions.type || options.type to restrict specific types of requests
  options.data = jQuery.param($.extend(originalOptions.data || {}, {
    timeStamp: new Date().getTime()
  }));
});
Read more about the jQuery prefilter call here.
If you aren't using jQuery, check the docs for your library of choice. They may have similar functionality.
I just had this issue as well in a PhoneGap application. I solved it by using the JavaScript function getTime() in the following manner:
var currentTime = new Date();
var n = currentTime.getTime();
postUrl = "http://www.example.com/test.php?nocache="+n;
$.post(postUrl, callbackFunction);
I wasted a few hours figuring this out. It would have been nice of Apple to notify developers of this caching issue.
I had the same problem with a web app getting data from an ASP.NET web service.
This worked for me:
public WebService()
{
    HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.NoCache);
    ...
}
Finally, I have a solution to my uploading problem.
In JavaScript:
var xhr = new XMLHttpRequest();
xhr.open("post", 'uploader.php', true);
xhr.setRequestHeader("pragma", "no-cache");
In PHP:
header('cache-control: no-cache');
From my own blog post iOS 6.0 caching Ajax POST requests:
How to fix it: There are various methods to prevent caching of requests. The recommended method is adding a no-cache header. This is how it is done.
jQuery:
Check for iOS 6.0 and set Ajax header like this:
$.ajaxSetup({ cache: false });
ZeptoJS:
Check for iOS 6.0 and set the Ajax header like this:
$.ajax({
  type: 'POST',
  headers: { "cache-control": "no-cache" },
  url: ,
  data: ,
  dataType: 'json',
  success: function(responseText) {…}
});
Server side
Java:
httpResponse.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
Make sure to add this at the top of the page, before any data is sent to the client.
.NET
Response.Cache.SetNoStore();
Or
Response.Cache.SetCacheability(System.Web.HttpCacheability.NoCache);
PHP
header('Cache-Control: no-cache, no-store, must-revalidate'); // HTTP 1.1.
header('Pragma: no-cache'); // HTTP 1.0.
This JavaScript snippet works great with jQuery and jQuery Mobile:
$.ajaxSetup({
  cache: false,
  headers: {
    'Cache-Control': 'no-cache'
  }
});
Just place it somewhere in your JavaScript code (after jQuery is loaded, and best before you do AJAX requests) and it should help.
You can also fix this issue by modifying the jQuery Ajax function, applying the following change (as of 1.7.1) to the top of the Ajax function (the function starts at line 7212). This change will activate the built-in anti-cache feature of jQuery for all POST requests.
(The full script is available at http://dl.dropbox.com/u/58016866/jquery-1.7.1.js.)
Insert below line 7221:
if (options.type === "POST") {
  options.cache = false;
}
Then modify the following (starting at line ~7497).
if (!s.hasContent) {
  // If data is available, append data to URL
  if (s.data) {
    s.url += (rquery.test(s.url) ? "&" : "?") + s.data;
    // #9682: remove data so that it's not used in an eventual retry
    delete s.data;
  }

  // Get ifModifiedKey before adding the anti-cache parameter
  ifModifiedKey = s.url;

  // Add anti-cache in URL if needed
  if (s.cache === false) {
    var ts = jQuery.now(),
      // Try replacing _= if it is there
      ret = s.url.replace(rts, "$1_=" + ts);

    // If nothing was replaced, add timestamp to the end.
    s.url = ret + ((ret === s.url) ? (rquery.test(s.url) ? "&" : "?") + "_=" + ts : "");
  }
}
To:
// More options handling for requests with no content
if (!s.hasContent) {
  // If data is available, append data to URL
  if (s.data) {
    s.url += (rquery.test(s.url) ? "&" : "?") + s.data;
    // #9682: remove data so that it's not used in an eventual retry
    delete s.data;
  }

  // Get ifModifiedKey before adding the anti-cache parameter
  ifModifiedKey = s.url;
}

// Add anti-cache in URL if needed
if (s.cache === false) {
  var ts = jQuery.now(),
    // Try replacing _= if it is there
    ret = s.url.replace(rts, "$1_=" + ts);

  // If nothing was replaced, add timestamp to the end.
  s.url = ret + ((ret === s.url) ? (rquery.test(s.url) ? "&" : "?") + "_=" + ts : "");
}
A quick work-around for GWT-RPC services is to add this to all the remote methods:
getThreadLocalResponse().setHeader("Cache-Control", "no-cache");
This is an update of Baz1nga's answer. Since options.data is not an object but a string I just resorted to concatenating the timestamp:
$.ajaxPrefilter(function (options, originalOptions, jqXHR) {
  if (originalOptions.type == "post" || options.type == "post") {
    if (options.data && options.data.length)
      options.data += "&";
    else
      options.data = "";
    options.data += "timeStamp=" + new Date().getTime();
  }
});
In order to resolve this issue for web apps added to the home screen, both of the top-voted workarounds need to be followed. Caching needs to be turned off on the web server to prevent new requests from being cached going forward, and some random input needs to be added to every POST request so that requests that have already been cached go through. Please refer to my post:
iOS6 - Is there a way to clear cached ajax POST requests for webapp added to home screen?
WARNING to anyone who implemented a workaround by adding a timestamp to their requests without turning off caching on the server: if your app is added to the home screen, EVERY POST response will now be cached; clearing the Safari cache doesn't clear it, and it doesn't seem to expire. Unless someone has a way to clear it, this looks like a potential memory leak!
Things that DID NOT WORK for me with an iPad 4/iOS 6:
My request containing: Cache-Control:no-cache
//asp.net's:
HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.NoCache)
Adding cache: false to my jQuery ajax call
$.ajax({
  url: postUrl,
  type: "POST",
  cache: false,
  ...
Only this did the trick:
var currentTime = new Date();
var n = currentTime.getTime();
postUrl = "http://www.example.com/test.php?nocache="+n;
$.post(postUrl, callbackFunction);
That's the workaround for GWT-RPC:
class AuthenticatingRequestBuilder extends RpcRequestBuilder
{
    @Override
    protected RequestBuilder doCreate(String serviceEntryPoint)
    {
        RequestBuilder requestBuilder = super.doCreate(serviceEntryPoint);
        requestBuilder.setHeader("Cache-Control", "no-cache");
        return requestBuilder;
    }
}
AuthenticatingRequestBuilder builder = new AuthenticatingRequestBuilder();
((ServiceDefTarget)myService).setRpcRequestBuilder(builder);
My workaround in ASP.NET (pagemethods, webservice, etc.)
protected void Application_BeginRequest(object sender, EventArgs e)
{
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
}
While adding cache-buster parameters to make the request look different seems like a solid solution, I would advise against it, as it would hurt any application that relies on actual caching taking place. Making the APIs output the correct headers is the best possible solution, even if that's slightly more difficult than adding cache busters to the callers.
For those that use Struts 1, here is how I fixed the issue.
web.xml
<filter>
    <filter-name>SetCacheControl</filter-name>
    <filter-class>com.example.struts.filters.CacheControlFilter</filter-class>
</filter>

<filter-mapping>
    <filter-name>SetCacheControl</filter-name>
    <url-pattern>*.do</url-pattern>
    <http-method>POST</http-method>
</filter-mapping>
com.example.struts.filters.CacheControlFilter.java
package com.example.struts.filters;

import java.io.IOException;
import java.util.Date;
import javax.servlet.*;
import javax.servlet.http.HttpServletResponse;

public class CacheControlFilter implements Filter {

    public void doFilter(ServletRequest request, ServletResponse response,
                         FilterChain chain) throws IOException, ServletException {
        HttpServletResponse resp = (HttpServletResponse) response;
        resp.setHeader("Expires", "Mon, 18 Jun 1973 18:00:00 GMT");
        resp.setHeader("Last-Modified", new Date().toString());
        resp.setHeader("Cache-Control", "no-store, no-cache, must-revalidate, max-age=0, post-check=0, pre-check=0");
        resp.setHeader("Pragma", "no-cache");
        chain.doFilter(request, response);
    }

    public void init(FilterConfig filterConfig) throws ServletException {
    }

    public void destroy() {
    }
}
I was able to fix my problem by using a combination of $.ajaxSetup and appending a timestamp to the URL of my POST (not to the POST parameters/body). This is based on the recommendations of previous answers.
$(document).ready(function() {
  $.ajaxSetup({ type: 'POST', headers: { "cache-control": "no-cache" } });

  $('#myForm').submit(function() {
    var data = $('#myForm').serialize();
    var now = new Date();
    var n = now.getTime();
    $.ajax({
      type: 'POST',
      url: 'myendpoint.cfc?method=login&time=' + n,
      data: data,
      success: function(results) {
        if (results.success) {
          window.location = 'app.cfm';
        } else {
          console.log(results);
          alert('login failed');
        }
      }
    });
  });
});
I think you have already resolved your issue, but let me share an idea about web caching.
It is true that you can add many headers, server side and client side, in whichever language you use, and use many other tricks to avoid web caching, but always remember that you can never know where the client is connecting to your server from; you never know if they are using a hotel "hot-spot" connection that uses Squid or other caching products.
If the users are using a proxy to hide their real location, etc., the only real way to avoid caching is a timestamp in the request, even if it is unused.
For example:
/ajax_helper.php?ts=3211321456
Then every cache layer the request has to pass through won't find the same URL in its cache repository and will re-download the page content.
Depending on the app, you can troubleshoot the issue in iOS 6 using Safari > Advanced > Web Inspector, which is helpful in this situation.
Connect the phone to Safari on a Mac and then use the Develop menu to troubleshoot the web app.
Clear the website data on the iPhone after updating to iOS 6, including the data specific to any app using a web view. Only one app had an issue, and this solved it during iOS 6 beta testing way back; since then, no real problems.
You may need to look at your app as well; check out NSURLCache if you are in a web view in a custom app.
https://developer.apple.com/library/ios/#documentation/Cocoa/Reference/Foundation/Classes/NSURLCache_Class/Reference/Reference.html#//apple_ref/doc/uid/TP40003754
It depends on the true nature of your problem, implementation, etc.
Ref: $.ajax calls
I found one workaround that makes me curious as to why it works. Before reading Tadej's answer concerning ASP.NET web service, I was trying to come up with something that would work.
And I'm not saying that it's a good solution, but I just wanted to document it here.
Main page: includes a JavaScript function, checkStatus(). That method calls another method which uses a jQuery AJAX call to update the HTML content. I used setInterval() to call checkStatus(). Of course, I ran into the caching problem.
Solution: use another page to call the update.
On the main page, I set a boolean variable, runUpdate, and added the following to the body tag:
<iframe src="helper.html" style="display: none; visibility: hidden;"></iframe>
In the <head> of helper.html:
<meta http-equiv="refresh" content="5">
<script type="text/javascript">
if (parent.runUpdate) { parent.checkStatus(); }
</script>
So, if checkStatus() is called from the main page, I get the cached content. If I call checkStatus from the child page, I get updated content.
While my login and signup pages work like a charm in Firefox, IE, and Chrome, I've been struggling with this issue in Safari for iOS and OS X. A few months ago I found a workaround on SO.
<body onunload="">
OR via javascript
<script type="text/javascript">
window.onunload = function(e){
e.preventDefault();
return;
};
</script>
This is a kind of ugly hack, but it has worked for a while.
I don't know why, but with a handler attached to the onunload event that returns nothing, the page does not get cached in Safari.
We found that older iPhones and iPads, running iOS versions 9 & 10, occasionally return bogus blank AJAX results, perhaps due to Apple's turning down CPU speed. When returning the blank result, iOS does not call the server, as if returning a result from cache. Frequency varies widely, from roughly 10% to 30% of AJAX calls return blank.
The solution is hard to believe. Just wait 1s and call again. In our testing, only one repeat was all that was ever needed, but we wrote the code to call up to 4 times. We're not sure if the 1s wait is required, but we didn't want to risk burdening our server with bursts of repeated calls.
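A rough sketch of that retry loop (the ajaxCall() and handleResult() helpers are hypothetical placeholders; the 1 s delay and the 4-attempt cap follow the description above):
function callWithRetry(attempt) {
  ajaxCall(function (result) {          // hypothetical wrapper around the POST
    if (result === '' && attempt < 4) { // blank result: assume the iOS glitch and retry
      setTimeout(function () {
        callWithRetry(attempt + 1);
      }, 1000);                         // wait 1 s before trying again
    } else {
      handleResult(result);             // hypothetical handler for a good result
    }
  });
}
callWithRetry(1);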
We found the problem happened with two different AJAX calls, calling on different API files with different data. But I'm concerned it could happen on any AJAX call. We just don't know because we don't inspect every AJAX result and we don't test every call multiple times on old devices.
Both problem AJAX calls were using: POST, Asynchronously = true, setRequestHeader = ('Content-Type', 'application/x-www-form-urlencoded')
When the problem happens, there's usually only one AJAX call going on. So it's not due to overlapping AJAX calls. Sometimes the problem happens when the device is busy, but sometimes not, and without DevTools we don't really know what's happening at the time.
iOS 13 doesn't do this, nor do Chrome or Firefox. We don't have any test devices running iOS 11 or 12. Perhaps someone else could test those?
I'm noting this here because this question is the top Google result when searching for this problem.
It worked with ASP.NET only after adding the pragma:no-cache header in IIS. Cache-Control: no-cache was not enough.