XMLHttpRequest cannot load file? - xml-parsing

d3.csv("result.csv", function(flights) {
var nestByDate = d3.nest()
.key(function(d) { return d3.time.day(d.date); });
..........
When I run the above d3.js code from a web server, it executes properly and loads the CSV file.
But when I run it as shown below,
d3.csv("D:\\Project Space\\D3Demo\\WebContent\\result.csv", function(flights) {
var nestByDate = d3.nest()
.key(function(d) { return d3.time.day(d.date); });
..........
then it shows the following error:
XMLHttpRequest cannot load file:///D:/Project%20Space/D3Demo/WebContent/result.csv. Cross origin requests are only supported for HTTP.
How can I solve this problem?

There is no way to solve the problem using D3's convenience functions.
d3.csv fundamentally is an AJAX request and is beholden to the same-origin policy.
When you load the page from a file:// location, your browser sees that the requested file is not served from the same origin as the page (likely localhost in your case) and prevents the request from completing.
A simple way around this is to serve the content over localhost (or whatever host you are using) and request the CSV with a relative path, as in your first snippet.
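For example, if you have Node.js handy, a throwaway static file server is only a few lines (this is just one illustrative option; any local web server works equally well):
// Minimal static file server for local development: run it from your WebContent
// directory, then load http://localhost:8080/yourpage.html instead of a file:// URL.
var http = require("http"), fs = require("fs"), path = require("path");
http.createServer(function (req, res) {
  var file = path.join(__dirname, req.url.split("?")[0]); // no sanitization: dev use only
  fs.readFile(file, function (err, data) {
    if (err) { res.writeHead(404); res.end("Not found"); return; }
    res.writeHead(200);
    res.end(data);
  });
}).listen(8080);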
Alternatively, you can look into Cross-Origin Resource Sharing (CORS) or, for better compatibility, JSONP. In both cases you will likely have to roll your own function to convert the CSV data into a JavaScript array, as sketched below.
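If you end up parsing the CSV yourself, a minimal sketch of such a conversion might look like this (assuming a header row and no quoted fields or embedded commas; the function name is illustrative, not part of D3):
// Naive CSV-to-array-of-objects conversion: the header row supplies the keys.
function csvToArray(text) {
  var lines = text.trim().split("\n");
  var headers = lines[0].split(",");
  return lines.slice(1).map(function (line) {
    var cells = line.split(",");
    var row = {};
    headers.forEach(function (h, i) { row[h.trim()] = cells[i]; });
    return row;
  });
}
// csvToArray("date,delay\n2001-01-01,14") -> [{date: "2001-01-01", delay: "14"}]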

Related

Rails request.headers data not updating without refresh

When I hit the site for the first time, request.headers["HTTP_ACCEPT"] returns "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8",
but for subsequent internal-link requests it shows "text/html, application/xhtml+xml"
until I hard refresh the page.
Is this due to Turbolinks or some other issue?
It's a Turbolinks issue. By default, Turbolinks' XMLHttpRequest only sends the Accept header "text/html, application/xhtml+xml", but you can send the "image/webp" value by capturing the request-start event like this:
document.addEventListener("turbolinks:request-start", function(event) {
  var xhr = event.data.xhr
  xhr.setRequestHeader("Accept", "image/webp")
})
The remaining question is how to know from JavaScript whether the browser accepts WebP images, so that the "image/webp" header is only sent when appropriate.
You can use the Modernizr library, or a custom function that guesses whether WebP is available by trying to decode a small WebP image, as described here: https://developers.google.com/speed/webp/faq. However, these solutions add overhead that eats into the benefit of WebP's faster load times.
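For reference, that custom check boils down to something like this (adapted from the approach in the FAQ linked above; the data URI is a tiny lossy WebP test image, and the check is asynchronous, which is part of the overhead problem mentioned):
// Tries to decode a tiny lossy WebP image; the callback receives true only if decoding succeeds.
function checkWebpSupport(callback) {
  var img = new Image();
  img.onload = function () { callback(img.width > 0 && img.height > 0); };
  img.onerror = function () { callback(false); };
  img.src = "data:image/webp;base64,UklGRiIAAABXRUJQVlA4IBYAAAAwAQCdASoBAAEADsD+JaQAA3AAAAAA";
}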
Perhaps the most efficient solution is to create a boolean cookie on the server side that stores whether WebP images are accepted. The cookie is set on the browser's first GET request to the server. In subsequent Turbolinks calls you can check this cookie in your JavaScript code:
document.addEventListener("turbolinks:request-start", function(event) {
  var browserAcceptsWebp = document.cookie.includes('webp_available=true');
  if (browserAcceptsWebp) {
    var xhr = event.data.xhr;
    xhr.setRequestHeader("Accept", "image/webp");
  }
});

404 not found with api call - Angular to RoR

I am a newbie with Ruby on Rails and I am trying to figure out how to connect Angular to RoR in a very simple way.
Here is my service:
mWebApp.service('mWebSrvc', function($http, $log) {
  this.getCustomers = function() {
    $http({
      method: 'GET',
      url: 'http://127.0.0.1:3000/api/customers/'
    }).success(function(data, status, headers, config) {
      $log.log('Done');
      angular.forEach(data, function(c) {
        $log.log(c.Title);
      });
      customers = data;
      return customers;
    });
  };
});
When I look under the Net tab in Firebug, I see OPTIONS /api/customers/ 404 Not Found, but if I click on the Response tab inside it, I can see the JSON, yet the JSON tab itself is empty. Why?
Under Firebug's console:
"NetworkError: 404 Not Found - http://numberForLocalHost:3000/api/customers/"
My Rails server is running in daemon mode at numberForLocalHost:3000. Could that be the issue, i.e. that it should be calling a true API?
If I paste the URL above into any web browser, I can see the JSON.
As usual, thanks in advance.
You're getting an OPTIONS request because your browser believes this is a cross-origin request.
See this question for example. Is your RoR app also serving your client-side Angular code? If not, you should decide whether it can (there shouldn't be a reason not to), or your server needs to reply to the preflight OPTIONS request you are seeing.
I had the very same issue with my Rails + Angular app. I had CORS set up correctly in my Rails app but still got a 404 in the Angular app. This could be the reason:
Perhaps you have "angular-in-memory-web-api": '0.x.x' in your package.json and have also imported it in your app.module.ts as InMemoryWebApiModule and InMemoryDataService. These intercept all calls to an API, preventing them from ever reaching your back-end server. When I removed those dependencies and their declarations, my app suddenly started working normally!
Look at this answer for more information.

Grails HTTP Proxy

I want to create a proxy controller in Grails: something that takes whatever is passed in based on a URL mapping, records what was asked for, sends the request to another server, records the response, and sends the response back to the browser.
I'm having trouble when the request has an odd file extension (.gif) or no file extension (/xxx?sdcscd).
My URL mapping is:
"/proxy/$target**"
and I've attempted (per an answer to another question):
def targetURL = params.target
if (!FilenameUtils.getExtension(targetURL) && request.format) {
    targetURL += ".${response.format}"
}
but this usually appends .html and never .gif or the ?csdcsd query string.
Not sure what to do, as I might just write the thing in straight Java.
Actually, the real answer was sitting in the post you linked to previously all along, by Peter Ledbrook:
Disable file extension truncation by adding this line to grails-app/conf/Config.groovy:
grails.mime.file.extensions = false
This will disable the use of file extensions for determining the response format, but will leave the file extension on params.target. You can completely ignore response.format!

WebKit image reload on Post/Redirect/Get

We just redesigned a web application and as a result noticed a bug in Chrome (it supposedly affects all WebKit browsers) that causes a full image/JS/CSS reload after a Post/Redirect/Get. Our app is built with ASP.NET and uses a lot of Response.Redirect calls, which means users will run into this issue a lot. There's a bug report for the issue with a test case: https://bugs.webkit.org/show_bug.cgi?id=38690
We've tried the following to resolve the issue:
Change all Response.Redirect calls to JavaScript redirects (sketched below). This wasn't ideal because instead of images reloading, there would be a "white flash" during page transitions.
We wrote our own HTTP handler for images, CSS, and JS files, set up so that the handler sends a max-age of 1 hour. When the client requests the file again, the handler checks the If-Modified-Since header sent by the browser to see whether the file has been updated since it was last downloaded. If the dates match, the handler returns an HTTP 304 Not Modified with a Content-Length of 0. We ran a test where a first-time download (HTTP 200) was given a 10-second delay, so the first page load was very slow, while a 304 Not Modified response had no delay. What we noticed was that Chrome would still "reload" images even when the server returned 304 Not Modified. It's not pulling the file from the server (if it were, there would be a 10-second delay), yet it still flashes/reloads the images. So Chrome seems to be ignoring the 304 and reloading images from its cache, causing the "reload".
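For clarity, the JavaScript redirect in the first approach just means the POST response renders a stub page whose only job is something like this (the target URL is purely illustrative):
// Emitted by the POST response in place of a 302; replace() keeps the POST page
// out of the back-button history.
window.location.replace("/OrderConfirmation.aspx");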
We've checked big sites to see if they've fixed it somehow, but sites like NewEgg and Amazon are also affected.
Has anyone found a solution to this? Or a way to minimize the effect?
Thanks.
This is a bug. The only "workaround" I've seen until now is to use a Refresh header instead of a Location header to do the redirecting. This is far from ideal.
Bug 38690 - Submitting a POST that leads to a server redirect causes all cached items to redownload
Also, this question is a duplicate of "Full page reload on Post/Redirect/Get ignoring cache control".
I ran into this problem myself with an ASP.NET Web Forms site that uses Response.Redirect(url, false) following a POST on many of its pages.
From reading the HTTP/1.1 specification it sounds like a 303 response code would be correct for implementing the Request: POST, Response: Redirect behavior. Unfortunately changing the status code does not make browser caching work in Chrome.
I implemented the workaround described in the post above by creating a custom module for non-static content. I'm also clearing the response content from 302s to avoid a flash of "Object moved to here"; this is probably only relevant when the Refresh header is used. Comments are welcome!
public class WebKitHTTPHeaderFixModule : IHttpModule
{
    public void Init(HttpApplication httpApp)
    {
        // Attach application event handlers.
        httpApp.PreSendRequestHeaders += new EventHandler(httpApp_PreSendRequestHeaders);
    }

    void httpApp_PreSendRequestHeaders(object sender, EventArgs e)
    {
        HttpContext context = HttpContext.Current;
        if (context.Response.StatusCode == 302)
        {
            context.Response.ClearContent();

            // If Request is POST and Response is 302 and browser is Webkit use a refresh header
            if (context.Request.HttpMethod.Equals("POST", StringComparison.OrdinalIgnoreCase) && context.Request.Headers["User-Agent"].ToLower().Contains("webkit"))
            {
                string location = context.Response.Headers["Location"];
                context.Response.StatusCode = 200;
                context.Response.AppendHeader("Refresh", "0; url=" + location);
            }
        }
    }

    public void Dispose()
    {}
}
Note: I don't think this will work with the non-overloaded version of Response.Redirect since it calls Response.End().

Jquery-upload-progress cross domain issue. Suspected GET request problem

I am building a site that submits a form to a different server. For upload progress tracking I use the NginxHttpUploadProgressModule on the server side and jquery-upload-progress on the client side. I tested the setup by submitting the form to the same server and everything worked fine. Submitting to another server doesn't show the progress tracking (cross-domain scripting). After hours of investigating this matter I came to the conclusion that the GET request generated by jQuery is at fault.
The query looks like this:
http://domain.com/upload/progress/?X-Progress-ID=39b2825934dbb2f33fe936df734ff840&callback=jsonp1249230337707&_=1249230345572
From the NginxHttpUploadProgressModule site:
The HTTP request to this location must have either an X-Progress-ID parameter or X-Progress-ID HTTP header containing the unique identifier as specified in your upload/POST request to the relevant tracked zone. If you are using the X-Progress-ID as a query-string parameter, ensure it is the LAST argument in the URL.
So, my question is: how do I append the X-Progress-ID parameter to the end of the jQuery GET request, or set the X-Progress-ID header?
This doesn't work with JSONP (code from jquery.uploadProgress.js):
beforeSend: function(xhr) {
  xhr.setRequestHeader("X-Progress-ID", options.uuid);
}
Currently the request is generated this way (code from jquery.uploadProgress.js):
jQuery.uploadProgress = function(e, options) {
  jQuery.ajax({
    type: "GET",
    url: options.progressUrl + "?X-Progress-ID=" + options.uuid,
    dataType: options.dataType,
    success: function(upload) {
    ...
I solved the GET parameter problem (code from jquery.uploadProgress.js):
jQuery.uploadProgress = function(e, options) {
  jQuery.ajax({
    type: "GET",
    url: options.progressUrl,
    dataType: options.dataType,
    data: "X-Progress-ID=" + options.uuid,
    success: function(upload) {
    ...
The modified GET request looks like this:
http://domain.com/upload/progress/?callback=jsonp1249230337707&_=1249230345572&X-Progress-ID=39b2825934dbb2f33fe936df734ff840
The nginx webserver is now correctly responding.
However, as Ron Evans pointed out, the client-side progress tracking part won't work unless the NginxHttpUploadProgressModule is modified.
You simply cannot fire an XMLHttpRequest from a web page to a domain different from the page's own domain; it violates the security restrictions that are the default in all browsers.
The only things I can think of are to use Flash or Silverlight to initiate the progress calls (given the correct crossdomain.xml setup, Flash and Silverlight can send async requests from the browser to a preset list of domains),
or to set up a browser add-in (say a Firefox extension, an IE ActiveX control, or an embedded WinForms control) that can initiate calls without the same-domain restriction (since the request will not originate from the web page, but from the browser itself).
You need to install the Apache module for upload status as well; just using the jQuery plugin will not work.
To respond to Ken, I suggest you familiarize yourself with the JSONP spec, since JSONP was created specifically to handle cross-domain JavaScript calls.
Anyhow, this code works great in Passenger/Apache with my modified Apache module. Without modifying the extension for Nginx, it will not work with a JSONP call.
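To make the header limitation concrete: a JSONP call is ultimately just a dynamically injected script tag pointing at the remote URL with a callback parameter, so there is no XMLHttpRequest whose headers you could set; only query-string data travels with the request. Roughly (the names here are illustrative, not the plugin's actual internals):
// The essence of a JSONP request: the server is expected to reply with
// callbackName({...}) so the browser executes it as script.
function jsonpRequest(url, callbackName, params, callback) {
  window[callbackName] = callback;
  var script = document.createElement("script");
  // Keep X-Progress-ID as the last query-string argument, per the module docs quoted above.
  script.src = url + "?callback=" + callbackName + "&" + params;
  document.getElementsByTagName("head")[0].appendChild(script);
}
// jsonpRequest("http://domain.com/upload/progress/", "jsonp1249230337707",
//              "X-Progress-ID=" + uuid, function (upload) { /* update progress bar */ });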
I made a minor modification that solved the problem for me; you can check it out here:
http://github.com/tizoc/nginx-upload-progress-module/commit/a40b89f63b5a767faec3c78d826443a94dc5b126
