I'm able to do HTTP requests (POST/GET) with XMLHttpRequest.
I'm asking how to do requests with URLs like "https://www.gmail.com".
I was trying something like this, but the status code is 0:
var http = new XMLHttpRequest();
var url = "https://www.gmail.com";
http.open("GET", url);
http.onreadystatechange = function() { // Call a function when the state changes.
    if (http.readyState == 4 && http.status == 200) {
        //alert(http.responseText);
        print("ok")
    } else {
        print("cannot connect")
        print("code:" + http.status)
        print(http.responseText)
    }
}
http.send(null);
I always get "cannot connect", "code: 0" and nothing as the response.
Any idea?
This is going to fail for two reasons:
1) The URL "https://www.gmail.com" actually redirects you to "https://mail.google.com/mail/", which in turn redirects you to a login page. The redirect itself is not reported as an error.
2) More importantly, you cannot make XMLHttpRequests to a different domain unless that domain supports CORS (http://www.html5rocks.com/en/tutorials/cors/). Gmail does not support CORS, so this request will not work.
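If you want to see the difference for yourself, here is a minimal sketch against a service that does send CORS headers (httpbin.org is just an example of such an endpoint, not something your code depends on); a blocked cross-origin request ends up in onerror with the status still 0:
var http = new XMLHttpRequest();
http.open("GET", "https://httpbin.org/get"); // example endpoint that sends Access-Control-Allow-Origin
http.onload = function () {
    console.log("status: " + http.status); // 200 when the server allows your origin
};
http.onerror = function () {
    console.log("blocked or unreachable, status is still " + http.status); // stays 0
};
http.send();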
Related
I'm working with an API which, after filling out the login form on their website, redirects back to our website with a unique code at the end of the URL.
Example URL after redirect:
https://www.mywebsite.com/?code=12431453154545
I have been unable to find a way of viewing this URL in Postman.
Ideally I need to be able to work with that URL to extract the code and store it as a variable.
Any help will be much appreciated. I've been trying this all day :(.
When you turn off following redirects in the Postman settings, you will be able to inspect the 3xx HTTP response, which will contain a Location header with the URL you want to read.
// Runs in the Tests tab of the request
const url = require("url");

var location = pm.response.headers.get("location");
if (typeof location !== typeof undefined) {
    var redirectUrl = url.parse(location, true);
    var query = redirectUrl.query;
    if ("code" in query) {
        pm.globals.set("code", query.code);
        console.log(pm.globals.get("code"));
    }
}
Note that this solution will not work when multiple subsequent redirects happen, as you will be inspecting only the first 3xx response. You could solve this by following the redirects manually, sending your own requests from a Postman script as described in the Postman manual.
I've tried to read as many different answers and posts as possible, but I still can't quite settle on a solution that fits my needs. I'm trying to work out the best (most efficient, but mostly more secure) way to handle user authentication, log in, etc.
I have a Node.js server, running on Express; I have an Angular.js web app; and I have an iOS app. I expose a RESTful API with Express/Node.js.
Cookies
The first things I read said to use cookies, and to store a session id/login token on the server side (hashed) and on the client side (unhashed). The client would transfer this id with each request, the server would hash it, parse it and process the request accordingly. This does not feel RESTful (not a huge issue), but more importantly, would I have to duplicate my API: one for username/password authentication (e.g. done via curl) and one for cookie-based authentication (e.g. my web app)?
Another problem with this: what would I do if I had multiple connections from the same user, e.g. they're logged in in two browsers, an iPhone and an iPad? Would my storage of their session ids need to become an array?
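(To make that concern concrete, here is a sketch of the two shapes I can imagine for the store, with made-up values:)
// Option A: one record per user, with the session ids kept in an array
var sessionsByUser = { "alice": ["id-from-laptop", "id-from-iphone"] };

// Option B: one record per session id, each pointing back at its user
var usersBySession = { "id-from-laptop": "alice", "id-from-iphone": "alice" };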
HTTP Basic Auth
The next idea was to use HTTP Basic Auth (with SSL), which seems easy enough, but is not recommended because you need to transfer a username and password with each request. If I were to do it with HTTP Basic Auth, would I then store the username and password in cookies (or HTML local storage) to allow for 'Remember Me' functionality? Or could I combine the two: use HTTP Basic Auth for the actual requests (post a new post, etc.) and just use a session id stored in a cookie for the initial log in sequence/remember me aspects?
Is transmitting a session id more secure than just transmitting the user's password? How?
The session id is going to act ostensibly as a password, so to me transmitting it would have the same security issues as transmitting a password.
Basic Auth seems to be supported across all platforms, which is ideal. The main downside seems to be needing to transfer client authentication data with each request. Is there a way to mitigate this issue?
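(As I understand it, "transferring the credentials with each request" just means attaching an Authorization header to every call, something like this sketch; btoa only base64-encodes the credentials, it does not encrypt them, which is why SSL is essential:)
var xhr = new XMLHttpRequest();
xhr.open("GET", "/api/posts");
// "Basic " + base64("username:password") is sent on every single request
xhr.setRequestHeader("Authorization", "Basic " + btoa("myuser:mypassword"));
xhr.send();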
OAuth
OAuth seems like overkill for my needs. I think I would lose the ability to do curl commands to test my API. How is OAuth an improvement over the cookies method?
As you can probably tell, I'm a little confused by the diverse information available, so if you have a set of good links—applicable to this scenario—I would love to read them. I'm trying to find a solution that fits across all platforms, but is still as secure as possible. Also, if I have any of my terminology wrong, please correct me because it will make searching easier for me.
Thanks.
Update:
I've been thinking about this problem, and I've had an idea. Please tell me if this is dumb or insecure, or give any other feedback, because I'm not sure if it's good.
When the user logs in, we generate a random session id (salted etc.). This optional session id is sent to the client, which the client can store (e.g. in cookies) if they choose; the session id is stored in the database.
This session id is then optionally sent with each request as either an HTTP Authentication header or query string, or the client can just send the username and password if they want (which gives us our regular REST API). At the server end, we check first for a session id parameter, if it's not present, we check for username/password. If neither are there—error.
On the server, we check that the session id is associated with the correct username. If it is, we complete the request.
Every time the user logs in, we create a new session id or delete the current one, and send this with the response to the log in request.
I think this lets me use the regular REST API, where appropriate, with Basic Auth, and maintain sessions/remember-me functionality. It doesn't solve the multiple log-ins issue, but otherwise I think this approach should work. Please let me know.
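(In Express terms, here is roughly what I have in mind; findUserBySessionId and findUserByCredentials are placeholders for whatever the database layer ends up being:)
// Sketch of the "session id first, then username/password" check described above.
function authenticate(req, res, next) {
    var sessionId = req.get('Authorization') || req.query.session_id;
    if (sessionId) {
        findUserBySessionId(sessionId, function (err, user) {
            if (err || !user) return res.send(401, 'Invalid session id');
            req.user = user;
            next();
        });
    } else if (req.body.username && req.body.password) {
        findUserByCredentials(req.body.username, req.body.password, function (err, user) {
            if (err || !user) return res.send(401, 'Invalid username or password');
            req.user = user;
            next();
        });
    } else {
        res.send(401, 'No credentials supplied');
    }
}

// Used on any route that needs a logged-in user:
// app.get('/api/posts', authenticate, function (req, res) { ... });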
I would use token-based authentication, where you send a token (automatically) with each request. You log in once, the server provides you with a token, and that token is then sent with every request. The token is added as an HTTP header, so you don't have to modify each individual request.
You can set up certain calls in the API so that they always need a token, while others are not token protected (see the per-route sketch after the snippet below).
For Express, you can use express-jwt (https://www.npmjs.org/package/express-jwt)
var expressJwt = require('express-jwt');
var jwt = require('jsonwebtoken'); // used below to sign the token
var secret = 'this is a secret, keep it in your config';

// Protect the /api routes with JWT
app.use('/api', expressJwt({ secret: secret }));
app.use(express.json());
app.use(express.urlencoded());
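If you would rather not protect the whole /api prefix in one go, the same middleware can be attached per route instead; a sketch, with made-up paths:
var jwtCheck = expressJwt({ secret: secret });

// No token needed for this one
app.get('/public/ping', function (req, res) {
    res.json({ pong: true });
});

// A valid token is required before this handler runs
app.get('/api/restricted', jwtCheck, function (req, res) {
    res.json({ email: req.user.email });
});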
If you want to authenticate, you can create this route in your Express server:
app.post('/authenticate', function (req, res) {
    // If the credentials are invalid, return 401
    if (!(req.body.username === 'john.doe' && req.body.password === 'foobar')) {
        res.send(401, 'Wrong user or password');
        return;
    }
    var profile = {
        first_name: 'John',
        last_name: 'Doe',
        email: 'john@doe.com',
        id: 123
    };
    // We are sending the profile inside the token
    var token = jwt.sign(profile, secret, { expiresInMinutes: 60 * 5 });
    res.json({ token: token });
});
And for protected calls, anything that starts with /api:
app.get('/api/restricted', function (req, res) {
    console.log('user ' + req.user.email + ' is calling /api/restricted');
    res.json({
        name: 'foo'
    });
});
In your Angular application you can login with:
$http
    .post('/authenticate', $scope.user)
    .success(function (data, status, headers, config) {
        $window.sessionStorage.token = data.token;
        $scope.message = 'Welcome';
    })
    .error(function (data, status, headers, config) {
        // Erase the token if the user fails to log in
        delete $window.sessionStorage.token;
        // Handle login errors here
        $scope.message = 'Error: Invalid user or password';
    });
And by creating an authentication interceptor, it will automatically send the token with every request:
myApp.factory('authInterceptor', function ($rootScope, $q, $window) {
    return {
        request: function (config) {
            config.headers = config.headers || {};
            if ($window.sessionStorage.token) {
                config.headers.Authorization = 'Bearer ' + $window.sessionStorage.token;
            }
            return config;
        },
        response: function (response) {
            if (response.status === 401) {
                // handle the case where the user is not authenticated
            }
            return response || $q.when(response);
        }
    };
});

myApp.config(function ($httpProvider) {
    $httpProvider.interceptors.push('authInterceptor');
});
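Once the interceptor is registered, a protected call from Angular is just a normal $http call; for example, using the /api/restricted route from above:
$http.get('/api/restricted').success(function (data) {
    console.log('Protected data:', data);
});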
If you have to support old browsers which do not support local storage, you can swap $window.sessionStorage for a library like AmplifyJS (http://amplifyjs.com/). Amplify, for example, uses whatever local storage is available. That would translate into something like this:
// Inside the .success handler of the login $http.post call;
// 'localStorage' here is a small wrapper service around Amplify's storage.
if (data.status === 'OK') {
    // Save the data using Amplify.js
    localStorage.save('sessionToken', data.token);
    // This doesn't work on the file protocol or on some older browsers
    //$window.sessionStorage.token = data.token;
    $location.path('/pep');
}
}).error(function (error) {
    // Erase the token if the user fails to log in
    localStorage.save('sessionToken', null);
    // Handle login errors here
    $scope.message = 'Error: Invalid user or password';
});
And the auth interceptor we swap for:
angular.module('myApp.authInterceptor', ['myApp.localStorage']).factory('authInterceptor', [
    '$rootScope',
    '$q',
    'localStorage',
    function ($rootScope, $q, localStorage) {
        return {
            request: function (config) {
                config.headers = config.headers || {};
                config.headers.Authorization = 'Bearer ' + localStorage.retrieve('sessionToken');
                return config;
            },
            response: function (response) {
                if (response.status === 401) {
                    // handle the case where the user is not authenticated
                }
                return response || $q.when(response);
            }
        };
    }
]);
You can find everything except AmplifyJS in this article:
http://blog.auth0.com/2014/01/07/angularjs-authentication-with-cookies-vs-token/
Have a look at the Yeoman generator for Angular and Node: generator-angular-fullstack has a very nice structure for user authentication using Passport.
You can see an example here:
the code: https://github.com/DaftMonk/fullstack-demo
the result: http://fullstack-demo.herokuapp.com/
Hope it helps!
I use generator-angular-fullstack, and the /api services are not secured: get your _id from /api/users/me, log out, then go to /api/users/your_id_here and you will see that /api is not secured.
When I write this Dart code:
for (int i = 0; i < nbAleas; i++) {
  HttpRequest request = new HttpRequest();
  // end of request event
  request.onReadyStateChange.listen((_) {
    if (request.readyState == HttpRequest.DONE &&
        (request.status == 200 || request.status == 0)) {
      handleResponse(request.responseText);
    }
  });
  // error event
  request.onError.listen((Object error) => handleError(error));
  // method and url
  request.open("GET", urlServiceRest);
  // send the request
  request.send();
}
the request is sent only once. I verified it on the server. If I modify the opening like this:
request.open("GET", urlServiceRest, async:false);
it works. Why should the requests be synchronous?
Also, the above requests are made to the same URL with the same parameters, for example "localhost:8080/random/10/20". If I send two async requests to this URL, only one is sent, as said above. If, for the second request, I change the parameters to "localhost:8080/random/11/21", both async requests are sent.
Can anyone explain this strange behavior? Thanks in advance.
Identical GET requests are definitely a candidate for caching by the browser. In addition to appending some random junk to the URL (a cache-buster), you could try switching to POST requests, which are not cacheable unless the response includes appropriate Cache-Control or Expires header fields.
So I'm messing around with developing a Spotify app, trying to get it to talk to my local Rails application API. I can't get anything other than a req.status of 0 when I try it.
I think it's a problem with the Spotify manifest.json file not allowing port 3000 in the URL you set in the required permissions, but their documentation also says the following.
https://developer.spotify.com/technologies/apps/tutorial/
If you need to talk to an outside web API you're welcome to, as long as you abide by the rules set in the Integration Guidelines. Please note that when talking with a web API, the requests will come from the origin sp://$APPNAME (so sp://tutorial for our example) - make sure the service you are talking to accepts requests from such an origin.
So, I'm not sure if Rails is set up to disallow this sort of thing, or if it's an issue with putting the port into the required permissions, but my request
var req = new XMLHttpRequest();
req.open("GET", "http://127.0.0.1:3000/api/spotify/track/1.json", true);
console.log(req);
req.onreadystatechange = function() {
    console.log(req.status);
    console.log(req.readyState);
    if (req.readyState == 4) {
        if (req.status == 200) {
            console.log("Search complete!");
            console.log(req.responseText);
        }
    }
};
req.send();
always returns status 0, whereas their example:
var req = new XMLHttpRequest();
req.open("GET", "http://ws.audioscrobbler.com/2.0/?method=geo.getevents&location=" + city + "&api_key=YOUR_KEY_HERE", true);
req.onreadystatechange = function() {
    console.log(req.status);
    if (req.readyState == 4) {
        console.log(req);
        if (req.status == 200) {
            console.log("Search complete!");
            console.log(req.responseText);
        }
    }
};
req.send();
will return at least a 403 response. It's like the request is not being made or something.
Anyone have any idea what might be going on?
Much appreciated!
When talking to external services from a Spotify App, even if they're running on your local machine, you need to make sure that two things are in place correctly:
1) The URL (or at least the host) is in the RequiredPermissions section of your manifest (see the sketch after this list). The port doesn't matter; http://127.0.0.1 should be fine for your case.
2) The server is allowing the sp://your-app-id origin for requests, as noted in the documentation you pasted in your question. This is done by setting the Access-Control-Allow-Origin header in your service's HTTP response. People often set it to Access-Control-Allow-Origin: * to allow anything to make requests to their service.
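For reference, the manifest entry looks roughly like this (from memory of the old Spotify Apps format, so treat the exact key name as an assumption and check it against the tutorial; the rest of the manifest fields are omitted):
{
    "RequiredPermissions": [
        "http://127.0.0.1"
    ]
}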
Thanks for the help, I got it figured out. I think it was multiple things, with one main "I'm an idiot" moment for not trying it earlier.
First off, I had to run Rails on port 80: if I'm accessing my site from 127.0.0.1:3000, that's not going to work when the Spotify app requests 127.0.0.1, unless I can load that directly in the browser, which you cannot unless you run on port 80. That is done via
rvmsudo rails server -p 80
You need to use rvmsudo because binding to port 80 requires elevated permissions.
Next I had to set Access-Control-Allow-Origin as noted above. That can be done in Rails 3 by adding a before_filter to your application controller as follows.
class ApplicationController < ActionController::Base
  logger.info "I SEE REQUEST" # debug: runs once, when the class is loaded
  before_filter :cor

  def cor
    headers["Access-Control-Allow-Origin"] = "*"
    headers["Access-Control-Allow-Methods"] = %w{GET POST PUT DELETE}.join(",")
    headers["Access-Control-Allow-Headers"] = %w{Origin Accept Content-Type X-Requested-With X-CSRF-Token}.join(",")
    head(:ok) if request.request_method == "OPTIONS"
  end
end
Finally, and most importantly (sigh), you can't just right-click and reload your Spotify app when you make changes to your manifest file; exit Spotify completely and restart it!
I'm working on a web application and I went through the necessary steps to enable the HTML5 App Cache for my initial login page. My goal is to cache all the images, CSS and JS to improve performance while browsing online; I'm not planning on offline browsing.
My initial page consists of a login form with only one input tag for entering the username and a submit button to send the information as a POST request. The submitted information is validated on the server and, if there's a problem, the initial page is shown again (which is the scenario I'm currently testing).
I'm using the browser's developer tools for debugging and everything works fine for the initial request (a GET request made by typing the URL in the browser); the resources listed in the manifest file are properly cached. But when the same page is shown again as the result of a POST request, I notice that all the elements (images, CSS, JS) that were previously cached are being fetched from the server again.
Does this mean that HTML5 App Cache only works for GET requests?
Per http://www.whatwg.org/specs/web-apps/current-work/multipage/offline.html#the-application-cache-selection-algorithm it appears to me that only GET is allowed.
In modern browsers (which support offline HTML), GET request URLs can probably be made long enough to supply the data you need, and POST is not supposed to be used for requests which are idempotent (non-changing) anyway. So the application should probably be architected to use GET requests for the kind of data which is useful offline, and to inform the user that they will need to log in in order to get the content sent to them for full offline use (and you could use offline events to tell them that they haven't yet gone through the necessary process).
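One reading of "offline events" is simply the browser's online/offline events plus navigator.onLine; a sketch of using them to tell the user why the POST-backed content isn't available yet:
function updateStatus() {
    if (navigator.onLine) {
        console.log('Back online - you can log in to fetch the rest of your content');
    } else {
        console.log('Offline - only the cached (GET) content is available');
    }
}
window.addEventListener('online', updateStatus);
window.addEventListener('offline', updateStatus);
updateStatus();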
I'm having exactly the same problem, and I wrote a wrapper for POST ajax calls. The idea is that when you try to POST, it will first make a GET request to a simple ping.php, and only if that succeeds will it then send the POST.
Here is how it looks in a Backbone view:
var BaseView = Backbone.View.extend({
    ajax: function(options){
        var that = this,
            originalPost = null;

        // defaults
        options.type = options.type || 'POST';
        options.dataType = options.dataType || 'json';

        if(!options.forcePost && options.type.toUpperCase()==='POST'){
            originalPost = {
                url: options.url,
                data: options.data
            };
            options.type = 'GET';
            options.url = 'ping.php';
            options.data = null;
        }

        // wrap success
        var success = options.success;
        options.success = function(resp){
            if(resp && resp._noNetwork){
                if(options.offline){
                    options.offline();
                }else{
                    alert('No network connection');
                }
                return;
            }
            if(originalPost){
                options.url = originalPost.url;
                options.data = originalPost.data;
                options.type = 'POST';
                options.success = success;
                options.forcePost = true;
                that.ajax(options);
            }else{
                if(success){
                    success(resp);
                }
            }
        };
        $.ajax(options);
    }
});
var MyView = BaseView.extend({
    myMethod: function(){
        this.ajax({
            url: 'register.php',
            type: 'POST',
            data: {
                'username': 'sample',
                'email': 'sample@sample.com'
            },
            success: function(){
                alert('You registered :)');
            },
            offline: function(){
                alert('Sorry, you can not register while offline :(');
            }
        });
    }
});
Have something like this in your manifest:
NETWORK:
*
FALLBACK:
ping.php no-network.json
register.php no-network.json
The file ping.php is as simple as:
<?php die('{}') ?>
And no-network.json looks like this:
{"_noNetwork":true}
And there you go: before any POST it will first try a GET to ping.php, and call offline() if you are offline.
Hope this helps ;)