I have a Vue.js frontend talking to a Rails backend. I can successfully create a session and receive the CSRF token to store, but the token is never sent in subsequent Axios requests when the app is deployed on Heroku. It works fine on localhost, and both the frontend and backend are hosted on herokuapp.com, so SameSite shouldn't be an issue.
Axios is set to include the XSRF token with requests:
const API_URL = process.env.VUE_APP_API_URL || "http://localhost:8080";
axios.defaults.baseURL = API_URL;
axios.defaults.headers.post["Content-Type"] = "application/json";
axios.defaults.xsrfCookieName = "CSRF-TOKEN";
axios.defaults.xsrfHeaderName = "X-CSRF-Token";
axios.defaults.withCredentials = true;
How can I ensure Axios passes this cookie to my Rails application? Images below: the cookie is returned on session creation but not included in the following request.
This ended up being an issue with .herokuapp.com being on the public suffix list. Adding a custom domain solved the issue.
https://devcenter.heroku.com/articles/cookies-and-herokuapp-com
I am using session-based authentication in my Angular Universal app. The problem is that when an HTTP request is made from the Angular app, the backend (Node.js) doesn't access the existing session but creates a new one. You might think this is because of CORS, but only the initial load fails to access the session. So when I open the app on a page that has a resolver or guard making an HTTP request, that request creates a new session. Navigating to other pages in the app afterwards works: HTTP requests made after the initial load do access the session. If I start from a page with no resolvers/guards and then navigate to a page that has one and makes an HTTP request, that request will access the session.
Here is how my session is set up in index.js:
const session = require('express-session');
// Assuming MySQLStore comes from express-mysql-session here
const MySQLStore = require('express-mysql-session')(session);

const sessionStore = new MySQLStore(options);
app.use(
  session({
    key: 'sessionStorage',
    store: sessionStore,
    secret: config.get('demoSess'),
    saveUninitialized: false,
    resave: false,
    name: 'demo',
    cookie: {
      maxAge: 60000,
      secure: false
    },
  })
);
const cors = require('cors');
app.use(
  cors({
    origin: ['http://localhost:4200'],
    credentials: true
  })
);
And this is how the HTTP request is made from the frontend:
this.http.get(environment.apiUrl+'/server/page/auth', {withCredentials: true});
Is this how it should be? The backend runs on port 8080 and the frontend on 4200.
In app.module.ts I have added TransferHttpCacheModule. If I remove it, I can see from the backend (by console logging) that the first HTTP request is made twice: the first one doesn't access the session and the second one does. So if I console.log(req.session.userId) in /server/page/auth, I get undefined and then 1 on the next line. From what I read, something like this is normal and TransferState is the way to get around it; as I understand it, TransferHttpCacheModule is basically an easy way to use TransferState. I also tried writing the TransferState logic into the resolver and the outcome was the same: only one request is made, but that request won't access the session.
I am hoping I am missing something in how I make the HTTP request from the frontend, or that my session/CORS setup is missing something. At this point I am running out of ideas for what to check or test; any hint about what to look at is welcome.
So I started rebuilding my authentication in Angular to use localStorage. I ran into a problem there, and while searching for a solution I found a tutorial that mentioned isPlatformBrowser. That got me thinking that Angular Universal might be making two requests, but that these two requests are different and I need to eliminate one of them. I ended up wrapping my HTTP request in if (isPlatformBrowser(this.platformId)) { }, and so far that seems to have fixed the problem.
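For reference, a minimal sketch of that wrapper, assuming the request lives in an injectable service (the service name, the environment import path, and the PLATFORM_ID injection are illustrative, not from the original code):

import { Inject, Injectable, PLATFORM_ID } from '@angular/core';
import { isPlatformBrowser } from '@angular/common';
import { HttpClient } from '@angular/common/http';
import { of } from 'rxjs';
import { environment } from '../environments/environment';

@Injectable({ providedIn: 'root' })
export class AuthService {
  constructor(
    private http: HttpClient,
    @Inject(PLATFORM_ID) private platformId: Object
  ) {}

  checkAuth() {
    // Only issue the request in the browser; the server-side render skips it,
    // so a single, cookie-carrying request reaches the session backend.
    if (isPlatformBrowser(this.platformId)) {
      return this.http.get(environment.apiUrl + '/server/page/auth', { withCredentials: true });
    }
    return of(null);
  }
}

The trade-off is that the server-rendered page does not get the authenticated data; only the browser-side request does.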
I have a Rails API with a React client side. I have had everything in the app set up for a long time now, and today while I was working on it I suddenly started getting the error:
Access to XMLHttpRequest at 'http://localhost:3000/api/v1/user/authed'
from origin 'http://localhost:8000' has been blocked by CORS policy:
The value of the 'Access-Control-Allow-Origin' header in the response
must not be the wildcard '*' when the request's credentials mode is
'include'. The credentials mode of requests initiated by the
XMLHttpRequest is controlled by the withCredentials attribute.
Now none of the requests in my application work at all.
The request does go through from the React app to the Rails API, and the Rails API responds properly as well (I can see this in the terminal), but nothing actually happens on the client side; I assume the response gets blocked for the CORS reason.
Is there something I can do to fix this? Could it be that some package got updated on my system and is now different from the project, so it breaks?
URL to make request to:
const ENDPOINT = '/api/v1',
      PORT = 3000,
      URL = window.location.protocol + '//' + window.location.hostname + ':' + PORT + ENDPOINT;
The request:
$.ajax({
  url: URL + '/' + resource,
  type: verb,
  data: params,
  xhrFields: { withCredentials: true }
})
  .done(callback)
  .fail(errcallback);
Request functions have the format:
static get(resource, params, callback, errcallback) {
  API.send('GET', resource, params, callback, errcallback);
}
If your API doesn't require credentials, you should remove withCredentials: true.
More about withCredentials:
The XMLHttpRequest.withCredentials property is a Boolean that indicates whether or not cross-site Access-Control requests should be made using credentials such as cookies, authorization headers or TLS client certificates. Setting withCredentials has no effect on same-site requests.
https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/withCredentials
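If that applies here, the request from the question would simply drop the xhrFields option; a sketch of the same call without credentials:

$.ajax({
  url: URL + '/' + resource,
  type: verb,
  data: params
  // no xhrFields/withCredentials, so a wildcard Access-Control-Allow-Origin is accepted
})
  .done(callback)
  .fail(errcallback);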
I have a Spring backend through which I access my Elasticsearch cluster via a proxy-like endpoint. The request has to be authorized with a cookie.
I'm currently using Searchkit, which supports authenticating requests through the withCredentials flag. Is there a similar option for ReactiveSearch, or is there any other solution for authorizing the request with a cookie?
I could add: the backend exposes a Swagger client which runs on a different domain than my frontend client. That client "owns" the cookie, so I cannot read the cookie from my frontend client.
You can use the headers prop in ReactiveBase to pass custom headers along with the requests. Link to docs. Since there is no withCredentials, you could read the cookies and set them in custom headers to verify the requests at the proxy middleware.
<ReactiveBase
  ...
  headers={{
    customheader: 'abcxyz'
  }}
>
  <Component1 .. />
  <Component2 .. />
</ReactiveBase>
Here is a sample proxy server, but it's in Node.js.
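A minimal sketch of what such a proxy could look like, assuming Express with http-proxy-middleware (the header name and value mirror the ReactiveBase example above; the Elasticsearch target URL and port are placeholders):

const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// Reject requests that don't carry the expected custom header.
app.use('/elasticsearch', (req, res, next) => {
  if (req.headers['customheader'] !== 'abcxyz') {
    return res.status(401).send('Unauthorized');
  }
  next();
});

// Forward verified requests to the search backend (placeholder target).
app.use('/elasticsearch', createProxyMiddleware({
  target: 'http://localhost:9200',
  changeOrigin: true
}));

app.listen(3001);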
Okay, so it turns out ReactiveSearch uses fetch, and fetch wants credentials: 'include' for cookie authentication. This may not be placed in the headers that ReactiveSearch supplies; it must be placed on the root object of the request.
It's possible to do this by implementing beforeSend on ReactiveBase.
const Base = ({ children }) => {
  const onBeforeSend = props => {
    return {
      ...props,
      credentials: 'include',
    };
  };

  return (
    <ReactiveBase
      app="app-name"
      url="url"
      beforeSend={onBeforeSend}
    >
      {children}
    </ReactiveBase>
  );
};
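For context, this mirrors what a plain fetch needs for cookie authentication; the beforeSend hook just lets you put the option on the root of the request object that ReactiveSearch builds (the URL and multi-search body below are illustrative):

// Cookie-authenticated cross-origin fetch: credentials must sit on the
// request init object, not among the headers.
fetch('https://example.com/elasticsearch/_msearch', {
  method: 'POST',
  credentials: 'include',
  headers: { 'Content-Type': 'application/x-ndjson' },
  body: '{}\n{"query":{"match_all":{}}}\n'
});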
I have a use-case where I need to spoof a white-listed Redirect URL locally when performing OAuth 2 authentication.
I'm running a very basic web server coupled with a hosts-file entry for the domain I'm spoofing. I'm able to correctly negotiate my tokens and return them to Paw, but Paw isn't picking up my access_token or refresh_token; it simply displays the raw response:
Here's my server code (with placeholders for sensitive data):
var http = require('http'),
    request = require('request');

var PORT = 6109;

var server = http.createServer(function(req, res) {
  var code = req.url.split('?')[1].split('=')[2];

  request({
    url: 'https://<access token URL>/oauth2/token?code=' + code,
    method: 'POST',
    form: {
      'client_id': <client_id>,
      'client_secret': <client_secret>,
      'grant_type': 'authorization_code',
      'redirect_uri': <spoofed redirect URL>
    }
  }, function(err, response, data) {
    data = JSON.parse(data);

    res.writeHead(200, {'Content-Type': 'application/json'});
    res.write(JSON.stringify(data.result));

    // I also tried this with the same end result:
    // res.writeHead(200);
    // res.write('access_token=' + data.result.access_token + '&token_type=' + data.result.token_type + '&refresh_token=' + data.result.refresh_token);

    res.end();
  });
});

server.listen(PORT, function() {
  console.log('Server listening on port %d', PORT);
});
What am I missing? Why isn't Paw finding my tokens?
Here's my configuration for reference:
Some other noteworthy points:
The OAuth provider is non-standard and flubs quite a few things from the spec (my proxy exists in part to patch up the non-standard bits)
The domain for the Redirect URL is real, but the URL does not resolve (this is a part of the reason for the local hosts entry)
I'm not showing this part of the flow, but I am correctly completing the authorization step prior to being given the code value
I think you're probably confusing the Authorization URL and the Access Token URL. When you use the Authorization Code grant type for OAuth 2, you're expected to have a user confirmation step in a web page (the Authorization URL).
Which makes me guess that you're instead expecting to use the Password Grant or Client Credentials? Otherwise, if you want to use the Authorization Code flow, you'll need to serve a web page at the Authorization URL.
Note: I've tried your Node.js script in Paw using the last two grants I mentioned (Password Grant & Client Credentials), and it works nicely.
Update: Following the comments below, I understand better what you are doing. The Authorization Request should (if successful) return a 302 redirect response to the Redirect URL page and append a code URL query param to it. It seems like you're returning a JSON response with the code instead, so Paw isn't catching it.
According to the OAuth 2.0 spec (RFC 6749), section *4.1.2. Authorization Response*, if granted, the code should be passed as a URL query param (i.e. a ?key=value param in the URL) to the Redirect URL when doing the redirection.
If the resource owner grants the access request, the authorization server issues an authorization code and delivers it to the client by adding the following parameters to the query component of the redirection URI using the "application/x-www-form-urlencoded" format
Quoting the example from the spec, here's how the response of the Authorization Request should look if it's a success (code granted):
HTTP/1.1 302 Found
Location: https://client.example.com/cb?code=SplxlOBeZQQYbYS6WxSbIA&state=xyz
I see that the Redirect URL contains "my Spoofed Uri".
When we use the authorization code flow, we provide the authorization code and the redirect URI.
If the URI you provide does not match the URI saved for the client in the identity server, you will not be able to get the token, because the URI does not match the one tied to the client's authorization code.
For example, consider the client registration in the identity server to be:
Auth Code: "xyx"
Redirect Uri: "www.mylocalhost.com\xyz"
And in your example the combination you are providing is:
Auth Code: "xyx"
Redirect Uri: "<my spoofed uri>"
As these two won't match, no token will be received.
I believe that if you use the correct URI registered with the client in the identity server, you will be able to receive the token.
I have three ASP.NET WebAPI endpoints:
Identity server, which generates bearer tokens (serverA.com);
SignalR server with hub (serverB.com);
Some endpoint with a simple ASP.NET MVC view and SignalR JS client script (serverC.com).
All three servers use OAuth2 middleware as the auth mechanism. Microsoft.Owin.Cors is configured as well.
Servers use only HTTPS requests.
SignalR v2.2.0 is installed on serverB.com.
I can successfully make a cross-domain request from serverC.com to serverA.com to get a bearer token, but I don't know how to pass the auth token when connecting to serverB.com.
There are two ways I have found so far:
Pass the auth token as a query string (not secure);
Apply this setting to jQuery.ajax:
$.ajaxSetup({
  beforeSend: function (xhr) {
    xhr.setRequestHeader('tokenKey', 'tokenValue');
  }
});
but it forces SignalR to use long polling only.
Is there any other way to send the auth token (not in the query string) so it can be consumed and validated by the OAuth BearerAuthorizationProvider? Maybe cookies, headers, or something else?
Update
The CORS middleware is set up for both environments to allow all data and accept credentials.
Here is my OWIN middleware:
var requestUri = context.Request.Uri.AbsolutePath;

if (string.Equals(requestUri, authRoute, StringComparison.OrdinalIgnoreCase))
{
    if (!context.Request.Headers.ContainsKey("Authorization") || string.IsNullOrEmpty(context.Request.Headers["Authorization"]))
    {
        context.Response.StatusCode = (int)HttpStatusCode.Unauthorized;
    }
    else
    {
        context.Response.Cookies.Append("BearerToken", context.Request.Headers["Authorization"]);
    }
}
else
{
    await Next.Invoke(context);
}
Then I make the first Ajax request:
$.ajax({
  url: self.communicationHubUrl + '/authenticate',
  type: 'post',
  cache: false,
  crossDomain: true,
  beforeSend: function(xhr) {
    xhr.setRequestHeader('Authorization', self.accessToken);
  }
});
The middleware copies the auth token from the header into a response cookie.
Then I call hub.Start so SignalR begins to send ajax requests.
But for reasons I don't quite understand, cookies are present in the request only if I enable xhr.withCredentials = true for ALL Ajax requests via $.ajaxSetup:
$.ajaxSetup({
  xhrFields: {
    withCredentials: true
  }
});
Without this setting the request doesn't include cookies. On the other hand, I don't think it's a good idea to force all Ajax requests to use such a setting.
Furthermore, I've faced strange behavior in the OAuth middleware: the ValidateIdentity method is not invoked when the request from SignalR comes in, so instead of 401 Unauthorized I'm getting the default principal.
I think that putting the auth token inside a cookie will be your best bet. Unlike the ajaxSetup option, cookies are sent with EventSource and WebSocket requests.
You might need to add some middleware to the SignalR server (serverB.com) that sets the appropriate cookie when you POST the auth token before starting the SignalR connection.
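A rough sketch of that flow on the client, scoping withCredentials to the single authentication call instead of setting it globally via $.ajaxSetup (the URL and token names mirror the question's code; everything else is an assumption):

$.ajax({
  url: self.communicationHubUrl + '/authenticate',
  type: 'post',
  cache: false,
  crossDomain: true,
  // Send/accept cookies for this one request only, so the BearerToken
  // cookie set by the OWIN middleware is stored by the browser.
  xhrFields: { withCredentials: true },
  beforeSend: function (xhr) {
    xhr.setRequestHeader('Authorization', self.accessToken);
  }
}).done(function () {
  // Cross-domain SignalR requests then carry the cookie as well.
  $.connection.hub.start({ withCredentials: true });
});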