Cross domain SignalR Authentication - asp.net-mvc

I have a working .NET website with authentication up and running (Identity 2.0 via OWIN and https://identityazuretable.codeplex.com/).
I would like to use WebSockets as the primary SignalR transport, but because the domain runs behind Cloudflare (the free plan does not currently support WebSockets), I cannot use SignalR on the same domain and get WebSockets. To work around this I have a subdomain, websockets.example.com, which does not use Cloudflare but points to the same application.
I now wish to authenticate SignalR connections based on the forms authentication token in a cookie. However, the cookie does not get sent when I do either of the below, nor when SignalR connects to websockets.example.com.
jQuery request:
$.ajax({
    url: "//websockets.example.com/signalr/hubs",
    type: "POST",
    xhrFields: {
        withCredentials: true
    }
});
Or:
$.connection.hub.url = '//websockets.example.com/signalr/';
$.connection.hub.start({ withCredentials: true });
Headers:
Accept:*/*
Accept-Encoding:gzip,deflate
Accept-Language:en-GB,en-US;q=0.8,en;q=0.6
Connection:keep-alive
Content-Length:0
Cookie:ARRAffinity=805c328533b85a33c6fdeb4870bd41f00e05fd898b5d784f2635656f9525466b
Host:websockets.example.com
Origin:http://example.com
Referer:http://example.com/Page
Response:
Access-Control-Allow-Credentials:true
Access-Control-Allow-Origin:https://example.com
EDIT: Owin Config:
app.Map("/signalr", map =>
{
    map.UseCors(new CorsOptions
    {
        PolicyProvider = new CorsPolicyProvider
        {
            PolicyResolver = context =>
            {
                var corsPolicy = new CorsPolicy
                {
                    AllowAnyHeader = true,
                    AllowAnyMethod = true,
                    SupportsCredentials = true,
                    AllowAnyOrigin = false,
                };
                corsPolicy.Origins.Add("http://example.com");
                corsPolicy.Origins.Add("http://www.example.com");
                corsPolicy.Origins.Add("http://websockets.example.com");
                corsPolicy.Origins.Add("https://websockets.example.com");
                corsPolicy.Origins.Add("https://example.com");
                corsPolicy.Origins.Add("https://www.example.com");
                return Task.FromResult(corsPolicy);
            }
        }
    });
    map.RunSignalR();
});

I think your issue is with the way your authentication cookie is set. This would explain why the cookie isn't sent to websockets.example.com via SignalR or via a normal CORS request made with jQuery.ajax.
To ensure cookies set by a parent domain are sent to subdomains, you need to explicitly define the domain in the Set-Cookie header:
Set-Cookie: name=value; domain=.mydomain.com
If you don't explicitly define the domain, cookies will only be sent back to the exact domain that set them.
http://erik.io/blog/2014/03/04/definitive-guide-to-cookie-domains/
https://stackoverflow.com/a/23086139/719967
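For OWIN cookie authentication specifically, the fix would go in the Startup class. A minimal sketch, assuming the standard Microsoft.Owin.Security.Cookies package and substituting your own domain:

```csharp
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AuthenticationType = DefaultAuthenticationTypes.ApplicationCookie,
    LoginPath = new PathString("/Account/Login"),
    // The leading dot scopes the cookie to example.com and every subdomain,
    // so the browser will also send it to websockets.example.com
    CookieDomain = ".example.com"
});
```

With that in place, the withCredentials requests in the question should carry the auth cookie to the subdomain, provided the CORS policy allows credentials as shown above.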

Related

Proper way to proxy all calls to an external API through the Next.js server? (Make SSR components work with client auth cookies)

I have a Rails API running in the same cluster as my Next.js 13 server. The Rails API uses auth cookies to track the session.
I can log into a client-side component and start making authenticated API calls based on the set-cookie header I receive from the Rails API. However, when using an SSR component, e.g....
export default async function MeTestPage() {
    try {
        let allCookies = cookies().getAll().map(c => `${c.name}=${c.value}`).join("; ");
        console.log(allCookies);
        let result = await fetch("http://0.0.0.0:3000/users/me", {
            headers: {
                "accept": "*/*",
                "accept-language": "en-US,en;q=0.9",
                "sec-ch-ua": "\"Google Chrome\";v=\"107\", \"Chromium\";v=\"107\", \"Not=A?Brand\";v=\"24\"",
                "sec-ch-ua-mobile": "?0",
                "sec-ch-ua-platform": "\"macOS\"",
                "sec-fetch-dest": "empty",
                "sec-fetch-mode": "cors",
                "sec-fetch-site": "same-origin",
                "cookie": allCookies
            },
            referrerPolicy: "strict-origin-when-cross-origin",
            body: null,
            method: "GET",
            mode: "cors",
            credentials: "include"
        });
        let resultJson = await result.json();
        return <p>{JSON.stringify(resultJson)}</p>;
    } catch (e: any) {
        return <p>{e.toString()}</p>;
    }
}
The request goes through and Rails receives the right cookies, but Rails doesn't connect them to the session. I suspect this is because the request comes from a different IP address, though I haven't been able to confirm that.
I feel like one good solution would be to proxy all client-side requests through the Next.js server, so that it acts as the sole API client for Rails and keeps the IP consistent, but I'm not sure of the best way to do that. I've tried both setting rewrites in next.config.js and copying the request method/route/headers/body to a new request from an /api/[...path].ts endpoint (but had a very frustrating time debugging why that wasn't sending the body).
I'm just getting into Next.js and can't believe this is such a struggle. I figure there must be some canonical way of handling this very common need to access a cookie-protected API from both environments.

Reproducing an ADAL.JS-authenticated request in Postman

I have a .NET Web API and a small vanilla-JS app using ADAL.js, and I've managed to make them talk nicely to each other and authenticate correctly.
If I console.log the token returned from adalAuthContext.acquireToken() and manually enter it as Authorization: Bearer {{token}} in Postman, I can also get a valid, authenticated, response from my backend.
However, I can't figure out how to configure Postman's built-in OAuth2.0 authentication UI to get me tokens automatically. I have managed to get tokens in several ways, but none of them are accepted by the backend.
How do I configure Postman to get a token the same way the ADAL.js library does?
For completeness, here's some code:
Backend configuration:
public void Configuration(IAppBuilder app)
{
    app.UseCors(CorsOptions.AllowAll);
    app.UseWindowsAzureActiveDirectoryBearerAuthentication(
        new WindowsAzureActiveDirectoryBearerAuthenticationOptions
        {
            TokenValidationParameters = new TokenValidationParameters { ValidAudience = "<app-id>" },
            Tenant = "<tenant>",
            AuthenticationType = "WebAPI"
        });
    var config = new HttpConfiguration();
    config.MapHttpAttributeRoutes();
    app.UseWebApi(config);
}
ADAL.js configuration:
const backendUrl = 'http://localhost:55476';
const backendAppId = '<app-id>';
const authContext = new AuthenticationContext({
    clientId: backendAppId,
    tenant: '<tenant>',
    endpoints: [{ [backendAppId]: backendAppId }],
    cacheLocation: 'localStorage'
});
Actually making a request:
authContext.acquireToken(backendAppId, (error, token) => {
    // error handling etc omitted
    fetch(backendUrl, { headers: { Authorization: `Bearer ${token}` } })
        .then(response => response.json())
        .then(console.log);
});
So since the Azure AD v1 endpoint is not fully standards-compliant, we have to do things in a slightly weird way.
In Postman:
Select OAuth 2.0 under Authorization
Click Get new access token
Select Implicit for Grant Type
Enter your app's reply URL as the Callback URL
Enter an authorization URL similar to this: https://login.microsoftonline.com/yourtenant.onmicrosoft.com/oauth2/authorize?resource=https%3A%2F%2Fgraph.microsoft.com
Enter your app's application id/client id as the Client Id
Leave the Scope and State empty
Click Request token
If you configured it correctly, you'll get a token and Postman will configure the authorization header for you.
Now about that authorization URL.
Make sure you specify either your AAD tenant id or a verified domain name instead of yourtenant.onmicrosoft.com.
Or you can use common if your app is multi-tenant.
The resource is the most important parameter (and non-standards-compliant).
It tells AAD what API you want an access token for.
In this case I requested a token for MS Graph API, which has a resource URI of https://graph.microsoft.com.
For your own APIs, you can use either their client id or App ID URI.
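As a small sketch, the shape of that authorize URL can be built up like this (the helper name is mine; the tenant and resource values are just the examples from above):

```javascript
// Builds an AAD v1 authorize URL. "resource" is the non-standard query
// parameter that tells AAD which API the access token should be issued for.
function buildAuthorizeUrl(tenant, resource) {
  const query = new URLSearchParams({ resource: resource });
  return "https://login.microsoftonline.com/" + tenant + "/oauth2/authorize?" + query.toString();
}

// e.g. buildAuthorizeUrl("yourtenant.onmicrosoft.com", "https://graph.microsoft.com")
```

Note that URL-encoding the resource value (https%3A%2F%2Fgraph.microsoft.com) matters when you paste the URL into Postman by hand.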
Here is a screenshot of my settings:

401 Error getting a list of service endpoints using REST API from TFS extension

I have developed a TFS extension for TFS 2017 on premises.
I need to get a list of the service endpoint within a project
I am using the following code inside a TFS extension (code-hub)
private callTfsApi() {
    const vsoContext = VSS.getWebContext();
    let requestUrl = vsoContext.host.uri
        + vsoContext.project.id
        + "/_apis/distributedtask/serviceendpoints?api-version=3.0-preview.1";
    return VSS.getAccessToken().then(function (token) {
        // Format the auth header
        const authHeader = VSS_Auth_Service.authTokenManager.getAuthorizationHeader(token);
        // Add authHeader as an Authorization header to your request
        return $.ajax({
            url: requestUrl,
            type: "GET",
            dataType: "json",
            headers: {
                "Authorization": authHeader
            }
        }).then((response: Array<any>) => {
            console.log(response);
        });
    });
}
On every request the server responds with a status of 401 (Unauthorized).
If I use Postman and basic authentication, the call to the service endpoints API works.
Also, the same code with a different API call (projects) works:
let requestUrl = vsoContext.host.uri + "_apis/projects?api-version=1.0";
Is there some sort of known bug related to the service endpoint APIs, or must the extension specify a scope? (Not sure which one to include, though.)
Service endpoints are created at project scope. If you can query project info, you should also be able to query these.
You could try adding the related scope vso.project (see the scopes section at https://learn.microsoft.com/en-us/vsts/extend/develop/manifest#scopes) to see if that does the trick.
Another way to narrow this down is to call the REST API directly from code (not inside a TFS extension) to see whether the issue is related to the extension side.
Add scope: vso.serviceendpoint_query
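In the extension manifest (vss-extension.json), that scope entry would look something like this sketch (all other manifest fields omitted):

```json
{
  "scopes": [
    "vso.project",
    "vso.serviceendpoint_query"
  ]
}
```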

Passing Authentication from WebApp to WebAPI using BreezeJS

I have two web applications: a SPA using AngularJS + BreezeJS, and a Web API. We are building authorization into the Web API, and results are filtered based on user access. We want the user to sign in to the organization's Azure AD in the SPA and pass the same authentication to the Web API.
I am using the ADAL JS library for authentication in the SPA and have that working. However, I am not able to pass the same authentication to the Web API using BreezeJS. Our Web API is OData v3, and without authentication Breeze works fine. We have customized the defaultHttpClient to add custom headers for DataServiceVersion and MaxDataServiceVersion, since DataJS needs them:
var oldClient = OData.defaultHttpClient;
var myClient = {
    request: function (request, success, error) {
        request.headers.DataServiceVersion = '3.0';
        request.headers.MaxDataServiceVersion = '4.0';
        return oldClient.request(request, success, error);
    }
};
OData.defaultHttpClient = myClient;
However, I am not sure how to pass the authentication token.
I have done the following in the entityManager:
var ajaxAdapter = breeze.config.getAdapterInstance("ajax");
ajaxAdapter.defaultSettings = {
    xhrFields: {
        withCredentials: true
    }
};
as per the comments by Ward Bell on one of the posts by John Papa. However, this does not seem to be working. Need help.
Thanks
Hemant
After some tinkering with the HTTP requests, I found out that the Bearer token we were expecting to be passed to the server was in fact not being sent, because we were not using the ajaxAdapter in Breeze. We had to add that header ourselves and send the request. We had set up an application in Azure AD, and we had to pick up the client application's key to get the token from storage; prefixing it with "Bearer " did the trick. Here is the sample code for the custom adapter:
var oldClient = OData.defaultHttpClient;
var myClient = {
    request: function (request, success, error) {
        request.headers.DataServiceVersion = '3.0';
        request.headers.MaxDataServiceVersion = '3.0';
        request.headers.Authorization = "Bearer " + adalAuthenticationService.getCachedToken('<<your AD client app key here>>');
        return oldClient.request(request, success, error);
    }
};
OData.defaultHttpClient = myClient;

Posting to Yii PHP framework using Backbone.js

I am trying to use Backbone.js models to save to my Yii web application, but I am getting a "The CSRF token could not be verified" response, even though the model is a serialized form and I use Backbone.sync to set a header.
The model (the form has the CSRF token in it and sends it as a "YII_CSRF_TOKEN" attribute):
var v = new ModelName($('.formclass').serializeJSON());
JSON serializer:
// form.serializeJSON
(function ($) {
    $.fn.serializeJSON = function () {
        var json = {};
        jQuery.map($(this).serializeArray(), function (n, i) {
            json[n['name']] = n['value'];
        });
        return json;
    };
})(jQuery);
The backbone.sync:
Backbone.old_sync = Backbone.sync;
Backbone.sync = function (method, model, options) {
    var new_options = _.extend({
        beforeSend: function (xhr) {
            console.log('backbone sync');
            var token = model.get('X_CSRF_TOKEN');
            console.log('token = ' + token);
            if (token) xhr.setRequestHeader('YII_CSRF_TOKEN', token);
        }
    }, options);
    return Backbone.old_sync(method, model, new_options);
};
I have also tried setting the header as 'X_CSRF_TOKEN', to no avail.
YII_CSRF_TOKEN is not a header; it is just a form value.
According to this line, your request has to contain:
a CSRF cookie (already set by the first, non-XHR page load)
a form data value named YII_CSRF_TOKEN
If you send your data with save(), you must send the cookies and session id along with the request. See here a cached version of this blog post (since it is offline now): http://webcache.googleusercontent.com/search?q=cache:tML1kmL08ikJ:blog.opperator.com/post/15671431847/backbone-js-sessions-and-authentication+&cd=1&hl=en&ct=clnk
If you're working on localhost, you might need to set up a virtual host to be able to perform cookie authentication, as stated in this thread.
IE and Chrome do not accept cookies from localhost, so that could be the reason.
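Since Yii reads the token from the form data rather than a header, one sketch (the helper name is mine, not part of Yii or Backbone) is to merge the token into the data that save() will POST:

```javascript
// Yii validates the YII_CSRF_TOKEN form value, not a request header,
// so include it in the request body alongside the model attributes.
function withCsrfToken(data, token) {
  return Object.assign({}, data, { YII_CSRF_TOKEN: token });
}

// e.g. model.save(withCsrfToken(model.toJSON(), token));
```

This keeps the token with the form values where Yii's CSRF validation expects to find it.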
