Central JWT management system for my micro-service based architecture - spring-security

We are building our applications using a microservices-based architecture. As is typical with microservices, we now see a lot of cross-service interactions happening between services.
To safeguard the endpoints, we plan to implement JWT-based authentication for these secure exchanges.
We see two approaches that could achieve this:
Embed a JWT engine in each application to generate the token (consumer side) and validate it (provider side). With an initial exchange of keys, the token exchange should work smoothly for any future communications.
Have a JWT engine external to the applications that sits between all microservice communications for the distributed application and takes care of the whole token life cycle, including encryption/decryption and validation.
There are plenty of options for approach #1, as listed on https://jwt.io, but considering the overhead that token generation and management adds to a microservice, we prefer the second approach, with the JWT engine decoupled from the services into a gateway.
After quite some research and looking at various API gateways, we have not yet come across a lightweight solution/tool that serves our need and gives us a centralised engine for one application comprised of many microservices.
Does anyone know of such a tool/solution?
If you have any other inputs on this approach, please let me know.

I also prefer option 2, but why are you looking for a framework?
The central application should only be responsible for managing the private key and issuing the tokens. Pulling in a framework to solve one service could be excessive.
You could also implement a validation service, but since the applications are yours, I suggest using an asymmetric key and verifying the token locally instead of sending remote validation requests to the central application. You can provide a simple library for your microservices to download the public key and perform the validation. Embed any of the libraries listed on JWT.io or build it from scratch; validating a JWT is really simple (see the sketch below).
If you needed to reject a token before its expiration time, for example using a blacklist, then a central service would be required. But I do not recommend this scheme because it breaks JWT statelessness.
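To show how little code local validation needs, here is a hedged sketch using the Nimbus JOSE+JWT library (one of the libraries listed on jwt.io). The issuer, subject and five-minute lifetime are made-up values; in a real setup the central service holds the private key and the microservices only ever download the public key.

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.interfaces.RSAPrivateKey;
import java.security.interfaces.RSAPublicKey;
import java.util.Date;

import com.nimbusds.jose.JWSAlgorithm;
import com.nimbusds.jose.JWSHeader;
import com.nimbusds.jose.crypto.RSASSASigner;
import com.nimbusds.jose.crypto.RSASSAVerifier;
import com.nimbusds.jwt.JWTClaimsSet;
import com.nimbusds.jwt.SignedJWT;

public class JwtSketch {

    public static void main(String[] args) throws Exception {
        // Key pair generated here only so the example is self-contained.
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair keyPair = gen.generateKeyPair();

        // --- Central service: issue the token with the private key ---
        JWTClaimsSet claims = new JWTClaimsSet.Builder()
                .subject("order-service")                        // hypothetical caller
                .issuer("https://auth.example.internal")         // hypothetical issuer
                .expirationTime(new Date(System.currentTimeMillis() + 5 * 60 * 1000))
                .build();
        SignedJWT jwt = new SignedJWT(new JWSHeader(JWSAlgorithm.RS256), claims);
        jwt.sign(new RSASSASigner((RSAPrivateKey) keyPair.getPrivate()));
        String token = jwt.serialize();

        // --- Any microservice: verify locally with the downloaded public key ---
        SignedJWT received = SignedJWT.parse(token);
        boolean signatureOk = received.verify(new RSASSAVerifier((RSAPublicKey) keyPair.getPublic()));
        boolean notExpired = received.getJWTClaimsSet().getExpirationTime().after(new Date());
        System.out.println("valid = " + (signatureOk && notExpired));
    }
}
```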

Both scenarios could be implemented in Spring Cloud Zuul.
For more info:
http://cloud.spring.io/spring-cloud-static/Brixton.SR7/#_router_and_filter_zuul
http://cloud.spring.io/spring-cloud-static/Brixton.SR7/#_configuring_authentication_downstream_of_a_zuul_proxy
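To make the gateway option a bit more concrete, here is a hedged sketch of a Zuul "pre" filter that rejects requests without a bearer token before they are routed downstream. The header handling and status code are assumptions for illustration, and real JWT validation would replace the placeholder comment.

```java
import javax.servlet.http.HttpServletRequest;

import com.netflix.zuul.ZuulFilter;
import com.netflix.zuul.context.RequestContext;

// Runs before routing: stops requests whose Authorization header is missing or malformed.
public class JwtPreFilter extends ZuulFilter {

    @Override
    public String filterType() {
        return "pre";
    }

    @Override
    public int filterOrder() {
        return 1;
    }

    @Override
    public boolean shouldFilter() {
        return true;
    }

    @Override
    public Object run() {
        RequestContext ctx = RequestContext.getCurrentContext();
        HttpServletRequest request = ctx.getRequest();
        String authHeader = request.getHeader("Authorization");
        if (authHeader == null || !authHeader.startsWith("Bearer ")) {
            ctx.setSendZuulResponse(false);   // do not forward downstream
            ctx.setResponseStatusCode(401);
            return null;
        }
        // ...validate the JWT signature and claims here before routing...
        return null;
    }
}
```

Declared as a Spring bean in an application annotated with @EnableZuulProxy, a filter like this runs for every routed request.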

Related

Microservice architecture structure with docker-compose : .NET 6

An ex-employee planned a microservice architecture which is being implemented now. I have a few questions regarding the design and I'd highly appreciate your feedback.
Explanation
Dematerialized UI has a matching dematerialized API.
Dematerialized API validates the user and generates a token via the SSO Library.
Flight API does the I/O validation & validates the request via the Validate Request microservice.
Flight API calls Booking API to get some bookings based on the UserId
Flight API calls Print Booking API to generate Messages using Generate Message Microservice
Print Booking API must call Data Access API to get data and then call Generate PDF microservices.
Data Access API calls the database for data.
My Project Structure
FlightBookingsMicroserice.V1            //solution
    ApiGatways                          //folder
        DMZ.API/DMZ.API.csproj          //Folder/project
    BuildingBlocks
        EventBus/EventBus.csproj
        EventBus/EventBusRabbitMQ
    Services
        SSO
            SSO.API/SSO.csproj
            SSO.UnitTests
        Flight
            Flight.API/Flight.API.csproj
            Flight.UnitTets
        //Similar for all
        ValidationRequest
        Booking
        PrintBooking
            PrintBooking.API.csproj
        DataAccess
            DataAccess.API.csproj
        GeneratePDF
        GenerateMessage
    UI
        UI
    Docker-compose
Questions
Should I be using Ocelot in DMZ.API.csproj, Flight API and Print Booking API?
Is my project structure a microservice way of development?
Should I continue to use ASP.NET Core Web API with .NET 6 for the Dematerialized API in orange, Function API in blue and Microservice in purple projects?
For validation, since the SSO token is passed from the Dematerialized UI, what if the token expires while CRUD operations have already been performed for some stages? [Rolling back changes is a hassle.]
Should each API access an identity server, validate the user passed, and generate its own token for its services in purple?
Thank you in advance.
The core question is whether you really need all those services and whether you are perhaps making things too complicated. The important thing is to really consider, and make sure you can justify, why you want to go down this route.
If you do synchronous API calls between the services, that creates coupling and in the long run a distributed monolith.
For question #4, you typically use one access token for the user to access the public service, and then a different set of internal tokens (machine-to-machine, also called client credentials in OpenID Connect parlance) between services, which have a totally different lifetime.
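As an illustration of that machine-to-machine flow, here is a hedged sketch of a client-credentials token request; the token endpoint URL, client id, secret and scope are placeholders, not values from the question.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class ClientCredentialsSketch {

    public static void main(String[] args) throws Exception {
        // Service-to-service credentials, completely separate from the user's token.
        String clientId = "flight-api";          // hypothetical client
        String clientSecret = "s3cr3t";          // hypothetical secret
        String basic = Base64.getEncoder()
                .encodeToString((clientId + ":" + clientSecret).getBytes());

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://idp.example.com/connect/token"))   // placeholder endpoint
                .header("Authorization", "Basic " + basic)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "grant_type=client_credentials&scope=booking.read"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON body contains access_token, token_type and expires_in.
        System.out.println(response.body());
    }
}
```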
Q1: Ocelot is an API gateway, which is the entry point for your requests. It should be the first layer/service met by a user request, sitting in front of your services, and it forwards each request to the appropriate service according to its configuration. So it lies in front of all the services you have. Some architectures provide additional API gateways for different reasons, for example a dedicated gateway for mobile requests.
Q2: Looking at the separate services (I can't understand the Function API, but I assume those are services as well), yes. But microservices development is not just about separating things; it's about design and identifying the services from the business context (Domain-Driven Design). It is very challenging to identify the services, their size, and the way they communicate with each other (asynchronous versus synchronous communication).
Q3: Microservices are not about languages and frameworks. One of the benefits of a microservices architecture is that it is not language- or framework-dependent; multiple languages may be used across the microservices. Choosing a language depends on organisation policy or your own reasons. If you are a .NET developer, then go for .NET.
Q4: All the services are registered with the identity server and validate the given token against it. The identity server generates tokens (there may be multiple tokens) with scopes. A request from an identified user always carries the token in the headers, and the services validate the incoming token by referring to the identity server. These tokens have a lifetime, and the identity server also issues refresh tokens for when the current token expires. Please look at the OAuth docs and RFCs. This series https://www.youtube.com/watch?v=Fhfvbl_KbWo&list=PLOeFnOV9YBa7dnrjpOG6lMpcyd7Wn7E8V may also help; you can skip the basic topics. I learned a lot from it.
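Where "validate the incoming token by referring to the identity server" is done remotely, the standard mechanism is RFC 7662 token introspection. A hedged sketch follows; the endpoint path and the API credentials are assumptions and depend on the identity server in use.

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class IntrospectionSketch {

    // Asks the identity server whether a received access token is still active.
    static String introspect(String accessToken) throws Exception {
        String apiCredentials = Base64.getEncoder()
                .encodeToString("flight-api:api-secret".getBytes());   // hypothetical API credentials

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://idp.example.com/connect/introspect"))  // placeholder endpoint
                .header("Authorization", "Basic " + apiCredentials)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "token=" + URLEncoder.encode(accessToken, StandardCharsets.UTF_8)))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON answer contains "active": true/false plus the token's claims.
        return response.body();
    }
}
```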

Can I have two clients (web app + native mobile app) with one client info in IdentityServer4

I have two client applications, an ASP.NET Core MVC web app and an Android native mobile app, plus an IdentityServer4 server as an OpenID server.
I know that I have to create two client records for them (in IS4's Clients table):
a Hybrid Flow for the web app
a Hybrid/Authorization Code + PKCE for the native mobile app
But I'm wondering whether I can create only one client record for both of them or not.
I think you should create one client definition for each client, so you can better separate them and evolve them as needed over time. It also makes it easier to tell them apart in the logs, for example.
A question, however, is why/whether you really need to support the hybrid flow. I think both clients only need the authorization code flow.
If you want to follow OAuth 2.1, then there are only two flows to use, either authorization code flow or client credentials flow. All other flows are not recommended due to various security issues. See https://oauth.net/2.1/
Tore's answer gives good reasons to keep those clients separated. If you're still not convinced, I would turn the question around - why do you want to have one logical client data used by two separate clients? This will cause some weird issues in the future. For example, at some point you might want to rate-limit one of those clients, or change client authentication method, or even block one of them completely. You will have to do it for both your apps if you don't create a separate client.
From the security point of view, there is a good reason to keep those two separated: the web client can be a confidential client with a secret assigned. The mobile client will be a public client, without a secret*. This is a solid reason not to mix those two, as you will lower the strength of your security considerably.
*In fact, best practice would be to use DCR and register a new client for each device where your app runs.
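IdentityServer4's client configuration lives in C#, so purely as an analogous illustration of keeping the two clients separate, here is a hedged sketch using Spring Authorization Server's RegisteredClient: one confidential web client with a secret, one public mobile client with PKCE required. The client ids, redirect URIs and scopes are placeholders, not anything from the question.

```java
import java.util.UUID;

import org.springframework.security.oauth2.core.AuthorizationGrantType;
import org.springframework.security.oauth2.core.ClientAuthenticationMethod;
import org.springframework.security.oauth2.server.authorization.client.RegisteredClient;
import org.springframework.security.oauth2.server.authorization.settings.ClientSettings;  // 1.x package

public class ClientDefinitions {

    // Confidential web client: has a secret, uses the authorization code flow.
    static RegisteredClient webClient() {
        return RegisteredClient.withId(UUID.randomUUID().toString())
                .clientId("web-app")                                   // hypothetical id
                .clientSecret("{noop}web-app-secret")                  // confidential client
                .clientAuthenticationMethod(ClientAuthenticationMethod.CLIENT_SECRET_BASIC)
                .authorizationGrantType(AuthorizationGrantType.AUTHORIZATION_CODE)
                .redirectUri("https://web.example.com/login/oauth2/code/web-app")
                .scope("openid")
                .build();
    }

    // Public mobile client: no secret, authorization code flow with PKCE required.
    static RegisteredClient mobileClient() {
        return RegisteredClient.withId(UUID.randomUUID().toString())
                .clientId("android-app")                               // hypothetical id
                .clientAuthenticationMethod(ClientAuthenticationMethod.NONE)
                .authorizationGrantType(AuthorizationGrantType.AUTHORIZATION_CODE)
                .redirectUri("com.example.app:/oauth2redirect")
                .scope("openid")
                .clientSettings(ClientSettings.builder().requireProofKey(true).build())
                .build();
    }
}
```

The separation mirrors the point above: the secret, the authentication method and the PKCE requirement all differ per client, so folding them into one record would force the weaker settings onto both.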

New single page app needs to authenticate to legacy app using Shibboleth

I am creating a new React SPA. Users of a legacy app need to be able to use the new app without re-authenticating. So I need to support SSO.
It's important to note that it is also required that users of other (currently unspecified) apps should also be able to use the new app without re-authenticating, so whatever approach I take needs to be sufficiently decoupled to potentially allow this.
The legacy app supports authentication via Shibboleth, the new app currently has no authentication method, but uses JWT for authorisation.
I'm wondering if anyone has any experience of such a scenario? It seems to me that I probably need to create an OAuth2 authorisation server for the new app to talk to, and I need to somehow bring Shibboleth into the mix for the authentication, maybe with the authorisation service acting as a Shibboleth Service Provider. Googling around hasn't revealed much useful info.
Is what I've described along the right lines? I know it's very high level and woolly, but I'm really not sure of the approach to take. Any advice, information or experience in this area would be gratefully received!
GOALS
It's a little bit of a subjective question, but the main goals are usually as follows:
Focus on building your UI and API security in a future facing manner
Also provide good Login Usability
Also deliver on non-functional requirements such as availability / reliability
AUTHORIZATION SERVER
On the first point, the modern option is to integrate UIs and APIs with an Authorization Server - perhaps as in My Tutorial. Your architecture is then good, but the migration is not trivial.
FEDERATING TO SHIBBOLETH
The Authorization Server can then redirect to Shibboleth and talk SAML2.0 to achieve Single Sign On, as you suggest. It is a complex solution though, and may be a backwards step in some ways.
AVAILABILITY
This is usually a big concern, and most companies use a cloud provider such as Azure / AWS due to its high availability / low maintenance / low cost. Would this be a better option for you?

Communication between two microservices in JHipster using JWT

I'm building a small microservice-based webapp using JHipster with JWT authorization. The Architecture is simple, one gateway and two services with repositories. The problem that I had for the last few hours is the communication between the two backend-services.
At first, I tried to find a token on the services themselves, but couldn't find one. If I just missed it in all the docs (quite overwhelming when beginning with the full stack :P), I would be happy to revert my changes and use the predefined token.
My second approach was for each service to authorize itself with the gateway at @PostConstruct and save the token in memory to use for each API call. It works without a problem, but I find it hard to believe that this functionality is not already built into JHipster.
So my question is whether my approach is the usual one. If neither approach is right and there are best practices for this, I'm also interested in them.
It depends on the use case.
For user requests, a common approach is: the calling service forwards the token it received to the other service, without going through the gateway, using @AuthorizedFeignClient (see the sketch after this answer).
For background tasks like scheduled jobs, your approach can be applied, or you could issue long-lived tokens as long as they have limited permissions through roles. This way you don't have to go through the gateway.
Keycloak's offline tokens approach could also inspire you.
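As a rough sketch of the first point, assuming a JHipster-generated microservice where the @AuthorizedFeignClient annotation is available (JHipster generates it in the application's own client package), a Feign client for a hypothetical "booking" service could look like this; the service name, endpoint and DTO are made up for illustration.

```java
import java.util.List;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;

// Feign client that forwards the caller's token to the "booking" service.
// @AuthorizedFeignClient comes from the JHipster-generated client package;
// "booking", the endpoint and BookingDTO are hypothetical.
@AuthorizedFeignClient(name = "booking")
public interface BookingClient {

    @GetMapping("/api/bookings")
    List<BookingDTO> findByUser(@RequestParam("userId") Long userId);
}
```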

oAuth implementation from the beginning or later

I'm starting to create a new system using .NET MVC, which is a relatively large-scale business management platform. There's some indication that we'll open the platform to the public once it is released and passes the market test.
We will be using ExtJS for the front end, which leads us to return most data in JSON format. This makes me wonder whether I should learn OAuth right now and try to embed the OAuth concept right from the beginning.
Basically, the platform we want to create will initially be fully implemented internally with a widget system; our boss wants to learn from Twitter and build just a core database, spreading the different features into modules that can be integrated into the platform. To secure it in the beginning, I proposed an intranet implementation, which is safer and doesn't require much authentication; however, they think it would be a once-and-for-all effort if we can get a good implementation like OAuth into the platform from the start. (We are a team of 6 and none of us knows much about OAuth, in fact!)
I don't know much about OAuth, so if it's worth implementing at the beginning of our system, I'll have to take a look and cast my vote for OAuth in our meeting next week. This may affect how we implement the whole web service, so may I ask anyone who has built a large-scale web service/application before to share some thoughts and advice?
Thanks.
OAuth 1 is nice if you want to use HTTP connections. If you can simply enforce HTTPS connections for all users, you might want to use OAuth 2, which is hardly more than a shared token between the client and server that's sent for each single request, plus a pre-defined way to get permission from the user via a web interface.
If you have to accept plain HTTP as well, OAuth 1 is really nice. It protects against replay attacks, packet injection or modification, uses a shared secret instead of shared token, etc. It is, however, a bit harder to implement than OAuth 2.
OAuth 2 is mostly about how to exchange username/password combinations for an access token, while OAuth 1 is mostly about how to make semi-secure requests to a server over an unencrypted connection. If you don't need any of that, don't use OAuth. In many cases, Basic HTTP Authentication via HTTPS will do just fine.
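As a point of comparison for that last sentence, here is a hedged sketch of Basic authentication over HTTPS; the URL and credentials are placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class BasicAuthSketch {

    public static void main(String[] args) throws Exception {
        // The credentials travel only Base64-encoded, so HTTPS is what actually protects them.
        String credentials = Base64.getEncoder()
                .encodeToString("apiuser:apipassword".getBytes());     // hypothetical credentials

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://platform.example.com/api/widgets"))   // placeholder URL
                .header("Authorization", "Basic " + credentials)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}
```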
OAuth is a standard for authentication and authorization. You can read about it and learn in many places. Generally, the standard lets a client register with the authorization server, and then whenever this client attempts to access a protected resource, it is directed to the auth server to get a token (first it gets a code, then it exchanges the code for a token). But this is only the general picture; there are tons of details and options here...
Basically, one needs a good reason to use OAuth. If a simpler authentication mechanism is good enough for you, go for it.
