This is the documentation I've read on bearer tokens.
I'm looking to set up a bearer token with an expires_in value of 1000000 (~11 days), and to store and use this token in AWS Secrets Manager - https://aws.amazon.com/secrets-manager/
Right now the process isn't automated. Every 11 days, I have to go and manually update AWS Secrets Manager with the new token value. I'm looking to automate this and set up a callback: whenever the token expires, I want to update AWS Secrets Manager via UpdateSecret with the new bearer token.
Does anyone have any references to code samples I could use for this? I was thinking I could use AWS Lambda to make the actual calls to update the secret in Secrets Manager, but I'm not sure how to write the trigger for the Lambda.
The function callable from the docs didn't have a parameter for specifying a callback function (for use when the token has expired).
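One possible shape for the automation, sketched below: since the token lifetime is known up front, the trigger can be a scheduled EventBridge (CloudWatch Events) rule that fires shortly before expiry (say, every 10 days) rather than a callback, and the Lambda both fetches a fresh token and overwrites the secret. TOKEN_URL, CLIENT_ID, CLIENT_SECRET, and SECRET_NAME are placeholder environment variables, and the token request itself will depend on your provider:

```python
import json
import os
import urllib.parse
import urllib.request

import boto3

secrets = boto3.client("secretsmanager")


def handler(event, context):
    # 1. Request a fresh bearer token from the provider (placeholder endpoint/credentials).
    form = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": os.environ["CLIENT_ID"],
        "client_secret": os.environ["CLIENT_SECRET"],
    }).encode()
    req = urllib.request.Request(os.environ["TOKEN_URL"], data=form, method="POST")
    with urllib.request.urlopen(req) as resp:
        token = json.load(resp)["access_token"]

    # 2. Overwrite the stored secret with the new token value.
    secrets.update_secret(
        SecretId=os.environ["SECRET_NAME"],
        SecretString=json.dumps({"bearer_token": token}),
    )
```

Secrets Manager also has a built-in rotation feature that invokes a rotation Lambda on its own schedule, which may be a cleaner fit than managing a separate rule.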
Related
I am having trouble understanding how to accomplish this. I have Firebase Functions running in my application. I am using an external API in which I can configure webhooks to hit an endpoint of my Firebase Functions to perform an action. To make sure that the call comes from this external API, they recommend using an OAuth2 flow. Mainly, they ask me to:
Provide us (the external API) with an ID and an access token;
these are used to access a URL which provides a bearer token;
this bearer token is then used to access the provided webhook URL until the bearer token expires after a predetermined period of time.
And there are 4 input fields:
1. OAuth2 access token url
2. OAuth2 client id
3. OAuth client secret
4. OAuth2 Scope. <---- NOT SURE WHAT THIS ONE MEANS
My question is: how do I generate the access token and the client ID for this external API?
What value should I put for the OAuth2 scope?
Thanks!
I was able to figure this out using Auth0. One of their docs clearly explained what I was trying to accomplish. Posting here for future reference in case anyone needs it.
Thanks all!
reference: https://auth0.com/docs/authorization/flows/client-credentials-flow#learn-more
You can generate the client ID and client secret in the Console > Credentials.
The Cloud Functions API OAuth2 scope is https://www.googleapis.com/auth/cloud-platform.
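For reference, what the external API does with those four fields is a standard client credentials token request (RFC 6749, section 4.4). A rough sketch with placeholder values; note that Auth0 in particular also expects an audience parameter identifying the API:

```python
import requests

# Placeholders: substitute the values behind the four input fields.
TOKEN_URL = "https://YOUR_AUTH_SERVER/oauth/token"        # "OAuth2 access token url"
CLIENT_ID = "YOUR_CLIENT_ID"                              # "OAuth2 client id"
CLIENT_SECRET = "YOUR_CLIENT_SECRET"                      # "OAuth client secret"
SCOPE = "https://www.googleapis.com/auth/cloud-platform"  # "OAuth2 Scope"

resp = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": SCOPE,
    },
    timeout=10,
)
resp.raise_for_status()
bearer_token = resp.json()["access_token"]

# The external API then calls the webhook URL with
# "Authorization: Bearer <bearer_token>" until the token expires.
```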
I have a back-end processor (imagine a cron job generating reports once a day) that needs to integrate with a third-party system. Their APIs only support the "Authorization code" grant type. The problem is I can't even fill out a request for a token, as I don't have a redirect_uri (no website), and I definitely don't have a user of any kind. I'll just have the OAuth client ID and secret I provisioned via their developer portal (Mashery) for my back-end report processor app.
I want to use the "Client credentials" grant type/flow since I'm just a back-end service.
Is there any way to fake this or hack it so my little back-end service can somehow work with authorization code flow?
Thanks in advance
No, there is no way to hack it. Client credentials only authenticate the client. A token issued through client credentials has no information about the user. If their API needs information about the user (you would probably only get data for your own user anyway), then you need a token issued with the Code Flow.
What you can do is generate the OAuth token yourself. E.g. you can use oauth.tools to perform a Code Flow against their Authorization Server, or you can perform the flow from a browser with a dummy redirect URI (e.g. http://localhost), then take the code returned from the authorization request and perform the token request with curl.
Once you have an access and refresh token, you can hard-code them in your script (or read them from an environment variable or file, etc.). You can then call the API as long as the access token is valid, and use the refresh token to get a new access token when it expires. You will not have to perform a new Code Flow for as long as the refresh token is valid.
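A rough sketch of that refresh step (token endpoint and credentials are placeholders from the developer portal; it is the standard OAuth2 refresh_token grant, and some providers rotate the refresh token on each use):

```python
import requests


def refresh_access_token(token_url, client_id, client_secret, refresh_token):
    """Exchange a refresh token for a new access token."""
    resp = requests.post(
        token_url,
        data={
            "grant_type": "refresh_token",
            "refresh_token": refresh_token,
            "client_id": client_id,
            "client_secret": client_secret,
        },
        timeout=10,
    )
    resp.raise_for_status()
    body = resp.json()
    # If the provider rotates refresh tokens, persist the new one.
    return body["access_token"], body.get("refresh_token", refresh_token)
```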
I am working on a Power Automate solution which reads data from O365 via the Graph API. As the operation runs for more than an hour, the bearer token expires.
I have implemented logic that performs a REST call to regenerate the bearer token from the refresh token whenever a call fails, and reruns the failed operation in a Do Until loop. But as I have many calls going through the Graph API, I would have to repeat this logic all over the Flow.
Please let me know whether there is any simple way to regenerate the bearer token from the refresh token.
Any help would be appreciated!!!
We would recommend starting with the samples published by Microsoft. Microsoft recommends using the MSAL library, which provides token caching and gets a fresh token when the cached one is getting close to expiration. For more details on token caching, please refer to this documentation.
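For illustration, a minimal MSAL (Python) sketch of the app-only variant; client ID, secret, and tenant are placeholders. Recent MSAL versions serve acquire_token_for_client from their in-memory cache while the token is still valid and only go back to the token endpoint near expiry (for a delegated refresh token there is an acquire_token_by_refresh_token helper to seed the cache):

```python
import msal

# Placeholders: the app registration used for Graph access.
app = msal.ConfidentialClientApplication(
    client_id="YOUR_CLIENT_ID",
    client_credential="YOUR_CLIENT_SECRET",
    authority="https://login.microsoftonline.com/YOUR_TENANT_ID",
)


def get_graph_token() -> str:
    # Served from the cache when valid; refreshed automatically near expiry.
    result = app.acquire_token_for_client(
        scopes=["https://graph.microsoft.com/.default"]
    )
    if "access_token" not in result:
        raise RuntimeError(result.get("error_description", "token request failed"))
    return result["access_token"]
```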
Store the token in a variable and use a parallel branch to get a new token every 15 or 20 minutes. Use the token variable in all API calls.
I have an API that's protected by Auth0.
I want my users to have CLI access to this API as well. I was considering using the one-time password flow initially when the developer signs in to the CLI to request a refresh token, and then persisting this on disk for future use.
But this just feels wrong. Is there any other more secure approach?
The CLI might be used on build servers etc, so I guess it has to be a permanent token that lasts forever.
While we're at it, what do other APIs do, for instance GitHub when I request a Personal Access Token? Is that the same?
Did you check this? https://auth0.com/docs/flows/guides/device-auth/call-api-device-auth.
By using this flow on a CLI, the user logs in interactively through a browser and confirms a code that the CLI displays. This gives you back an access token and a refresh token. You could use the refresh token in your CI process to obtain new tokens on every build (or whenever the access token expires).
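A rough sketch of that device flow from a CLI; it follows RFC 8628, and the Auth0 domain, client ID, and scopes below are placeholders:

```python
import time

import requests

DOMAIN = "YOUR_TENANT.auth0.com"   # placeholder Auth0 domain
CLIENT_ID = "YOUR_CLI_CLIENT_ID"   # placeholder native/CLI client

# Step 1: request a device code and tell the user where to confirm it.
device = requests.post(
    f"https://{DOMAIN}/oauth/device/code",
    data={"client_id": CLIENT_ID, "scope": "openid profile offline_access"},
    timeout=10,
).json()
print(f"Visit {device['verification_uri_complete']} and confirm code {device['user_code']}")

# Step 2: poll the token endpoint until the user approves in the browser.
while True:
    time.sleep(device.get("interval", 5))
    body = requests.post(
        f"https://{DOMAIN}/oauth/token",
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:device_code",
            "device_code": device["device_code"],
            "client_id": CLIENT_ID,
        },
        timeout=10,
    ).json()
    if "access_token" in body:
        break
    if body.get("error") not in ("authorization_pending", "slow_down"):
        raise RuntimeError(body)

# body now holds an access token and, thanks to offline_access, a refresh token
# that the CLI (or CI job) can persist and redeem for new access tokens later.
```

The refresh token is only issued when the offline_access scope is requested (and, in Auth0, when the API allows offline access).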
I have an architecture with multiple microservices in which I intend to apply security.
My View of the Security Design:
Authentication will happen against LDAP, and when the user is authenticated, a JSON Web Token (JWT) will be generated using a "secret key"; the token will contain the roles, expiration time, etc. With every call to a microservice, this token will be passed in a header for authorization. In my view I just have one single auth server which authenticates the user and generates the JWT.
My Doubt:
Now, when a microservice receives a call (containing the JWT in the header), will it always hit the auth server to get the token verified?
If yes, won't that lead to many calls to the auth server, and thus be bad practice?
If no, how will the client verify the token, and what is the scope of the auth server?
JWTs are always signed, so you can verify a given token without making a call to some central auth instance. The auth server knows the secret used to sign the token, and all services that want to validate the token also need a way to check this.
There are two different approaches to signing:
symmetric: the payload is hashed together with a shared secret (an HMAC, e.g. HS256). The consuming service also needs to know this secret and verifies by recomputing the signature over the received payload and comparing it with the transmitted one.
asymmetric: with public-key signing (e.g. RS256), only the auth server holds the private key used to sign the token. All consuming services then only need the public part to verify.
I prefer the second way, because it reduces the chance of a key being stolen: if one of the consuming services is hijacked, no secret is lost that would let the attacker create valid tokens. Validating with such an algorithm may cost somewhat more time / CPU cycles than the simple hash used in the symmetric approach.
Please see the official JWT page for an example of the different mechanisms: https://jwt.io/
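To illustrate the asymmetric case, a minimal sketch using the PyJWT library (the key path is a placeholder): each microservice holds only the auth server's public key and validates the token locally, so no per-request call to the auth server is needed.

```python
import jwt  # PyJWT

# Each microservice ships with (or fetches once) the auth server's public key;
# the private key never leaves the auth server.
with open("auth_server_public.pem", "rb") as f:   # placeholder path
    PUBLIC_KEY = f.read()


def verify(token: str) -> dict:
    # Checks the RS256 signature and the exp claim locally.
    claims = jwt.decode(token, PUBLIC_KEY, algorithms=["RS256"])
    return claims  # e.g. claims["roles"], claims["exp"]

# The symmetric (HS256) variant would instead pass the shared secret:
#   jwt.decode(token, SHARED_SECRET, algorithms=["HS256"])
```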