Getting OAuth token using MSAL PublicClientApplication acquire_token_interactive method from Databricks is not working: InteractiveBrowserCredential

I am trying to get an OAuth 2.0 token for a protected resource using the InteractiveBrowserCredential flow.
This works from my local Jupyter notebook; however, when I run it from a Databricks notebook, it is unable to open a browser (the Databricks cluster has no browser installed) and gives me the message below:
Found no browser in current environment. If this program is being run inside a container which has access to host network (i.e. started by `docker run --net=host -it ...`), you can use browser on host to visit the following link. Otherwise, this auth attempt would either timeout (current timeout setting is None) or be aborted by CTRL+C. Auth URI: https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/authorize?client_id={client_id}&response_type=code&redirect_uri=http%3A%2F%2Flocalhost%3A44093&scope={resource_id}%2Fuser_impersonation+offline_access+openid+profile&state=EvgdkFcNZTuJG&code_challenge=KR8zwfjhkuKYTGSlbaYAJNLVjXZHiE&code_challenge_method=S256&nonce=33a1a12813342535455f398GHATf9c2cf21a8&client_info=1
I am trying to find out if there is a way I can make it work (for example, by somehow pointing a public redirect_uri at the Databricks cluster's driver node, or something similar). I can alternatively use the device_code flow (it is working), but I want to see if I can bypass the extra step of entering the device code and authenticate directly in the browser.
Please find below the sample code I am using now:
import msal

# self.CLIENT_ID, self.AUTHORITY and self.SCOPE come from the surrounding class.
app = msal.PublicClientApplication(self.CLIENT_ID, authority=self.AUTHORITY, token_cache=msal.TokenCache())
# acquire_token_interactive tries to open the system browser, which fails on the cluster.
result = app.acquire_token_interactive(scopes=self.SCOPE)
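For comparison, here is a minimal sketch of the device_code alternative mentioned above (it keeps the extra step of entering a code, but needs no browser on the cluster itself). The CLIENT_ID, AUTHORITY and SCOPE placeholders mirror the attributes used above; this is an illustration, not code from the original post.

import msal

app = msal.PublicClientApplication(CLIENT_ID, authority=AUTHORITY)

# Start the device code flow; the returned dict contains the verification URL and user code.
flow = app.initiate_device_flow(scopes=SCOPE)
print(flow["message"])  # e.g. "To sign in, use a web browser to open https://microsoft.com/devicelogin and enter the code ..."

# Blocks until the user completes sign-in from a browser on another machine.
result = app.acquire_token_by_device_flow(flow)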

Related

How can I authenticate the Google Cloud Video Intelligence API in a Golang Docker container running on a Google virtual machine using a service account?

I'm trying to make a request in Go, client.AnnotateVideo(ctx, &annotateVideoRequest), to the Google Cloud Video Intelligence API using the package cloud.google.com/go/videointelligence/apiv1.
I noticed that if I'm on a Google VM, I don't need any credentials or environment variables, because the API says:
For API packages whose import path is starting with "cloud.google.com/go",
such as cloud.google.com/go/storage in this case, if there are no credentials
provided, the client library will look for credentials in the environment.
But I guess I can't authenticate because I'm running a Docker container inside the Google VM, and I don't know if I really need a credentials file in that Docker container, because I don't know whether the library automatically creates a credentials file or just checks whether $GOOGLE_APPLICATION_CREDENTIALS is set and uses that (but that makes no sense; I'm on a Google VM, and I'm supposed to have that permission).
The error is:
PermissionDenied: The caller does not have permissions
Some links that might be helpful:
https://pkg.go.dev/cloud.google.com/go/storage
https://cloud.google.com/docs/authentication#environment-service-accounts
https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-go
https://cloud.google.com/video-intelligence/docs/common/auth#adc
Thanks in advance!

Authorizing client libraries without access to a web browser - gcloud auth application-default login

When I used to run either command:
gcloud auth application-default login
OR, for a specific Docker container:
docker exec -it 822c4c491383 /home/astro/google-cloud-sdk/bin/gcloud auth application-default login
My command line would give me a link to a Google response page where I'd copy the code they gave me and paste it into the command line.
For some reason, whenever I try to run either command now, I get the following error, saying I don't have access to a web browser.
You are authorizing client libraries without access to a web browser. Please run the following command on a machine with a web browser and
copy its output back here. Make sure the installed gcloud version is
372.0.0 or newer.
gcloud auth application-default login --remote-bootstrap="https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=764086051850-6qr4p6gpi6hn506pt8ejuq83di341hur.apps.googleusercontent.com&scope=openid+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Faccounts.reauth&state=FmMFY6gvpOa9xndMXmWiNG3W1jDrCe&access_type=offline&code_challenge=zUI4n_pnYE5V7p0diDQLmL0X0Sk8XpTDzhz_vwtukOo&code_challenge_method=S256&token_usage=remote"
I've tried copying the link that's inside of this and placing it in my web browser, but I get a page saying:
Error 400: invalid request Missing required parameter: redirect uri
Edit: Though I'm not sure why this is happening now, I added the option "--no-launch-browser" to the end of both commands; it now gives me the link to place in my browser manually, and I copy the code back.
On versions of gcloud >= 383.0.0 (26 Apr 2022), Google has removed support for the --console-only and --no-launch-browser flags on their CLI. As far as I can see, they do not give a reason for this, but it is likely security related.
The new intended method for authenticating on a machine without a web browser is to use the --no-browser flag and copy the command it gives you onto a machine that has both gcloud >= 372.0 and a web browser installed. In other words, it is no longer possible to do this purely on a machine with no browser. See the following steps, copied directly from their documentation:
Follow these steps:
Copy the long command that begins with "gcloud auth login --remote-bootstrap=".
Paste and run this command on the command line of a different, trusted machine that has local installations of both a web browser and the gcloud CLI version 372.0 or later.
Copy the long URL output from the machine with the web browser.
Paste the long URL back to the first machine under the prompt, Enter the output of the above command, and press Enter to complete the authorization.
Use gcloud init --console-only
"--console-only" below still works even though it's deprecated:
gcloud init --console-only
And "--no-launch-browser" below still works even though it's deprecated:
gcloud init --no-launch-browser
"--no-browser" below doesn't work yet but "--no-browser" will replace "--console-only" and "--no-launch-browser" so in the future, "--no-browser" will work while "--console-only" and "--no-launch-browser" won't work in the future:
gcloud init --no-browser
This can happen because the redirect URI does not contain the whole URL. It can be fixed by adjusting the Custom URL Base.
The result will look like this:
https://my_company_artifactory:444/artifactory
You should also double-check that the Custom URL Base plus /api/oauth2/loginResponse is included in the Authorized redirect URIs on your Google OAuth settings page.
Reviewing this for more information: if you try to add your localhost URL to the redirect URIs after creation, it may say it's not possible at this time; however, when you set the redirect URL before hitting the create button, it accepts it just fine.

How to authorize Google API inside of Docker

I am running an application inside of Docker that requires me to leverage google-bigquery. When I run it outside of Docker, I just have to go to the link below (redacted) and authorize. However, the link doesn't work when I copy and paste it from the Docker terminal. I have tried port mapping as well, with no luck.
Code:
from google.cloud import bigquery
from google.oauth2 import service_account

# Load explicit credentials from a service account key file.
credentials = service_account.Credentials.from_service_account_file(
    key_path, scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
# Make clients.
client = bigquery.Client(credentials=credentials, project=credentials.project_id)
Response:
requests_oauthlib.oauth2_session - DEBUG - Generated new state
Please visit this URL to authorize this application:
Please see the available solutions on this page; it's constantly updated:
gcloud credential helper
Standalone Docker credential helper
Access token
Service account key
In short, you need to use a service account key file. Make sure you either use a secret manager or issue a service account key dedicated to this Docker image.
You need to place the service account key file into the Docker container either at build time or at runtime.
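As a rough sketch of that approach (the mount path and environment-variable wiring here are assumptions for illustration, not from the original post), the key file can be mounted into the container and exposed through the standard GOOGLE_APPLICATION_CREDENTIALS variable, after which the client needs no explicit credentials object:

import os
from google.cloud import bigquery

# Assumes the key was mounted into the container, e.g.
#   docker run -v /host/sa-key.json:/secrets/sa-key.json \
#              -e GOOGLE_APPLICATION_CREDENTIALS=/secrets/sa-key.json ...
os.environ.setdefault("GOOGLE_APPLICATION_CREDENTIALS", "/secrets/sa-key.json")

# With the variable set, Application Default Credentials find the key file automatically.
client = bigquery.Client()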

Is it possible to use `externalbrowser` authenticator inside docker container for connection authentication with Snowflake?

I am trying to use the Snowflake connector inside a Docker container. I want to use the externalbrowser authenticator so that I can connect using Okta credentials, but the connector is failing with the error mentioned below.
DatabaseError: (snowflake.connector.errors.DatabaseError) 250008 (08001): None: Failed to connect to DB: xx.snowflakecomputing.com:443, Unable to open a browser in this environment.
(Background on this error at: http://sqlalche.me/e/13/4xp6)
As an aside, I'd recommend removing your account name from the question (shown in the error).
You are correct that the "externalbrowser" option is browser-based SSO. It might be possible to get this running in a Docker container with some extended software and configuration, but I wouldn't recommend it, as it doesn't seem worth the effort.
Instead, there are alternative SSO authentication methods you can look at, such as native SSO through Okta, key-pair authentication, or external OAuth. These won't require the browser.
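For illustration, here is a minimal sketch of the key-pair option with the Python connector, assuming an RSA key pair has already been generated and its public key registered for the Snowflake user (the file path and placeholder values below are assumptions):

import snowflake.connector
from cryptography.hazmat.primitives import serialization

# Load the private key whose public half is registered with the Snowflake user (placeholder path).
with open("/path/to/rsa_key.p8", "rb") as key_file:
    private_key = serialization.load_pem_private_key(key_file.read(), password=None)

# Key-pair authentication needs no browser, so it works inside a container.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<username>",
    private_key=private_key.private_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption(),
    ),
)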

Umbraco headless Node.js client cannot authenticate headless client

I have been trying to implement the Node.js client for Umbraco headless. I have done the following:
Set up Umbraco headless via my Umbraco cloud subscription
Implemented a simple app in Vue.js
Copied the example code from https://our.umbraco.com/documentation/Umbraco-Cloud/Headless/Headless-Node-Client/
When I run this code (via my localhost:8000) I get an authentication error:
https://{MyUmbracoCloudUrl}/umbraco/rest/oauth/token 400 (Bad Request).
My config has the correct domain name for the cloud instance and the correct user name and password.
I get a 400 response from https://{MyDOMAIN}/umbraco/rest/oauth/token.
Does anyone have any ideas? I am not sure if this is a CORS issue because I am trying to run this from my localhost.
Cheers
L
