Authenticating azcopy or az storage CLI to upload to the Azurite Docker emulator

I started an Azurite Docker container on a local VM and then tried to copy data to it with azcopy and the az CLI, like below:
export AZURE_STORAGE_ACCOUNT="devstoreaccount1"
export AZURE_STORAGE_ACCESS_KEY="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
azcopy copy /local/data/ http://localvm:10000/devstoreaccount1/data/test --from-to LocalBlob
INFO: Scanning...
failed to perform copy command due to error: Login Credentials missing. No SAS token or OAuth token is present and the resource is not public
I want to authenticate with the account name and account key, and preferably be able to copy using azcopy.
I scoured GitHub and Stack Overflow and found only one issue, https://github.com/Azure/azure-storage-azcopy/issues/867, and there is nothing there regarding auth. It looks like I am missing something obvious. Your help will be much appreciated.
The versions used were:
azure-cli 2.11.1
azcopy version 10.7.0

I was able to get away with using the az CLI instead of azcopy.
export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://azuritedockerhost:10000/devstoreaccount1;"
az storage blob upload -f local-file -c container-name -n dir/blob-name
Hope this helps someone. It would also be really nice to be able to use azcopy, so if anybody finds out how, it will be greatly appreciated.
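If anyone wants to try azcopy anyway, one route that might work (a sketch I have not verified against Azurite; the container name data and host azuritedockerhost come from the commands above) is to have the az CLI generate a container SAS from the emulator connection string and append it to the destination URL:
# Untested sketch: generate a SAS with az CLI, then hand the SAS URL to azcopy
sas=$(az storage container generate-sas --name data --permissions dlrw \
    --expiry 2030-01-01 --connection-string "$AZURE_STORAGE_CONNECTION_STRING" --output tsv)
azcopy copy /local/data/ "http://azuritedockerhost:10000/devstoreaccount1/data?${sas}" --recursive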

The Microsoft documentation 'Get started with AzCopy' indicates the following under the 'Run AzCopy' heading:
As an owner of your Azure Storage account, you aren't automatically
assigned permissions to access data. Before you can do anything
meaningful with AzCopy, you need to decide how you'll provide
authorization credentials to the storage service.
Under the next heading 'Authorize AzCopy', the documentation states:
You can provide authorization credentials by using Azure Active Directory
(AD), or by using a Shared Access Signature (SAS) token.
Even though you're accessing a local storage emulator (Azurite) on your local machine, the AzCopy app wants an OAuth token or SAS token. See this link to generate SAS tokens for local storage or online storage.
A SAS token must be appended to the destination parameter in the azcopy copy command. I use the Azure AD (OAuth token) authorization option so that I can run multiple azcopy commands without appending a SAS token to every command.
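As a sketch of the SAS option (the <sas-token> placeholder stands for a token you generate yourself):
# Sketch: SAS token appended to the destination URL after '?'
azcopy copy /local/data/ "http://localvm:10000/devstoreaccount1/data/test?<sas-token>" --from-to LocalBlob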
To resolve the AzCopy error you're getting "Failed to perform copy command due to error: Login Credentials missing. No SAS token or OAuth token is present and the resource is not public", enter the following into a command prompt or Windows PowerShell:
azcopy login --tenant-id=<your-tenant-directory-id-from-azure-portal>
and then follow the steps this command returns. Here's a reference to azcopy login. From the heading 'Authorize without a secret store' in this reference:
The azcopy login command retrieves an OAuth token and then places that
token into a secret store on your system.
From the 'Authorize a user identity' heading:
After you've successfully signed in, you can close the browser window
and begin using AzCopy.
Use azcopy logout from a command prompt when you are finished, so no further AzCopy commands can run with the stored token.
Here are the steps with screen captures for the login process as well as where to find a tenant ID to get the AzCopy login process going.
Get tenant ID from the Azure portal.
In a command prompt enter the azcopy login command along with the --tenant-id parameter.
Follow the steps indicated in the command prompt: "...use a web browser to open the page https://microsoft.com/devicelogin and enter the code...".
"A sign-in window will appear. In that window, sign into your Azure account by using your Azure account credentials."
"After you've successfully signed in, you can close the browser window and begin using AzCopy."
You can then run your original azcopy copy /local/data/ http://localvm:10000/devstoreaccount1/data/test --from-to LocalBlob without the need for the export entries in your question.
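Put together, the flow looks roughly like this (a sketch; <your-tenant-id> is a placeholder, and it assumes the target accepts OAuth tokens):
azcopy login --tenant-id=<your-tenant-id>    # complete the device-code sign-in in a browser
azcopy copy /local/data/ http://localvm:10000/devstoreaccount1/data/test --from-to LocalBlob
azcopy logout                                # sign out when you are done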

AzCopy deliberately avoids support for account key authentication, because an account key has full admin privileges: https://github.com/Azure/azure-storage-azcopy/issues/186
The only workaround I have found so far is to generate a SAS (for the container) in Azure Storage Explorer, and then use the SAS URL with AzCopy.
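With a container SAS from Storage Explorer, the command would look something like this (sketch; <container-sas> is a placeholder for the token you generate):
azcopy copy /local/data/ "http://localvm:10000/devstoreaccount1/data?<container-sas>" --recursive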

Related

Berglas not finding my google cloud credentials

I am trying to read my google cloud default credentials with berglas, and it says that:
failed to create berglas client: failed to create kms client: google: could not find default credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
I am passing the right path, and I have tried many paths, but none of them work:
$HOME/.config/gcloud:/root/.config/gcloud
I'm unfamiliar with Berglas (please include references) but the error is clear. Google's client libraries attempt to find credentials automatically. The documentation describes the process by which credentials are sought.
Since the credentials aren't being found, you're evidently not running on a Google Cloud compute service (where credentials are found automatically). Have you set the GOOGLE_APPLICATION_CREDENTIALS environment variable, and is it pointing to a valid service account key file?
The Berglas README suggests using the following command to make your user credentials available as Application Default Credentials. You may not have completed this step:
gcloud auth application-default login
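For example (a sketch; the key path and image name are placeholders, not anything Berglas-specific):
# Option 1: point Application Default Credentials at a service account key file
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/my-sa-key.json"
# Option 2: create user ADC with gcloud, then mount it into the container as in the question
gcloud auth application-default login
docker run -v "$HOME/.config/gcloud:/root/.config/gcloud" my-berglas-image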

How to authorize Google API inside of Docker

I am running an application inside of Docker that requires me to leverage google-bigquery. When I run it outside of Docker, I just have to go to the link below (redacted) and authorize. However, the link doesn't work when I copy-paste it from the Docker terminal. I have tried port mapping as well, with no luck.
Code:
from google.oauth2 import service_account
from google.cloud import bigquery

# Load credentials from a service account key file.
credentials = service_account.Credentials.from_service_account_file(
    key_path, scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
# Make clients.
client = bigquery.Client(credentials=credentials, project=credentials.project_id)
Response:
requests_oauthlib.oauth2_session - DEBUG - Generated new state
Please visit this URL to authorize this application:
Please see the available solutions on this page; it is constantly updated:
gcloud credential helper
Standalone Docker credential helper
Access token
Service account key
In short, you need to use a service account key file. Make sure you either use a secret manager or issue a service account key file dedicated to the Docker image.
You then need to place the service account key file into the Docker container, either at build time or at runtime.
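A minimal runtime sketch (paths and the image name are placeholders; if your code loads the key explicitly via key_path, pass the mounted path instead of relying on the environment variable):
# Sketch: mount the key read-only and tell the client library where to find it
docker run \
  -v /local/secrets/key.json:/secrets/key.json:ro \
  -e GOOGLE_APPLICATION_CREDENTIALS=/secrets/key.json \
  my-bigquery-image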

Docker login: access denied you must use a personal access token

Trying to log in from Docker to GitLab using the command:
sudo docker login registry.gitlab.com?private_token=XXX
But I still have the following error message:
Error response from daemon: Get https://registry.gitlab.com/v2/: unauthorized: HTTP Basic: Access denied\nYou must use a personal access token with 'api' scope for Git over HTTP.\nYou can generate one at https://gitlab.com/-/profile/personal_access_tokens
The token has the right access, I double-checked... I am rather new to Docker; any hint/help? Thanks!
The correct command line (that works in my case at least) was:
docker login registry.example.com -u <your_username> -p <your_personal_access_token>
If you are using two-factor authentication, then a personal access token is required.
More information is available on the following webpage:
https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html
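To keep the token out of shell history, you can also pipe it in via --password-stdin (a sketch; GITLAB_PAT is a placeholder variable holding your token):
echo "$GITLAB_PAT" | docker login registry.gitlab.com -u <your_username> --password-stdin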
According to https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html, your username actually gets ignored:
Though required, GitLab usernames are ignored when authenticating with a personal access token. There is an issue for tracking to make GitLab use the username.
So, if you're not able to connect, it might not be because of the username.

Can login into docker-registry but not push image (github)

So I want to use the Docker registry from GitHub.
I do the following:
docker login docker.pkg.github.com --username username
docker build . --tag docker.pkg.github.com/user-name/repo/IMAGENAME:snapshot
docker push docker.pkg.github.com/user-name/repo/IMAGENAME:snapshot
Note that the repository is private and not mine but I got write access to it.
When I go to the Packages tab I can also see the instructions on how to get started, and I follow them (kind of; I tag the Docker image in one go).
But when I run the three commands at the top, I get the following output (the push command fails):
unauthorized: Your token has not been granted the required scopes to execute this query. The 'id' field requires one of the following scopes: ['read:packages'], but your token has only been granted the: [''] scopes. Please modify your token's scopes at: https://github.com/settings/tokens.
When I visit the site referenced, there is nothing there, only unrelated tokens.
Any ideas what I could try or what may cause this...?
You need to use an API token to log in, as shown in the docs. Logging in with a password is not possible.
https://help.github.com/en/packages/using-github-packages-with-your-projects-ecosystem/configuring-docker-for-use-with-github-packages
You must use a personal access token.
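Concretely, something like this (a sketch; GITHUB_PAT and <username> are placeholders, and the token needs at least the read:packages and write:packages scopes, plus repo for private repositories):
echo "$GITHUB_PAT" | docker login docker.pkg.github.com -u <username> --password-stdin
docker push docker.pkg.github.com/user-name/repo/IMAGENAME:snapshot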

How to use gcloud auth list in python

We can run "gcloud auth list" to get our credentialed account, and now I want to do the same thing in my Python code, that is, check the credentialed account via an API in Python. But I didn't find it... Any suggestions?
More information is:
I want to check my account name before I create credentials
from oauth2client.client import GoogleCredentials

CREDENTIALS = GoogleCredentials.from_stream(ACCOUNT_FILE)
CREDENTIALS = GoogleCredentials.get_application_default()
gcloud stores credentials obtained via
gcloud auth login
gcloud auth activate-service-account
in its internal local database. There is no API besides the gcloud auth list command to query them. Note that this is different from (usually a subset of) the list of credentials in GCP.
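If all you need is the account name, your code can shell out to gcloud and ask for machine-readable output, for example (a sketch):
# Print only the currently active account
gcloud auth list --filter=status:ACTIVE --format="value(account)"
# Or dump all credentialed accounts as JSON for a script to parse
gcloud auth list --format=json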
Credentials used by gcloud are meant to be separate from what you use in your python code.
Perhaps you want to use gcloud iam service-accounts keys list (https://cloud.google.com/sdk/gcloud/reference/iam/service-accounts/keys/list); there is also an API for that (https://cloud.google.com/iam/docs/creating-managing-service-accounts).
For application default credentials you would download a JSON key file using the developer console (https://console.cloud.google.com/iam-admin/serviceaccounts/project?project=YOUR_PROJECT) or use the gcloud iam service-accounts keys create command.
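For reference, those key commands look roughly like this (a sketch; the service account address and key file name are placeholders):
gcloud iam service-accounts keys list --iam-account=my-sa@my-project.iam.gserviceaccount.com
gcloud iam service-accounts keys create key.json --iam-account=my-sa@my-project.iam.gserviceaccount.com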
There is also the gcloud auth application-default login command, which will create an application default credentials file in a well-known location, but you should not use it for anything serious except perhaps development/testing. Note that credentials obtained via this command do not show up in gcloud auth list.
