Docker login: access denied you must use a personal access token - docker

Trying to log in from Docker to GitLab using the command:
sudo docker login registry.gitlab.com?private_token=XXX
But I keep getting the following error message:
Error response from daemon: Get https://registry.gitlab.com/v2/: unauthorized: HTTP Basic: Access denied\nYou must use a personal access token with 'api' scope for Git over HTTP.\nYou can generate one at https://gitlab.com/-/profile/personal_access_tokens
The token has the right access, I double-checked... I am rather new to Docker, any hint/help? Thanks!

The correct command line (that works in my case at least) was:
docker login registry.example.com -u <your_username> -p <your_personal_access_token>
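If you'd rather not put the token on the command line (where it ends up in shell history), docker login can also read the password from stdin; a minimal sketch with placeholder values:
echo "<your_personal_access_token>" | docker login registry.example.com -u <your_username> --password-stdin   # placeholders, not real values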

If you are using two-factor authentication, then a personal access token is required.
More information on the following webpage:
https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html

According to https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html, your username actually gets ignored:
Though required, GitLab usernames are ignored when authenticating with a personal access token. There is an issue for tracking to make GitLab use the username.
So, if you're not able to connect, it might not be because of the username.

Related

Pushing docker image to gitlab fails: unauthorized

I'm trying to push an image to my GitLab registry, which I previously built successfully.
docker login registry.gitlab.com
I give the credentials and it returns "Login Succeeded".
Then, as always, I do a
docker push registry.gitlab.com/username/registry/base:latest
And it ends with
unauthorized: authentication required
I already tried to
docker logout registry.gitlab.com
and log in again.
The process can be found here; it's pretty simple:
link to github/gitlabhq
I'm used to doing it like that; it's the first time I've faced this issue and I don't understand it.
Any help appreciated!
Ensure that your account has read/write access to the registry you are trying to access. What you might need to do is create a new access token, as there is a difference between API/access tokens and your "normal" user password. Use this access token as described in the documentation (https://github.com/gitlabhq/gitlabhq/blob/master/doc/user/packages/container_registry/index.md#authenticate-with-the-container-registry):
docker login registry.example.com -u <username> -p <token>
The token can be created by going to Edit Profile -> Access Tokens -> Select Scopes -> Ticking off 'Read registry' & 'Write registry'
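Putting it together for the registry from the question (the username and token are placeholders; the image path is the one from the question above):
echo "<your_personal_access_token>" | docker login registry.gitlab.com -u <username> --password-stdin   # placeholder values
docker push registry.gitlab.com/username/registry/base:latest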
GitLab support told me this is due to a built-in limitation on the token duration.
You cannot customize this duration in SaaS mode.
So pushing a large image results in an automatic logout.

Can login into docker-registry but not push image (github)

So I want to use the Docker registry from GitHub.
I do the following:
docker login docker.pkg.github.com --username username
docker build . --tag docker.pkg.github.com/user-name/repo/IMAGENAME:snapshot
docker push docker.pkg.github.com/user-name/repo/IMAGENAME:snapshot
Note that the repository is private and not mine, but I have write access to it.
When I go to the Packages tab I can also see the instructions on how to get started, and I follow them (kind of: I tag the Docker image in one go).
But when I run the 3 commands at the top I get the following output (the push command fails):
unauthorized: Your token has not been granted the required scopes to execute this query. The 'id' field requires one of the following scopes: ['read:packages'], but your token has only been granted the: [''] scopes. Please modify your token's scopes at: https://github.com/settings/tokens.
When I visit the site referenced, there is nothing there, only unrelated tokens.
Any ideas what I could try or what may cause this...?
You need to use an access token to log in, as shown in the docs. Logging in with a password is not possible.
https://help.github.com/en/packages/using-github-packages-with-your-projects-ecosystem/configuring-docker-for-use-with-github-packages
You must use a personal access token.
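A minimal sketch of the token-based login, reusing the commands from the question (the token is assumed to have at least the read:packages and write:packages scopes):
echo "<your_personal_access_token>" | docker login docker.pkg.github.com -u username --password-stdin   # placeholder token and username
docker push docker.pkg.github.com/user-name/repo/IMAGENAME:snapshot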

Permission issues while docker push

I'm trying to push my docker image to google container image registry but get an error which says I do not have the needed permission to perform this operation.
I have already tried gcloud auth configure-docker but it doesn't work for me.
I first build the image using:
docker build -t gcr.io/trynew/hello-world-image:v1 .
Then I'm trying to attach a tag and push it:
docker push gcr.io/trynew/hello-world-image:v1
This is my output :
The push refers to repository [gcr.io/trynew/hello-world-image]
e62774cdb1c2: Preparing
0f6265b750f3: Preparing
f82351274ce3: Preparing
31a16430afc8: Preparing
67298499a3ed: Preparing
62d5f39c8fe4: Waiting
9f8566ee5135: Waiting
unauthorized: You don't have the needed permissions to perform this
operation, and you may have invalid credentials.
To authenticate your request, follow the steps in:
https://cloud.google.com/container-registry/docs/advanced-authentication
Google Cloud has specific information on how to grant permissions for docker push; this is the first thing you should have a look at, I think: https://cloud.google.com/container-registry/docs/access-control
After checking that you have sufficient permissions, you should proceed with authentication with something like:
gcloud auth configure-docker
See more here: https://cloud.google.com/container-registry/docs/pushing-and-pulling
If you are running docker as root (i.e. with sudo docker), then make sure to configure the authentication as root. You can run for example:
sudo -s
gcloud auth login
gcloud auth configure-docker
...that will create (or update) a file under /root/.docker/config.json.
(Are there any security implications of gcloud auth login as root? Let me know in the comments.)
In order to push images to the private registry you need two things: the right API access scopes, and authenticating your VM with the registry.
For the API Access Scopes (https://cloud.google.com/container-registry/docs/using-with-google-cloud-platform) we can read in the official documentation:
For GKE:
By default, new Google Kubernetes Engine clusters are created with
read-only permissions for Storage buckets. To set the read-write
storage scope when creating a Google Kubernetes Engine cluster, use
the --scopes option.
For GCE:
By default, a Compute Engine VM has the read-only access scope
configured for storage buckets. To push private Docker images, your
instance must have read-write storage access scope configured as
described in Access scopes.
So first, verify if your GKE cluster or GCE instance actually has the proper scopes set.
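For illustration, a hedged sketch of setting the read-write storage scope at creation time (the cluster and instance names are placeholders; for existing resources the scopes have to be changed rather than set at creation):
gcloud container clusters create my-cluster --scopes=storage-rw   # placeholder cluster name
gcloud compute instances create my-vm --scopes=storage-rw         # placeholder instance name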
The next step is to authenticate to the registry:
a) If you are using a Linux based image, you need to use "gcloud auth configure-docker" (https://cloud.google.com/container-registry/docs/advanced-authentication).
b) For Container-Optimized OS (COS), the command is "docker-credential-gcr configure-docker" (https://cloud.google.com/container-optimized-os/docs/how-to/run-container-instance#accessing_private_google_container_registry)
Windows / PowerShell
I got this error on Windows when I was trying to run docker push from a normal PowerShell window after authenticating in the Google Cloud shell that had opened when I installed the SDK.
The solution was simple:
Start a new PowerShell window to run docker push after running the gcloud auth configure-docker command.
Make sure you've enabled the Container Registry API too:
gcloud services enable containerregistry.googleapis.com
Also, Google has a tendency to jump to a default account (maybe your personal Gmail), which may or may not be the one you want (your business email). Make sure that if you're opening any links in a browser, you're in the correct Google account.
I'm not exactly sure what's going on yet because I'm brand new to Docker, but something got refreshed when starting a new PowerShell instance.
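If you want to check from the terminal which account gcloud is currently using, and switch if needed (the email below is a placeholder):
gcloud auth list
gcloud config set account your.name@example.com   # placeholder account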
As noted in https://stackoverflow.com/a/59799035/26283371, there appears to be a bug in the Linux version of the Cloud SDK where authentication fails using the standard authentication method (gcloud auth configure-docker). Instead, create a JSON key file per this, and that tends to work.
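A minimal sketch of the key-file login, assuming you have downloaded a service-account JSON key (the file name below is illustrative):
docker login -u _json_key --password-stdin https://gcr.io < keyfile.json   # keyfile.json is a placeholder name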
I still can't get the gcloud auth configure-docker helper to work. What did work was authenticating with an access token, like so:
gcloud auth print-access-token | docker login -u oauth2accesstoken --password-stdin https://HOSTNAME
where HOSTNAME is gcr.io, us.gcr.io, eu.gcr.io, or asia.gcr.io. (Be sure to include https://, otherwise it won't work).
You can view options for print-access-token here.
First, make sure you covered all the points listed in the following official documentation:
https://cloud.google.com/container-registry/docs/advanced-authentication
This error mostly occurs due to a Docker config update, which you can check with cat ~/.docker/config.json
Now update it for GCR with the following command:
gcloud auth configure-docker
Just in case anyone else is banging their head against a wall: my PIA VPN caused this behavior.
"unauthorized: You don't have the needed permissions to perform this operation, and you may have invalid credentials. To authenticate your request, follow the steps in: https://cloud.google.com/container-registry/docs/advanced-authentication"
Turn my VPN off and it works fine. Turn it back on and it breaks again.
This is the only way that worked for me. I found it in a kubernetes/kompose GitHub issue.
Remove the credsStore key in ~/.docker/config.json.
This will force Docker to write the auth into the JSON file when you use docker login. You can't untick "Securely store Docker logins in macOS keychain" in Docker Desktop any more, and the current credsStore is no longer the macOS keychain, it's desktop.
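For illustration, the relevant part of ~/.docker/config.json might look something like this before and after (contents abbreviated, purely illustrative):
{ "auths": {}, "credsStore": "desktop" }   <- before
{ "auths": {} }                            <- after removing the key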
Auth with gcloud (just to be explicit):
gcloud auth login
Then log in to the registry:
gcloud auth print-access-token | docker login -u oauth2accesstoken --password-stdin https://eu.gcr.io
You should see this:
WARNING! Your password will be stored unencrypted in /Users/andrew/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store
Login Succeeded
Source: https://github.com/kubernetes/kompose/issues/1043#issuecomment-609019141
The fix is as follows: run gcloud auth login (the browser will open and allow you to authenticate), then run gcloud auth configure-docker and select Y, then redo the push. It should work like a charm.
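In command form (the image tag is the one from the question):
gcloud auth login
gcloud auth configure-docker
docker push gcr.io/trynew/hello-world-image:v1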
I also had the same issue in a Linux environment. So I just set Docker to run as a non-root user (https://docs.docker.com/engine/install/linux-postinstall/#manage-docker-as-a-non-root-user), and it works.
In my case the DOCKER_CONFIG environment variable was defined with an invalid value (not pointing to a Docker config.json).
I had the same issue, but for me the problem was with different users on my Linux system. I had authenticated my personal Linux user with gcloud, but I was pushing as root. So I had to authenticate my root user with gcloud as well:
sudo gcloud init
This issue happens to me when I switch between service accounts pointing to different GCP projects. Even though the service account has permission to push, it says it does not have the permission. To resolve this, delete the config.json file in the .docker directory.
Once this is done, run the commands below and you should be able to push the image.
gcloud auth configure-docker
gcloud auth print-access-token | docker login -u oauth2accesstoken --password-stdin https://HOSTNAME
where HOSTNAME is gcr.io, asia.gcr.io, etc.

401 Unauthorized Error while login into nexus docker registry

I am running Nexus as a Docker container, using the sonatype/nexus3:3.14.0 tag. I also connect Nexus with LDAP for better user management; it is helpful for group and role management.
In my case, I created a blob store and two Docker registry repositories, one hosted and one group. I tried logging in on the hosted one and it works fine. But when I try to connect to the group repository I get 401 Unauthorized. I also tried to connect with admin credentials but I get the same error:
Error response from daemon: login attempt to https:///v2/ failed with status: 401 Unauthorized.
Suggestions are welcome
Enable the Docker Bearer Token Realm in Nexus Security->Realms Tab.
As stated here.
In my case the Docker Bearer Token Realm was already enabled, but prioritizing this realm did the trick.
If Docker Bearer Token Realm is already enabled in the Nexus Security -> Realms tab, increase its priority.
For more info: https://help.sonatype.com/repomanager3/system-configuration/access-control/realms

Credentials do not work for "docker login"

Copy/pasting my username and password into the Docker Hub website works fine.
The password is long, but does not contain shell-breaking symbols.
Copy/pasting those same credentials into command-line docker login results in an incorrect username or password error. I have tried passing the credentials interactively (both copy/pasting and typing) and through command line args, same result:
# INTERACTIVE
$ docker login
Login with your Docker ID to push and pull images from Docker Hub. If you don't have a Docker ID, head over to https://hub.docker.com to create one.
Username: my#email.com
Password: <REDACTED>
Error response from daemon: Get https://registry-1.docker.io/v2/: unauthorized: incorrect username or password
# COMMAND LINE
$ docker login -u my#email.com -p <REDACTED>
Error response from daemon: Get https://registry-1.docker.io/v2/: unauthorized: incorrect username or password
@mustaccio was correct.
The Docker Hub website allows you to login with either your username OR your email, and the website does not require a case-correct username.
docker login DOES require a case-correct username, and DOES NOT work with your email address.
When I signed up I chose a camel-cased username e.g.:
MyUsername
Docker forces this username to all lower case in practice. When you log in, you'll see your correct username in the upper right-hand corner of the website. In this example:
myusername
The website allows you to login with MyUsername or myusername.
docker login only allows myusername.
The same issue happens if you didn't log out and you used your EMAIL address to log in.
docker logout
docker login
Do NOT put your email address; instead, enter your USERNAME.
I used a password generator that put special characters in my password; I was able to log in in my browser but not through the CLI. I changed it to just letters and numbers and it worked.
Try this:
sudo chmod 666 /var/run/docker.sock
sudo docker login
I solved the problem this way.
Go to https://hub.docker.com/settings/security and create a new access token for your account.
To log in, use the docker login command and type:
your username in lowercase, and
the token (instead of your password)
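A minimal sketch of that flow, with the token passed on stdin (placeholder values):
echo "<your_access_token>" | docker login -u myusername --password-stdin   # placeholder token and username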
If you are using Git Bash on Windows, then use the following command:
winpty docker login --username <yourusername>
It will prompt for your password. Enter it, and the message "Login Succeeded" is displayed.
You can find "yourusername" in the top right corner when logged in to the official Docker site.
It does not work with your email.
If anyone is still experiencing this after ensuring your username and password are correct, generate a token at Account Settings > Security > New Access Token. I was struggling for an hour, but using a token worked. Don't know why.
On Windows:
Right-click Docker in the system tray and sign out. Then sign in again, but use your Docker Hub username, not your email address.
For logging into hub.docker.com, what worked for me was making sure I did NOT specify 'hub.docker.com'! Different versions of API-DNS? All of this after I spent a good hour changing my password, changing my token, etc.
Just simply:
docker login --username=myusername <enter>
theToken <enter>
Adding special characters like # to your password may break docker login from the terminal, even if the case and everything else are correct, while the website will still accept a password with a special character.
So basically, use letters and numbers only in your password; don't use special characters for docker login.
docker login
Do not use your email ID. Use your username in lowercase and then give your password.
You should be successfully logged in.
Please log in with your Docker ID and password; this should work instantly.
