For the past two days, and I don't know why, I have been unable to build a Docker image on Docker Hub Registry from linked GitHub projects (it was working previously).
Many existing answers were about a GitHub submodule; that was not my case.
Here are the logs:
Building in Docker Cloud's infrastructure...
Cloning into '.'...
Warning: Permanently added the RSA host key for IP address '192.30.253.113' to the list of known hosts.
Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
please ensure the correct public key is added to the list of trusted keys for this repository (128)
I tried searching, but I only came across answers that did not apply to my case.
After many hours of searching, I think I found the answer.
I am posting it here for anyone who runs into the same problem.
It looks like Docker has renamed its app from Docker Hub Registry to Docker Hub Builder.
To repair it, I removed the application link between Docker Hub and GitHub, then re-created the link.
This added an authorized application in GitHub, which now shows up as Docker Hub Builder.
In case someone needs this: for me, clicking the re-link button for the GitHub account on Docker Hub solved the problem.
You can re-link on this page: https://hub.docker.com/settings/linked-accounts
First, try re-linking your GitHub account to Docker Hub and checking your GitHub org's 'Third party application access policy', as suggested by the previous answers.
If your Docker Hub repository is owned by an organisation, visit the organisation's Linked Accounts settings and try re-linking the provider:
If that doesn't solve the issue, visit your Docker Hub repository > Builds > Configure Automated Builds, then look at the Source repository list:
This will show you which repositories Docker Hub has access to.
In my case, some repositories were missing from the list because Docker Hub could only see repositories that the linked GitHub account had admin access to.
Conclusion: the GitHub account linked with Docker Hub needs admin access to the relevant GitHub repository.
Giving the GitHub account admin access to the relevant repository fixed my Docker Hub builds.
I am new to this. I am working on a project that is developing an application. From what I know, people normally store the image on Docker Hub, then create a Docker container from it and pass it to Kubernetes. What we are trying to do is store the repositories without using Docker Hub. Is there a way to build something like that from scratch? Which direction should I look into?
You can install an artifact management server such as JFrog Artifactory or Sonatype Nexus, create a Docker repository on it, and push your images to that artifact management repo.
Please check this link for more details.
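As a rough illustration, here is a minimal sketch of pushing an image to such a self-hosted registry; the host nexus.example.com:8082 and the image names are hypothetical and depend on how your Artifactory/Nexus Docker repository is exposed:
# log in to the self-hosted registry (hypothetical host and port)
$ docker login nexus.example.com:8082
# tag a locally built image for that registry and push it
$ docker tag myapp:1.0 nexus.example.com:8082/myteam/myapp:1.0
$ docker push nexus.example.com:8082/myteam/myapp:1.0
Kubernetes can then pull from this registry instead of Docker Hub (typically with an image pull secret for private registries).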
So I have an application called SongKong, and I wanted to build a Docker image for it. Within hub.docker.com you can link to a source repository containing a Dockerfile and build from it. I do not want to call this repo songkong because I already have a songkong repo for the actual application code, so I called the repo songkongdocker. But now my hub.docker.com repo is also called songkongdocker, and it would seem that hub.docker.com repos are usually named after the application, so ideally it should just be songkong.
So, what is the correct way to name these, and can my Bitbucket code repository have a different name than the hub.docker.com repository?
You can change the build/repo name when you are creating an Automated Build on hub.docker.com. The Bitbucket/GitHub repo name is used as the default, but you can still edit it.
I am quite new to Docker, and I am trying to find a way to tell the version of a Docker Hub tagged image.
For instance, for the jenkins/jenkins:lts-latest image listed here, https://hub.docker.com/r/jenkins/jenkins/tags/, which image version does it actually alias? And how can I infer the corresponding Dockerfile/branch in the Jenkins repo?
I tried with docker search but couldn't find it. I also tried to find a clue in the official Jenkins GitHub Dockerfile repo, https://github.com/jenkinsci/docker, but I don't see any binding tag or anything that gives me a hint about the source of the image.
Another example: I have a Kubernetes cluster, and when I check my Nexus pod, I likewise see that the image is defined as sonatype/nexus3:latest.
In this case at least I have the image ID, docker-pullable://sonatype/nexus3#sha256:434a2564aa64646464afaf.., but once again I don't know how to map it to the actual version of the software.
For the repos you asked about, the answer is no.
When setting up a repo on Docker Hub, there are two kinds of options for the user to choose from:
1) Create Repository:
In this case, Docker Hub just creates a repo for the user; the user needs to build the image on a local machine, tag it, and push it to Docker Hub (a sketch of this manual flow appears at the end of this answer).
When the user pushes the image to Docker Hub, no additional information about the source version is attached, so you cannot get any mapping back to the source from Docker Hub.
jenkins/jenkins is exactly this kind of repo.
2) Create Automated Build:
In this case, Docker Hub fetches the code from GitHub or Bitbucket and builds the image on its own cloud infrastructure, so it knows exactly which source commit corresponds to the current Docker image.
jenkins/jnlp-slave is exactly this kind of repo.
You can then click its Build Details on the web page, open one of the links, e.g. 3.26-1-alpine, and you will see in the log that 0a0239228bf2fd26d2458a91dd507e3c564bc7d0 is the source commit.
To sum up: the repos you mentioned in the question are not Automated Builds, so you cannot get the mapping between image and source code for them. But if you later come across a Docker Hub repo that is an Automated Build and want to know that mapping, then you can.
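For reference, here is a minimal sketch of the manual flow from option 1; the image and account names are only placeholders:
# build the image locally from a Dockerfile in the current directory (hypothetical names)
$ docker build -t myapp:1.0 .
# tag it for your Docker Hub account and push it; note that no source-commit information travels with it
$ docker tag myapp:1.0 mydockerhubuser/myapp:1.0
$ docker push mydockerhubuser/myapp:1.0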
As far as I understand your question, you are trying to tag the Docker image with exactly the same version as your software. For that, I usually create the image tag like this:
$ export VERSION="2.31-b19"
$ docker tag "<user>/<image>:${VERSION}" "<docker_hub_user>/<repo>:latest"
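If it helps, pushing the newly tagged image afterwards would look like this (same placeholders as above):
$ docker push "<docker_hub_user>/<repo>:latest"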
If this is not the case, please explain your use case a bit more so that we can suggest a better workaround.
I have a personal account on Docker Hub, linked to my GitHub account, where I can build an image of my repository normally.
Now I've created an organization on GitHub where I've forked my code. I've also created an organization on Docker Hub using my personal account, and created a repository in this organization. But I can't seem to figure out how to trigger a build in this repository!
I don't have access to the same menus, and I don't know what I'm missing here. Any clue? Thanks.
I think you have to create an 'Automated build', not a simple repository.
In the top left of the UI, go to 'Create' > 'Create Automated Build', then select GitHub or Bitbucket, select the source repository, and then for the 'Repository Namespace & Name' I guess you should be able to choose your organization as the namespace.
Of course, you have to delete the simple repository you created first if you want to reuse the same repository name.
I know that Docker Hub is there, but it only allows one private repository on the free plan. Can I put these images on GitHub/Bitbucket?
In general you don't want to use version control on large binary images (like videos or compiled files), as Git and the like were intended for 'source control', with the emphasis on source. Technically, there's nothing preventing you from doing so and putting the Docker image files into Git (apart from the limits of the service you're using).
One major issue you'll have is that Git/Bitbucket have no integration with Docker, as neither provides the Docker Registry API needed for a Docker host to pull down images on demand. This means you'll have to manually pull the image files out of the version control system holding them before you can use them.
If you're going to do that, why not just use S3 or something like that?
If you really want 'version control' on your images (which docker hub does not do...) you'd need to look at something like: https://about.gitlab.com/2015/02/17/gitlab-annex-solves-the-problem-of-versioning-large-binaries-with-git/
Finally, docker hub only allows one FREE private repo. You can pay for more.
So the way to go is:
Create a repository on GitHub or Bitbucket
Commit and push your Dockerfile (with config files if necessary)
Create an Automated Build on Docker Hub that uses the GitHub/Bitbucket repo as its source (a minimal sketch follows)
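As a rough sketch, assuming a trivial Dockerfile (its contents and the file names here are purely illustrative):
# a minimal Dockerfile committed to the GitHub/Bitbucket repo
$ cat Dockerfile
FROM nginx:alpine
COPY ./site /usr/share/nginx/html
# commit and push it; the Docker Hub Automated Build then builds from this repo
$ git add Dockerfile
$ git commit -m "Add Dockerfile for automated build"
$ git push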
In case you need everything private, you can self-host a Git service like GitLab or Gogs, and of course you can also self-host a Docker registry service for the images.
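For the registry part, here is a minimal sketch using Docker's own registry image (the port and names are illustrative):
# run a private registry locally
$ docker run -d -p 5000:5000 --name my-registry registry:2
# tag an image for that registry and push it
$ docker tag myimage:1.0 localhost:5000/myimage:1.0
$ docker push localhost:5000/myimage:1.0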
Yes, since Sept. 2020.
See "Introducing GitHub Container Registry" from Kayla Ngan:
Since releasing GitHub Packages last year (May 2019), hundreds of millions of packages have been downloaded from GitHub, with Docker as the second most popular ecosystem in Packages behind npm.
Available today as a public beta, GitHub Container Registry improves how we handle containers within GitHub Packages.
With the new capabilities introduced today, you can better enforce access policies, encourage usage of a standard base image, and promote innersourcing through easier sharing across the organization.
Our users have asked for anonymous access for public container images, similar to how we enable anonymous access to public repositories of source code today.
Anonymous access is available with GitHub Container Registry today, and we’ve gotten things started today by publishing a public image of our own super-linter.
GitHub Container Registry is free for public images.
With GitHub Actions, publishing to GitHub Container Registry is easy. Actions automatically suggests workflows for you based on your work, and we’ve updated the “Publish Docker Container” workflow template to make publishing straightforward.
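Outside the quote, a minimal command-line sketch of pushing an image to GitHub Container Registry; the token variable, user, and image names are placeholders:
# authenticate to ghcr.io with a personal access token stored in $CR_PAT
$ echo $CR_PAT | docker login ghcr.io -u YOUR_GITHUB_USERNAME --password-stdin
# tag the image under your GitHub user or organization and push it
$ docker tag myapp:1.0 ghcr.io/your-github-user/myapp:1.0
$ docker push ghcr.io/your-github-user/myapp:1.0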
GitHub is in the process of releasing something similar to ECR or Docker Hub. At the time of writing this, it's in Alpha phase and you can request access.
From GitHub:
"GitHub Package Registry is a software package hosting service, similar to npmjs.org, rubygems.org, or hub.docker.com, that allows you to host your packages and code in one place. You can host software packages privately or publicly and use them as dependencies in your projects."
https://help.github.com/en/articles/about-github-package-registry
I guess you are talking about Docker images. You can set up your own private registry to hold them. If you are not pushing only Dockerfiles but want to push whole images, then pushing images as a whole to GitHub is a very bad idea. Consider a 600 MB Docker image: pushing it to GitHub means putting 600 MB of data into a GitHub repo, and if you keep pushing more images there, it gets terribly bad.
Also, a Docker registry is smart enough to store only a single copy of each layer (a layer can be referenced by multiple images). If you use GitHub, you lose this benefit and end up storing multiple copies of large files, which is really, really bad.
I would definitely suggest going with a private Docker registry rather than GitHub.
If there is a real need to put a Docker image into GitHub/Bitbucket, you can try saving it to an archive (using docker save, https://docs.docker.com/engine/reference/commandline/save/) and committing/pushing that archive to your repository.
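A minimal sketch of that save/load round trip (image and file names are illustrative):
# export the image to a tar archive that can be committed to a Git repo
$ docker save -o myapp-1.0.tar myapp:1.0
# later, on another machine, load it back into the local Docker daemon
$ docker load -i myapp-1.0.tar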