Octopus Deploy can't use Artifactory Docker registry

I'm trying to register a Docker registry as a feed in Octopus Deploy. The Docker repository is hosted in Artifactory.
Octopus Deploy returns this error:
feed endpoint foobar does not appear to expose a valid docker api

One solution is to provide the exact API endpoint that Docker expects, instead of letting the reverse proxy in front of Artifactory try to rewrite the URL.
Instead of using this URL:
https://artifactory.example.com:5001/artifactory/foobar
use this URL:
https://artifactory.example.com:5001/artifactory/api/foobar/v2
Octopus can now use it as a Docker registry.
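A small sketch of the URL shape that worked, as a helper you could adapt. The host, port, and repo key ("foobar") are simply the values from the example above; adjust them for your own instance.

```python
# Sketch: build the explicit Docker v2 API endpoint for an Artifactory
# repository, matching the URL form that worked in the answer above.
# Host, port, and repo key are assumptions taken from the example.

def artifactory_docker_feed_url(host: str, repo_key: str, port: int = 5001) -> str:
    """Return the explicit v2 API endpoint to register as an Octopus feed."""
    return f"https://{host}:{port}/artifactory/api/{repo_key}/v2"

print(artifactory_docker_feed_url("artifactory.example.com", "foobar"))
# https://artifactory.example.com:5001/artifactory/api/foobar/v2
```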

Related

Use cache docker image for gitlab-ci

I was wondering, is it possible to use cached Docker images from the GitLab registry in GitLab CI?
For example, I want to use the node:16.3.0-alpine Docker image. Can I cache it in my GitLab registry and pull it from there to speed up my GitLab CI, instead of pulling it from Docker Hub?
Yes, GitLab's dependency proxy feature allows you to configure GitLab as a "pull-through cache". This is also beneficial for working around rate limits of upstream sources like Docker Hub.
It should be faster in most cases to use the dependency proxy, but not necessarily: Docker Hub may be more performant than a small self-hosted server, for example. GitLab runners are also remote with respect to the registry and not necessarily any "closer" to the GitLab registry than to any other registry over the internet. So keep that in mind.
As a side note, the absolute fastest way to use cached images is to self-host your GitLab runners and keep the images directly on the host. That way, when a job starts, if the image already exists on the host, the job starts immediately because it does not need to pull the image (depending on your pull policy). (That is, assuming you're using images via the image: declaration for your job.)
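If the dependency proxy is enabled for your group, a minimal .gitlab-ci.yml fragment using it might look like this. The predefined CI_DEPENDENCY_PROXY_GROUP_IMAGE_PREFIX variable points at your group's proxy; the job name and script line are placeholders.

```yaml
# Sketch: pull node through the group's dependency proxy
# instead of directly from Docker Hub.
build:
  image: ${CI_DEPENDENCY_PROXY_GROUP_IMAGE_PREFIX}/node:16.3.0-alpine
  script:
    - node --version
```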
I'm using a corporate Gitlab instance where for some reason the Dependency Proxy feature has been disabled. The other option you have is to create a new Docker image on your local machine, then push it into the Container Registry of your personal Gitlab project.
# First create a one-line Dockerfile containing "FROM node:16.3.0-alpine"
docker pull node:16.3.0-alpine
docker build . -t registry.example.com/group/project/image
docker login registry.example.com -u <username> -p <token>
docker push registry.example.com/group/project/image
where the image tag should be constructed based on the example given on your project's private Container Registry page.
Now in your CI job, change image: node:16.3.0-alpine to image: registry.example.com/group/project/image. You may have to run the docker login command (using a deploy token for credentials; see Settings -> Repository) in the before_script section. Newer versions of GitLab may have the runner authenticate to the private Container Registry using system credentials, but that can vary depending on how it's configured.

URL of docker hub image for usage in azure service fabric

I have created a Docker Hub repo and pushed a Docker image of a Python application to it.
However, I cannot find the correct URL of the image that I have to provide to other services that will use this image, e.g. Azure Service Fabric or Kubernetes.
How can I find the exact URL, through PowerShell or through the browser?
You don't usually download images by URL. Instead, you use the Docker CLI with the repository and image name.
If it's a private repo, log in first using docker login.
Use docker pull {reponame/imagename:tag} to download an image to your machine.
Replace {reponame} with the repository name.
Replace {imagename} with the name you used with docker push.
Replace {tag} with the tag you put on the image (or latest).
For example, I use this line to get my docker hub image:
docker pull loekd/nanoserver:2.0
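As an illustration of how such a reference breaks down, here is a minimal sketch that splits a simple reponame/imagename:tag reference into its parts. It deliberately ignores registry hostnames and digests, and the function name is my own.

```python
# Sketch: split a Docker Hub image reference of the form
# reponame/imagename:tag into its parts (tag defaults to "latest").
# Does not handle registry hostnames, ports, or @sha256 digests.

def parse_image_ref(ref: str):
    name, _, tag = ref.partition(":")
    repo, _, image = name.partition("/")
    return repo, image, tag or "latest"

print(parse_image_ref("loekd/nanoserver:2.0"))
# ('loekd', 'nanoserver', '2.0')
```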

Openshift and Artifactory integration error

I am unable to pull Docker images from Artifactory on OpenShift Origin, but I am able to pull the same images from the server using docker pull.
Error :
Internal error occurred: Get https://artifactory.mycompany.net/v2/: Bad Gateway?
Note: I have enabled the proxy and added my org servers to no_proxy in the OpenShift master config file.
I have also added the Artifactory Docker repo as an insecure registry in the Docker configuration.
OpenShift doesn't accept wildcard entries in no_proxy, so providing the full hostname in no_proxy fixed it.
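As a sketch, the difference looks like this (hostnames are hypothetical, based on the question):

```
# Works: full hostnames listed explicitly in no_proxy
no_proxy=artifactory.mycompany.net,registry.mycompany.net

# Not honored here: wildcard entry
# no_proxy=*.mycompany.net
```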

Artifactory as docker Registry - docker-remote-cache stays empty

I finally managed to get Artifactory 5.1 running as a Docker registry with nginx in front as a reverse proxy, using the subdomain method with a wildcard SSL certificate.
I have the predefined set of Docker repositories configured:
docker-local - local repo
docker-remote - remote repo
docker - virtual repo
I'm able to log in with the Docker CLI, and I can also push and pull images to and from docker, as mentioned in the JFrog docs.
I think my "docker-remote" doesn't work - it stays at 0 bytes with 0 artifacts in it.
If I pull something that isn't in my local repo, I would have expected it to be pulled from docker.io and cached in docker-remote, but it seems it's simply pulled from docker.io - that's it.
Do I have to configure something? Did I miss something, or do I have to configure replication?
Any suggestions?
To configure your Docker CLI to use Artifactory as its registry, follow the instructions here. Make sure to perform the steps listed under "Configuring Your Docker Client".
There are a couple of things you can do to check whether your Docker CLI is using Artifactory as its registry:
Use the docker info command to see which registry is configured.
Look at the Artifactory request and access logs for requests from the Docker CLI.
Images fetched from docker.io should then be present in the remote repository.
Make sure the images you are pulling are not already stored in the local Docker cache.
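As an illustration: a pull only populates docker-remote when it goes through Artifactory, not when it goes straight to Docker Hub. The registry hostname below is hypothetical, following the subdomain method.

```
# Pulled straight from Docker Hub: bypasses Artifactory, docker-remote stays empty
docker pull alpine:latest

# Pulled through the Artifactory virtual repo "docker": the remote repo caches it
docker pull docker.artifactory.example.com/library/alpine:latest
```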

docker-cloud repository query doesn't provide a response

I'm trying to query docker-cloud repository using this command
docker-cloud repository ls
I get this response.
NAME IN_USE
My repos are located here and have active images
https://hub.docker.com/r/fellfromhell/workshop-python/
What am I doing wrong?
There are 2 different systems: Docker Hub and Docker Cloud.
Docker Hub is a cloud-based registry. Docker Cloud is a service which helps to deploy and manage Dockerized applications. You can link Docker Cloud to repositories hosted on a third party registry. The Docker Hub registry is linked automatically and all your Docker Hub repositories are listed in https://cloud.docker.com/_/repository/list, but, unfortunately, the command
docker-cloud repository ls
returns only repositories that are on third-party registries (see
https://docs.docker.com/apidocs/docker-cloud/#external-repository).
Despite this, you can still inspect your image with
docker-cloud repository inspect fellfromhell/workshop-python
and get some data in response. Here is the GitHub issue: https://github.com/docker/dockercloud-cli/issues/23
