I see we can install TensorFlow (with GPU) using Docker - see here: TensorFlow - which Docker image to use?
But how do I do this on a machine that has no external internet connection?
Is there a way to first download the TensorFlow image
b.gcr.io/tensorflow/tensorflow:latest-gpu: TensorFlow GPU binary image
and then copy it to local file space and "install" it from there?
You can pull the image on a computer that has access to the internet.
sudo docker pull b.gcr.io/tensorflow/tensorflow:latest-gpu
Then you can save this image to a file
sudo docker save -o tensorflow_image.docker b.gcr.io/tensorflow/tensorflow:latest-gpu
Transfer the file to the offline computer (USB/CD/whatever) and load the image from the file:
sudo docker load < tensorflow_image.docker
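Once the image is loaded you can verify it is present and start a container from it; a minimal sketch, assuming the image name from the pull above (the GPU image may additionally need the NVIDIA Docker runtime on the offline machine):
sudo docker images
sudo docker run -it b.gcr.io/tensorflow/tensorflow:latest-gpu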
Courtesy: https://serverfault.com/questions/701248/downloading-docker-image-for-transfer-to-non-internet-connected-machine
I am trying to create a container for the Jupyter datascience notebook on "Play with Docker" using the following links:
https://labs.play-with-docker.com/
https://hub.docker.com/r/jupyter/datascience-notebook/
docker run -it -p 8888:8888 jupyter/datascience-notebook
"Play with docker" has its own limitation as its a free source to play around and test docker in a browser.
By default it downloads an image at "/var/lib/docker" and there's only 4.7 GB of space available in /var/ location. With this as default, I am unable to download the image as it throws error of disk quota exceeded
Although "/etc" seems to have 64GB.
Tried changing the graph directory but unable to do so as this platform doesn't allow to restart the docker service or daemon.
Is there any way to change the download location on "Play with Docker" so it allows downloading bigger images?
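For reference, on a regular host where the daemon can be restarted, moving the image storage location is done by changing the daemon's data-root (the newer name for the graph directory); a minimal sketch, assuming a systemd-based host and a made-up target directory /mnt/bigdisk/docker (this does not help on Play with Docker, precisely because the daemon there cannot be restarted):
# write a minimal daemon.json pointing at the larger disk (overwrites any existing config)
echo '{ "data-root": "/mnt/bigdisk/docker" }' | sudo tee /etc/docker/daemon.json
# restart the daemon so the new location takes effect
sudo systemctl restart docker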
I want to download some images for a computer that has no internet access.
My computer that does have internet has NO Docker installed (old kernel), so it is not possible to use the docker command to pull an image, save it, and export it to the other machine.
I'm looking for a way to download a Docker image (e.g. via wget, ...) and use it on my computer without internet.
Yes, that's possible. Docker has the save and load features.
Run this command on your machine with the image you want to copy to the other computer:
docker save myimage > myimage.tar
To load the image again run:
docker load < myimage.tar
If you don't have access to a machine supporting Docker in any way, what you can do is create a repository on quay.io with a Dockerfile like
FROM myimage
...
Quay actually allows downloading images from its web panel, whereas Docker Hub/Store does not, AFAIK.
I have a very slow Internet connection at the office most of the time. I do have access to a cloud-based Ubuntu machine where the Internet connection is very fast. I would like to use this machine to pull several Docker images and then download the 'images' folder to my local computer via FTP or other means. If I copy all the contents of the images folder, will my local computer be able to use these images as if it had downloaded them from Docker Hub itself?
Thanks for your help in advance.
Q: If I copy all the contents of the images folder, will my local computer be able to use these images as if it had downloaded them from Docker Hub itself?
A: Instead of copying the image files directly, I would recommend using the appropriate tools to export your images. Consider using docker save to export your images into flat files (tar format).
See:
https://docs.docker.com/engine/reference/commandline/save/#save
Essentially you will be doing something like
docker save --output busybox.tar busybox
Then load it back into your work machine using docker load --input busybox.tar
See:
https://docs.docker.com/engine/reference/commandline/load/
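Putting this together for the cloud scenario in the question, the round trip could look roughly like this; a sketch, assuming SSH access to the cloud machine (the user@cloud-host address and the busybox image are placeholders):
# on the fast cloud machine
docker pull busybox
docker save --output busybox.tar busybox
# on the local computer: copy the archive down, then load it
scp user@cloud-host:busybox.tar .
docker load --input busybox.tar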
I have boot2docker running on OS X 10.10.
I used Docker to install conceptnet5, a 50 GB database that takes days to download from my location.
Now, somebody has requested from me an Ubuntu VM with conceptnet5 running on it in a Docker container.
So, to avoid downloading everything again, I wondered if there is a way to transfer the conceptnet5 container from boot2docker to my newly created Ubuntu VM.
Here is the docker container I'm using.
You could also work with the save and load commands.
The save command produces a tarred repository of the image. It will contain all parent layers and all tags.
$ docker save myimage -o myimage.tar
# Or even better, gzip it using unix pipes
$ docker save myimage | gzip > myimage.tar.gz
Now you have a tarball with all the layers and metadata that you can pass around offline, with USB keys and whatnot.
To load it back, use the load command. The load command works with the following compression algorithms: gzip, bzip2 and xz.
$ docker load -i myimage.tar.gz
# or with pipes
$ docker load < myimage.tar.gz
It's a little bit easier than running a private registry, but both work well.
You can set up a private Docker registry and then push the image there. Hopefully this private registry is on your local network, so you should get much higher throughput. Then you can pull the pushed image down in your new Ubuntu VM.
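A rough sketch of that flow, assuming the stock registry:2 image and a made-up hostname local-registry that both machines can reach (a plain-HTTP registry must also be listed under insecure-registries in each daemon's configuration, or be given TLS certificates):
# on a machine in the local network, run the registry and push the image to it
docker run -d -p 5000:5000 --name registry registry:2
docker tag conceptnet5 local-registry:5000/conceptnet5
docker push local-registry:5000/conceptnet5
# on the new Ubuntu VM
docker pull local-registry:5000/conceptnet5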
Say I want to download and build a Docker image from the Internet (such as https://github.com/zettio/weave/blob/master/weaver/Dockerfile) and then take it to a computer not connected to the Internet, so that computer can run a container using it.
What files/directories would I need to archive to do that?
Save it, copy the tar file to a USB drive, go to the other computer, and load it.
docker save -o image.tar image
then on other computer:
docker load -i image.tar
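End to end for the weave example in the question, the flow might look like this; a sketch, assuming the image is built from the weaver/ directory of the linked repository (the weave tag name is made up):
# on the connected computer: build the image from the Dockerfile, then export it
git clone https://github.com/zettio/weave.git
docker build -t weave weave/weaver
docker save -o weave.tar weave
# on the offline computer: load the image and run a container from it
docker load -i weave.tar
docker run weave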