Docker Desktop cannot set large disk size

I'm running Docker Desktop 2.2.0 on Windows 10. It appears that the disk size cannot be set beyond 64 GB. I tried setting the diskSizeMiB value to 100 GB in %APPDATA%\Docker\settings.json, but Docker appears to ignore it and sets the size to 64 GB in the resulting Hyper-V VM.
"cpus": 6,
"diskSizeMiB": 102400,
The issue I'm having is that older images get evicted when pulling new ones in. Even after manually expanding the Hyper-V disk to 100 GB, docker pull deletes older images to make space for new ones.
Docker for Windows docs don't seem to explicitly mention a limit, but 64 GB is ominously round: 64 GiB = 64 × 1024 MiB = 2^16 MiB, which hints at it being a technical limit.
Does anyone know of a workaround for this limitation?

Looks like I was on the right track with increasing the virtual disk size directly in Hyper-V (see this guide). The only missing piece was restarting Docker (or Windows). Once restarted, I was able to use the full disk.
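For anyone else trying this, the resize itself boils down to one elevated PowerShell command; this is a sketch, and the .vhdx path below is the usual Docker Desktop default rather than something guaranteed, so verify it on your machine:

# Quit Docker Desktop first, then run from an elevated PowerShell prompt.
Resize-VHD -Path "C:\ProgramData\DockerDesktop\vm-data\DockerDesktop.vhdx" -SizeBytes 100GB
# Restart Docker Desktop (or Windows) so the VM picks up the new size.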

Related

Mac docker no space left on device when building images?

I've seen this issue a number of times and usually use docker system prune to solve it temporarily, but I'm not understanding why it says there is no space on the device.
The main drive on my Mac currently has 170 GB free; I also have a second drive with 900 GB free. The images I'm building take up a total of 900 MB when built, so what is Docker talking about? I have plenty of storage space!
Since you specified that the platform is Mac, your Docker runtime is running inside a VM, which has its own resources allocated.
Assuming you are using Docker for Mac, you should increase the allocated disk space for the Docker VM in its preferences.
In case you don't want to increase the amount of docker engine storage as answered here, you can free some space by running:
docker image prune
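If you go the pruning route, it can also help to see what is actually using the space first; these are standard Docker commands, though output formats vary by version:

# Summarize disk usage by images, containers, local volumes, and build cache.
docker system df
# Remove all images not referenced by any container (more aggressive than the default).
docker image prune -a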

Docker taking up a lot of disk space

I am using Docker Desktop for Windows on Windows 10.
I was experiencing issues with the system SSD always being full, so I moved the 'docker-desktop-data' distro (which is used to store Docker images and other data) off the system drive to drive D:, which is an HDD, using this guide.
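The move itself was the usual WSL export/import sequence, roughly this (the drive letters and paths here are just the ones I used):

wsl --shutdown
# Export the distro to the HDD, drop the old registration, and re-import from D:.
wsl --export docker-desktop-data D:\docker-desktop-data.tar
wsl --unregister docker-desktop-data
wsl --import docker-desktop-data D:\wsl\docker-desktop-data D:\docker-desktop-data.tar --version 2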
Finally, I was happy to have a lot of space on my SSD... but Docker containers started to work slower. I guess this happens because HDD read/write operations are slower than on an SSD.
Is there a better way to solve the problem of the continuously growing size of the Docker distros without impacting how fast containers actually work and images are built?
Actually, only by design. As you know, a Docker container is layered, so it might be feasible to check whether you can create something like a "base container" from which your actual image is derived.
It might also be sensible to check whether your base distro is small enough. I have often seen containers created from full-blown Debian or Ubuntu distros; that's not the best idea. Try to derive from an Alpine version, or check for even smaller approaches, for example:
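As an illustration only (the image tag, package, and file names here are placeholders, not taken from the question):

# A few MB of base image instead of hundreds:
FROM alpine:3.12
# --no-cache keeps the apk package index out of the image layer.
RUN apk add --no-cache python3
COPY app.py /app/app.py
CMD ["python3", "/app/app.py"]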

Docker doesn't release/display space after running system prune and reporting 16 GB reclaimed, on Windows 10 Home edition

I'm really new to Docker, and my friend told me that docker system prune, run from an elevated command prompt, is supposed to clean up pretty much everything. After running it, a message about "reclaiming 16.24 GB" was displayed, but File Explorer doesn't show any change on drive C:. Restarting Docker or the host machine didn't help, and pruning volumes yields the same result. How do I make Docker release the space, or display it correctly (as I don't really know which is the case)?
I'm not super familiar with the internals of Docker for Windows, but fairly recently it worked by having a small virtual machine with a virtual disk image. The reclaimed disk space is inside that virtual disk image, but the "file" for that image will still remain the same size on your physical disk. If you want to reclaim the physical disk space, there should be a "Reset Docker" button somewhere in the Docker for Windows control panel, which will essentially delete that disk image and create a new, empty one.
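If you would rather try to compact the existing disk image than reset everything, diskpart can shrink a dynamically expanding .vhdx; this is only a sketch, and the path below (the WSL 2 backend's default location, with <you> as a placeholder for your user name) is an assumption you should verify. Quit Docker Desktop and run wsl --shutdown first, then, in an elevated command prompt:

diskpart
select vdisk file="C:\Users\<you>\AppData\Local\Docker\wsl\data\ext4.vhdx"
compact vdisk
exit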

Is Docker Maximum container size 100GB?

I am working on Hyperledger Fabric ver 1.4, and my server freezes when many transactions are being done. My understanding is that old transactions or versions which are committed do not get removed but stay in Docker, using disk space (I might be wrong on this assumption), so the only solution I have found yet is increasing my virtual server disk space to 120 GB: 100 GB for Docker and 20 GB to run my front-end development.
I have looked into Alpine images, but right now I do not want to take that route.
With the current configuration I have a 75 GB SSD and 4 GB of RAM.
Is there a way to decrease the maximum disk size being used, on Ubuntu 16.04 LTS 64-bit Minimal?
If not, is the maximum container size in Docker 100 GB?
In the GUI interface I can increase and decrease the size, as shown in the attached image.
[Image: Docker GUI disk-size setting]
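Before resizing anything, it can help to check where the space is actually going; these are standard Docker commands (output details vary by version):

# Disk usage broken down by images, containers, volumes, and build cache.
docker system df -v
# Per-container writable-layer size, shown in the SIZE column.
docker ps -s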

Docker save issue

I am on Docker version 1.11.2. I am trying to docker save an image, but I get an error.
I ran docker images to see the size of the image, and the result is this:
myimage 0.0.1-SNAPSHOT e0f04657b1e9 10 months ago 1.373 GB
The server I am on is low on space, but it still has 2.2 GB available. Yet when I run docker save myimage:0.0.1-SNAPSHOT > img.tar I get:
write /dev/stdout: no space left on device
I removed all exited containers and dangling volumes in hopes of making it work, but nothing helped.
You don't have enough space left on the device, so free some more space or gzip on the fly:
docker save myimage:0.0.1-SNAPSHOT | gzip > img.tar.gz
To restore it, Docker automatically detects that the file is gzipped:
docker load < img.tar.gz
In a situation where you can't free enough space locally, you might want to use storage available over a network connection; NFS or Samba are a little more difficult to set up.
The easiest approach could be piping the output through netcat, but keep in mind that this is unencrypted by default.
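For example (the receiving hostname and the port are placeholders, and flag syntax differs between netcat variants; some need nc -l -p 9999):

# On a machine with free space, listen and write the stream to disk:
nc -l 9999 > img.tar.gz
# On the low-space server, stream the image straight over the network:
docker save myimage:0.0.1-SNAPSHOT | gzip | nc receiver.example.com 9999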
But as long as your production server is that low on space, you are vulnerable to a bunch of other problems. Until you can provide more free space, I wouldn't create files locally, zipped or not; you could bring important services down when you run out of free space.
