How is disk memory being used up in Google Colab?

The moment I mount my Google Drive in Google Colab, most of the disk space gets used up.
I mount it by running the following cell:
# Mount Google Drive (Run this in Google Colab environment)
from google.colab import drive
drive.mount('/content/drive')
Once mounted, without saving any model, there is only 29 GB left out of the 68.4 GB provided. What would be taking up so much space? Or, how do I check what is using it?
Thanks!!

This should not happen. Mine started at 69.46 GB and remained the same.
Maybe a GPU is being allocated, which does reduce the available disk size. Mine was reduced to 29.83 GB in that case.
The GPU can be disabled by going to Edit -> Notebook settings and setting the hardware accelerator to None.
To check disk usage, run the command !du -h /path/to/your/folder
You can scan the entire virtual machine with !du -h /
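For reference, something like the following shows where the space went (a sketch assuming the standard GNU tools present on the Colab VM; the paths are examples). The mounted Drive is excluded from the scan because it is remote storage and does not consume the VM's local disk.
# Overall free/used space on the VM's filesystems
!df -h
# Largest directories, a couple of levels deep; errors from unreadable system paths are discarded
!du -h -d 2 /content --exclude=/content/drive 2>/dev/null | sort -h | tail -n 20
!du -h -d 1 / --exclude=/content/drive 2>/dev/null | sort -h | tail -n 20
Note that the smaller disk you see on a GPU runtime is the quota of the VM you were assigned, not space consumed by your files.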

Related

Sabrent SSD not showing with new hard drive

Wanted to post this because I found a couple of issues on this across the community, but not on Stack.
Issue: a Sabrent external SSD enclosure not showing in Explorer or in the disk partition tool, but showing up under Device Manager's disk drives.
Resolution: the SSD had not been initialized, so the partition tool could not access it because it was unallocated and uninitialized. Download the Sabrent tool - https://sabrent.com/collections/memory-accessories/products/ec-snve . Then initialize the disk as GPT and create an NTFS partition (GPT is needed for 2TB+ drives because MBR has a size limitation). It should then show up under Explorer and the disk management tools.
FYI: no need to update the driver.
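For anyone who prefers the command line over the Sabrent utility, the same initialize-and-format steps can be scripted with Windows' built-in diskpart. This is only a sketch: the file name, disk number and label are examples, so run list disk first and be certain you select the external SSD.
rem init_ssd.txt - run from an elevated prompt with: diskpart /s init_ssd.txt
rem WARNING: "clean" wipes the selected disk; verify the disk number with "list disk" first
select disk 2
clean
convert gpt
create partition primary
format fs=ntfs label="SabrentSSD" quick
assign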

Increasing the storage space of a docker container on Windows to 2-3TB

I'm working on a Windows computer with 5TB of available space, building an application that processes large amounts of data and uses Docker containers to create replicable environments. Most of the data processing is done in parallel by many smaller Docker containers, but the final tool/container requires all the data to come together in one place. The output area is mounted as a volume, but most of the data is just copied into the container, and this will be multiple TBs of storage. RAM luckily isn't an issue in this case.
I'm willing to try any suggestions and make whatever changes I can.
Is this possible?
I've tried increasing the disk space for Docker using .wslconfig, but this doesn't help.
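One aside that may clarify where the space pressure comes from: anything copied into a container's writable layer has to fit inside Docker's own virtual disk, whereas bind mounts stay on the host drive. A minimal sketch of mounting the data instead of copying it (the image name and host paths are placeholders, not taken from the question):
# PowerShell sketch for Docker Desktop on Windows; D:\bigdata and D:\output are example paths
docker run --rm -v D:\bigdata:/data:ro -v D:\output:/output example/final-tool --input /data --output /output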

Docker, set memory limit for a group of containers

I'm running a Docker environment with around 25 containers for my private use. My system has 32GB of RAM.
RStudio Server and JupyterLab in particular often need a lot of memory.
So I limited the memory for both containers to 26GB each.
This works well as long as both applications aren't holding data frames in memory at the same time. If RStudio Server stores a few GB and Jupyter also fills its memory up to the limit, my system crashes.
Is there any way to configure these two containers so that together they may use at most 26GB of RAM?
Or a relative limit, like Jupyter being allowed to use 90% of the free memory?
As I'm now working with large datasets, it can happen at any time (because I forget to shut down a kernel or something) that memory usage grows to the limit, and I want just the container to crash, not the whole system.
And I don't want to lower the limit for Jupyter any further, as the biggest dataset on its own needs 15GB of memory.
Any ideas?
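One way to get a shared cap (assuming a Linux host with cgroup v2 where Docker uses the systemd cgroup driver; the slice name, container names and images below are illustrative) is to put both containers under a common parent cgroup that carries the 26GB limit:
# 1) Define a systemd slice with the combined limit
sudo tee /etc/systemd/system/analytics.slice <<'EOF'
[Unit]
Description=Shared memory budget for RStudio Server and JupyterLab

[Slice]
MemoryMax=26G
EOF
sudo systemctl daemon-reload
# 2) Start both containers inside that slice; together they cannot exceed 26GB,
#    and the kernel kills processes inside the slice rather than the whole host
docker run -d --name rstudio --cgroup-parent=analytics.slice rocker/rstudio
docker run -d --name jupyter --cgroup-parent=analytics.slice jupyter/base-notebook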

Alibaba Cloud ECS disk usage always increasing?

I need help with my Alibaba Cloud Elastic Compute Service!
[screenshot of df -h output]
[screenshot of iotop -a output]
My storage was only 40GB and I had used 33GB; the next day it was fully consumed. Usage was at 40GB even though I had not uploaded any files.
I decided to increase my storage to 60GB, but 3 days later it became fully consumed again.
I badly need help.
Thank you.
Free up some space and install iotop. You can leave iotop -a running for a while and check back to find which processes are writing to your disk.
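Something along these lines usually narrows it down (the install command assumes Ubuntu/Debian; use yum or dnf on other distros):
df -h
# Biggest directories on the root filesystem, two levels deep
sudo du -xh / --max-depth=2 2>/dev/null | sort -h | tail -n 25
# Accumulated disk writes per process; -o shows only processes currently doing I/O
sudo apt-get install -y iotop
sudo iotop -a -o
Common culprits on a small ECS disk are runaway logs under /var/log and growing database or Docker data directories.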

Docker 'killing' my program

I am running data analysis code in Docker using pandas on macOS.
However, the program gets killed on a high memory allocation in a data frame (I know because it gets killed while loading a huge dataset).
Without the container, my program runs alright on my laptop.
Why is this happening and how can I change this?
Docker on macOS runs inside a Linux VM which has an explicit memory allocation. From the docs:
Memory: By default, Docker for Mac is set to use 2 GB runtime memory, allocated from the total available memory on your Mac. You can increase the RAM on the app to get faster performance by setting this number higher (for example to 3) or lower (to 1) if you want Docker for Mac to use less memory.
Those instructions are referring to the Preferences dialog.
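To see what the VM actually provides, and to cap a single container below that, something like the following works (alpine and python:3 are just example images, and my_analysis.py is a placeholder for your script):
# Total memory available to the Docker VM, as reported by the daemon (in bytes)
docker info --format '{{.MemTotal}}'
# The same limit seen from inside a throwaway container
docker run --rm alpine free -m
# Give one container an explicit cap below the VM limit, so a runaway job is
# killed on its own instead of taking the whole VM down
docker run --rm --memory=6g -v "$PWD":/work -w /work python:3 python my_analysis.py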
