Jupyter notebook memory leak?

I start a new session of Jupyter Notebook without loading anything, just a fresh session.
svmem(total=34093076480, available=22954254336, percent=32.7, used=11138822144, free=22954254336)
Using psutil.virtual_memory() I can see that even a fresh session is using 11 GB of memory.
Any guidance on how to cure this memory leak in Jupyter Notebook, please?
Thanks.
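For context, psutil.virtual_memory() reports machine-wide figures, so the numbers above cover every process on the host, not only Jupyter. A minimal sketch (assuming psutil is installed; the per-process check is an addition, not part of the original question) that reproduces the reading and also prints the footprint of the current Python process:
python3 - <<'EOF'
import os
import psutil

# Machine-wide view: 'used' counts every process on the host, not just Jupyter
print(psutil.virtual_memory())

# Resident memory of this Python process alone (run inside the notebook kernel)
rss = psutil.Process(os.getpid()).memory_info().rss
print(f"this process: {rss / 1024**2:.1f} MiB")
EOF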

Related

Memory usage of keycloak docker container

When we start the Keycloak container, it uses almost 700 MB of memory right away. I was not able to find more details on how and where it is using this much memory. I have a couple of questions:
Is there a way to find more details about which processes are taking
more memory inside the container? I was looking into the file
/sys/fs/cgroup/memory/memory.stat inside the container, which didn't give much info.
Is it normal for the Keycloak container to use this much memory, or do we need
to do any tweaking in the configuration for better performance?
I would also appreciate it if anyone has more findings that can be leveraged to improve the overall performance of the application.
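For illustration, a few standard ways to see per-process memory from outside and inside a container (generic Docker/Linux commands, not taken from the original post; replace keycloak with your container name):
# Overall usage and limit of the container as Docker sees it
docker stats --no-stream keycloak
# Processes inside the container, as seen from the host
docker top keycloak
# Per-process resident memory inside the container, sorted (requires ps in the image)
docker exec keycloak ps aux --sort=-rss | head -n 10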
Keycloak is a Java app, so you need to understand the Java/JVM memory footprint first: What is the memory footprint of the JVM and how can I minimize it?
If you want to analyze Java memory usage, then Java VisualVM is a good starting point.
700 MB of memory for Keycloak is normal. There is an initiative to move Keycloak to Quarkus (https://www.keycloak.org/2020/12/first-keycloak-x-release.adoc), which will also reduce the memory footprint; it is still in preview, not generally available.
In theory you can switch to a different runtime (e.g. GraalVM), but then you may have different issues; it isn't an officially supported setup.
IMHO: trying to optimize your Keycloak memory usage would be over-engineering; it is a Java app.
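If you do want to bound the footprint anyway, the usual lever is the JVM heap settings passed to the container, for example via JAVA_OPTS (a sketch only; the exact environment variable and image tag depend on which Keycloak image you run):
# Hypothetical example: cap the heap and metaspace of the Keycloak JVM
docker run -d --name keycloak \
  -e JAVA_OPTS="-Xms256m -Xmx512m -XX:MetaspaceSize=96m -XX:MaxMetaspaceSize=256m" \
  -p 8080:8080 \
  jboss/keycloak
Keep in mind that shrinking -Xmx too far just trades memory for GC pressure and slower responses, which is the over-engineering warning above in practice.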

Can you shut down your computer when your model is being trained on Google Cloud?

When you train a model via a Jupyter notebook on a Google Cloud instance, do you have to keep your computer on until the model finishes training? All the computation is done in the cloud, but the code and the notebook are still in your browser, so I was a bit curious.
Thanks a lot.
This is actually more about Jupyter's behaviour than about being run on a Google Cloud instance.
If the process on your VM instance is not terminated, the kernel is still active, and even though you close your browser, whatever you are running should still be running. You should therefore be able to reopen the notebook and access all variables that had already been defined; however, you cannot see any output that was printed to the notebook (if any) while you were away. If you need to close your window and want to log events, see some of the suggestions in the following post:
Keep Jupyter notebook running after closing browser tab
Also, this GitHub issue thread could be useful.
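One common pattern along those lines (an illustration, not from the linked post; train.ipynb is a placeholder for your notebook) is to start the long-running job inside a terminal multiplexer on the VM and execute the notebook headlessly, so it no longer depends on your browser session:
# 1. On the VM, start a detachable session:
tmux new -s training
# 2. Inside that session, execute the notebook headlessly:
jupyter nbconvert --to notebook --execute train.ipynb --output train-run.ipynb
# 3. Detach with Ctrl-b d; the job keeps running, so your laptop can be switched off.
# 4. Later, reattach to check on it:
tmux attach -t training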

Docker 'killing' my program

I am running data analysis code in Docker using pandas on macOS.
However, the program gets killed on a high memory allocation in a data frame (I know because it gets killed while my program is loading a huge dataset).
Without the container, my program runs fine on my laptop.
Why is this happening, and how can I change it?
Docker on macOS runs inside a Linux VM, which has an explicit memory allocation. From the docs:
MEMORY
By default, Docker for Mac is set to use 2 GB runtime memory, allocated from the total available memory on your Mac. You can increase the RAM on the app to get faster performance by setting this number higher (for example to 3) or lower (to 1) if you want Docker for Mac to use less memory.
Those instructions are referring to the Preferences dialog.
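You can check how much memory the VM actually has before raising that slider (a quick sanity check, not from the quoted docs):
# Memory available to the Docker Desktop VM (and hence to all containers), in bytes
docker info --format '{{.MemTotal}}'
# or, more readable:
docker info | grep -i "total memory"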

Digital Ocean server memory usage above 50%

I am deploying a Flask-based website on a DigitalOcean server. The deployed website is mainly static pages, config files, and JSON files.
This morning I found the memory usage has exceeded 51%. Here is the snapshot.
My droplet has 512 MB of memory. Would someone please instruct me how to lower the memory usage? Thanks so much!
Update: I've used the "top" command in the shell as suggested. Here is the snapshot; does it mean that the server itself has eaten up that memory?
The memory issue is not related to my application.
I just received the answer from Digital Ocean. Here it is:
Hi there!
Thank you for contacting us! We can help with any memory issues you're having!
Since the Droplet is set up with only 512MB of RAM, once the system and any installed services start, it doesn't take much to push it past 50%. As a result, I don't think what you're seeing is necessarily abnormal under the circumstances. This leaves a few options: the Droplet can be resized and made larger to provide more memory (see https://www.digitalocean.com/community/tutorials/how-to-resize-your-droplets-on-digitalocean), you can add swap space to use part of the Droplet's file system as RAM (see https://www.digitalocean.com/community/tutorials/how-to-add-swap-on-ubuntu-14-04), or you can review the applications and services running on the Droplet and attempt to optimize them to reduce memory use.
We hope this is helpful! Please let us know if there is anything else we can do!
Regards,
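For reference, the swap option mentioned above generally boils down to something like this (a sketch with placeholder sizes and paths; see the linked tutorial for the full steps):
# Create and enable a 1 GB swap file
sudo fallocate -l 1G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
# Make it persistent across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab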
I am assuming you are running a Linux server. If so, you can use the top command. It shows you all of the running processes and the system resources they are using. You would then be able to optimize from there.
I found out the cause! Linux borrows unused memory for disk caching. This makes it look like you are low on memory, but you are not! Everything is fine! If your application, or any other process, needs more memory, Linux will automatically clear the cache and give the memory back to your application. Linux does this to speed up the system for you.
If, however, you find yourself needing to clear some RAM quickly to work around another issue, like a VM misbehaving, you can force Linux to nondestructively drop caches using:
echo 3 | sudo tee /proc/sys/vm/drop_caches
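To see how much of the reported usage is really reclaimable cache, the free command spells it out (a generic check, not part of the original answer):
# Compare the "buff/cache" and "available" columns: "available" is what
# processes can actually get once Linux drops its caches
free -h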

Will creating a full memory dump in Process Explorer slow down a Windows service whose memory I'm trying to dump?

Or the same memory dump created in Windows Task Manager. I am also curious how exactly these programs create memory dumps. Thanks in advance!
