Jupyter lab memory problem "Error code: SBOX_FATAL_MEMORY_EXCEEDED"

JupyterLab keeps crashing with the error "Error code: SBOX_FATAL_MEMORY_EXCEEDED". `top -u` shows an unusually high RAM usage of ~4.7 GB, and Chrome's task manager shows similarly intense memory usage. I'm not running any memory-consuming calculation. The notebook is served from a Linux server. The error doesn't seem to be notebook-specific; I tried with an empty notebook. It is also not browser-specific; I tried Edge and Firefox.

Error message in JupyterLab:
"Error code: SBOX_FATAL_MEMORY_EXCEEDED"
My action:
Opening a big CSV file with the jupyter-tabnine extension installed.
After a few seconds of loading, the warning appears at the top.
The bad part:
JupyterLab remembers the opened windows (the big CSV), so restarting JupyterLab reproduces the error.
Solution (pseudocode):
while CSV_page:
    restart JupyterLab
    click the `x` on the CSV tab
    if CSV_page.is_shutdown():
        CSV_page = False

The problem turned out to be linked to the JupyterLab extension jupyter-tabnine. Everything works fine after uninstalling it.
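Assuming the extension was installed through the standard JupyterLab tooling, removing it might look like the sketch below (the uninstall command is the usual JupyterLab CLI; the exact extension name should be confirmed from the list output first):

```shell
# Confirm the extension name as JupyterLab knows it
jupyter labextension list

# Remove the suspect extension, then restart JupyterLab
jupyter labextension uninstall jupyter-tabnine
```

If the extension was installed via pip, `pip uninstall jupyter-tabnine` may also be needed so it does not reappear.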

Related

Problem understanding how to, if at all possible, run my docker file (.tar)

I received a .tar Docker file from a friend who told me it should contain all the dependencies for a program I've been struggling to get working, and that all I need to do is "run" the Docker file. The file is a .tar archive of around 3.1 GB. The program it was set up to run is called OpenSimRT. The GitHub link for the project is:
https://github.com/mitkof6/OpenSimRT
The google drive link to the Docker file is as follows:
https://drive.google.com/file/d/1M-5RnnBKGzaoSB4MCktzsceU4tWCCr3j/view?usp=sharing
This program has many dependencies; notable ones are Ubuntu 18.04 and OpenSim 4.1.
I'm not a computer scientist by any means, so I've been struggling to even learn to do docker basics like load and run a image. However, I desperately need this program to work. If you have any steps or advice on how to run this .tar I'd greatly appreciate it. Alternatively if you are able to find a way to get opensimrt up and running and can post those steps I'd be more than happy with that solution as well.
I've tried the commands "docker run" and "docker load" followed by their respective tags, file paths, args..etc. However, even when I fix various issues I always get stuck with a missing var/lib/docker/tmp/docker-import-....(random numbers) file. The numbers change every so often when trying to solve the issue, but eventually I always end up getting some variation of this error: Error response from daemon: open /var/lib/docker/tmp/docker-import-3640220538/bin/json: no such file or directory.
ps: I have extracted the .tar already and there is no install guide/instruction, .exe, install application. As a result I'm not sure how to get the program installed and running.
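The `docker-import` path in the error suggests the archive is being read by `docker import` (which expects a plain filesystem tarball) rather than `docker load` (which expects an image produced by `docker save`). A sketch of the usual workflow, assuming the .tar came from `docker save` (the archive filename here is an assumption) — and note that the original, unextracted archive should be used, not a re-tarred copy:

```shell
# Load the saved image into the local Docker daemon
docker load -i opensimrt.tar    # filename is an assumption

# See which repository:tag was just loaded
docker images

# Start an interactive container from it; replace <repo>:<tag> with
# the name shown by `docker images`
docker run -it <repo>:<tag> /bin/bash
```

If `docker load` reports the archive is not a valid image tarball, it may instead be a filesystem export, in which case `docker import opensimrt.tar opensimrt:imported` is the matching command.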

How to turn on headed mode in Playwright when headless=False doesn't work?

Problem Context
I'm trying to launch Playwright in headed mode in Python.
Despite setting headless=False, I still can't launch a browser.
Existing Code
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=False, slow_mo=100)
    page = browser.new_page()
    page.goto("https://www.google.com/")
Output
[Done] exited with code=0 in 0.686 seconds
Debugging Methods I've tried already
1) Setting "$env:HEADED=1" in the terminal
2) Setting "$env:PWDEBUG=1" in the terminal
3) Using firefox and webkit instead of chromium
4) Using a PyCharm setup instead of Microsoft Visual Studio Code
I'd appreciate any thoughts on alternative solutions.
Thanks in advance.
Try adding --headed to the test command line: npx playwright test --headed
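Note that `--headed` applies to the Playwright test runner (the Node.js `npx playwright test` workflow); for a plain Python script like the one above, `headless=False` is the right switch. An exit after ~0.7 seconds with no window usually means the browser binaries were never downloaded, since the Python package does not ship them. A sketch of the standard fix:

```shell
pip install playwright
# Download the browser binaries Playwright drives
playwright install chromium
```

After that, the script should open a visible Chromium window; adding `page.pause()` or an `input()` call before the `with` block ends keeps the window open long enough to see it.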

Unable to launch new dask SLURM cluster from jupyter lab - "No module named 'dask_jobqueue'"

I am trying to launch a new SLURM cluster using the Dask extension in JupyterLab. I am encountering the following pop-up when I click on the '+New' button:
Cluster Start Error
No module named 'dask_jobqueue'
This is despite having a labextension.yaml file in ~/.config/dask that lists 'dask_jobqueue' as the module. This screenshot shows the issue, with my config file in the background:
Am I misunderstanding something?
Yes, the module has to be installed with:
conda install -c conda-forge dask-jobqueue
The image you shared showed a reference to the module in your config, but the module itself was not installed.
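For reference, a minimal ~/.config/dask/labextension.yaml pointing the dask-labextension at a SLURM cluster might look like the sketch below (the kwargs values are assumptions to adapt to your scheduler). The config only names the factory; it takes effect once dask-jobqueue is actually importable in the JupyterLab environment:

```yaml
labextension:
  factory:
    module: "dask_jobqueue"
    class: "SLURMCluster"
    args: []
    kwargs:
      cores: 4          # assumption: cores per SLURM job
      memory: "8GB"     # assumption: memory per SLURM job
      queue: "normal"   # assumption: your SLURM partition name
```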

Error Starting Kernel: '_xsrf' argument missing from POST

I donĀ“t know if this error was caused maybe because I deleted one environment from my anaconda navigator, but every time I try starting jupyter lab (from various environments in anaconda or from the default python installed on my computer) I get this error with every single notebook:
Error Starting Kernel: '_xsrf' argument missing from POST
I have tried setting the following configuration to True:
c.NotebookApp.disable_check_xsrf = True
after generating the jupyter_notebook_config.py file from PyCharm. I also deleted the .jupyter folder, but nothing I have done works.
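One thing to check: on recent Jupyter releases, NotebookApp-era options are ignored and the equivalent setting lives on ServerApp. A sketch of the server config (assuming a config file generated with `jupyter server --generate-config`; note that disabling XSRF checks weakens CSRF protection, so clearing the browser's cookies for localhost is worth trying first):

```python
# ~/.jupyter/jupyter_server_config.py
c = get_config()  # provided by Jupyter's config loader

# Jupyter Server equivalent of c.NotebookApp.disable_check_xsrf
c.ServerApp.disable_check_xsrf = True
```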
I have the same problem. I reinstalled several times with different options, but no success. Jupyter Notebook works just fine.

Spyder 3.1.2 hangs at Python Console

When I ask Spyder to run my program from the Python Console, it gives different responses: sometimes it doesn't run the program at all, other times it hangs. When it hangs, I have to manually interrupt it with Ctrl+Enter.
Have I loaded Spyder incorrectly? Do I need to configure my interface differently? Any suggestions?
(Spyder developer here) Because of this and several other bugs, the Python console is going to be removed in Spyder 3.2.
Please use the IPython console instead.
