Running a docker container via Jupyter Notebook - docker

I've searched for a number of hours to find the answer to this because I want to respect the idea of non-redundant posts. I'm sure the fact that I'm relatively new to programming doesn't help, but here's my issue (hopefully I present it in the correct way):
I am taking a Computational Finance class that requires the package 'qstk'. There is a docker image set up with everything needed for the course, 'ruippeixotog/qstk'. I followed the instructions below to access this image via the Jupyter notebook and was successful, but ever since that first use I can no longer access the notebook.
I have the current version of Docker installed and follow the same instructions to access the notebook via a web browser (I've tried Safari, Chrome, and Firefox), but I always get an error. The instructions are:
1. Install Docker for whatever OS you have (it officially supports Windows, Mac, and Linux) by following the instructions on the Docker website.
2. Start it, still following the instructions on the website.
3. Execute on the command line: docker run -dt -p 8888:8888 --name qstk ruippeixotog/qstk
4. Open your browser at http://localhost:8888 and you have a Python interpreter with QSTK fully configured.
The source repository is at: https://github.com/ruippeixotog/docker-qstk
My process:
I run docker with: docker run -dt -p 8888:8888 --name qstk ruippeixotog/qstk
I go to http://localhost:8888
And I get the response from the browser: "Failed to open page...server unexpectedly dropped the connection"
If I instead run:
docker run -dt -p 8888:8888 --name qstk ruippeixotog/qstk
jupyter notebook
It brings me to my tree of file folders (localhost:8888/tree), where the two files
"QSTK-0.2.5.tar.gz" and
"QSTK-0.2.5.tar.gz.cpgz"
are located, along with a bunch of my other folders. When I try to edit the first of those two files, it opens a new tab and tells me "SSL is required." within the Jupyter editor page for that file, showing nothing else.
When I try to edit the second, it shows only "Error! /Users/.../QSTK-0.2.8"
Can anyone help with this?
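(A hedged diagnostic sketch, not part of the original question: because the container was created with --name qstk, running the same docker run command a second time fails with a name conflict, and http://localhost:8888 only answers while that container is actually running. Checking the container's state and logs usually narrows this down.)
docker ps -a --filter name=qstk    # is the qstk container running, or exited?
docker logs qstk                   # any error from the notebook server inside it?
docker start qstk                  # restart the existing container instead of docker run
# or throw it away and create a fresh one:
docker rm -f qstk
docker run -dt -p 8888:8888 --name qstk ruippeixotog/qstk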

Related

Access Link Generated Inside a Docker Container

I started a docker container with the following command:
docker run -i -p 8989:8989 -p 8888:8888 -v /Users/sroche/FISHDATA/tiff_stacks:/home/neuroglancer_user/test_data -t openmind/neuroglancer:1.0 jupyter notebook --ip='0.0.0.0'
This creates a container running jupyter notebook and neuroglancer. These specifics are not too important, but may be helpful.
I have a notebook inside the container that generates a link, and this link is supposed to let me use my browser to view images at the address in the link. However, I am not able to connect to the link; the browser just returns an error response. (I am simply putting the link into a Chrome browser.)
The link I am trying to access looks like:
http://127.0.0.1:42071/v/.../
How can I connect to a link that is generated within a Docker container?
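(A hedged sketch, not from the original post: the link points at a port, 42071 here, that is chosen inside the container and was never published with -p, so the host browser cannot reach it. One workaround is to publish a range of ports up front, on the assumption that the tool keeps picking ports inside that range; on a Linux host, --network host is a simpler alternative, though that option does not behave the same way on Docker for Mac.)
docker run -i -p 8989:8989 -p 8888:8888 -p 41000-43000:41000-43000 \
  -v /Users/sroche/FISHDATA/tiff_stacks:/home/neuroglancer_user/test_data \
  -t openmind/neuroglancer:1.0 jupyter notebook --ip='0.0.0.0'
# then open http://127.0.0.1:<port>/v/.../ as before, using the port from the generated link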

run GUI app in docker container from XRDP client

I have two machines:
machine A: Ubuntu 20.04 (DISPLAY :0)
machine B: Windows 10
On machine A, I create a docker container with the following command:
docker run --rm -it -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY -v $(pwd):/home/walid/notebooks opencv bash
The docker container has Visual Studio Code and OpenCV installed. I run the following command to get a bash shell in the container:
docker exec -it $containerId bash
Inside the container, I go into the notebooks folder and run the following commands:
code .           # to launch Visual Studio Code
Result: Visual Studio Code opens.
python read.py   # to open a video capture with OpenCV
Result: good, the video capture opens.
Now, on machine B, I use Remote Desktop Connection to connect to machine A. When I go inside the container and try to open Visual Studio Code, nothing happens; it doesn't open. When I try python read.py, I get an error: could not connect to display :10.0.
I figured out where I was going wrong:
(base) adminsst@admins:~$ xhost
access control enabled, only authorized clients can connect
SI:localuser:adminsst
(base) adminsst@admins:~$ sudo su
root@admins:/home/adminsst# xhost
No protocol specified
xhost: unable to open display ":10.0"
When I ran the container from within sudo su it didn't work, but when I ran sudo docker run ... directly it worked.
But now, when I log in as another user on the server (machine A) and do the same thing, it doesn't work anymore and I get the same error: could not connect to display :11.0.
SOLUTION 1
Here's the solution that works for me:
First, if you run the following command as the new user, you will get:
walid@admins:~$ xhost
access control enabled, only authorized clients can connect
Now run this one to get this result:
walid@admins:~$ xhost +local:
non-network local connections being added to access control list
To launch the GUI app you need to pass the new DISPLAY to the container, because it changes every time you log in. To deal with this, just run the following command:
walid@admins:~$ sudo docker exec -it -e DISPLAY=$DISPLAY open bash
That's all; now you will be able to launch the GUI app without any problem. But you still have to execute xhost +local: every time you log in, before starting the container; if someone has a solution for that, please let me know.
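(The whole SOLUTION 1 flow condensed into one copy-paste sketch; 'open' is the container name used above, and the xhost grant has to be repeated for every new login session, as noted:)
xhost +local:        # allow local, non-network X clients for this session
echo $DISPLAY        # note the display this login session was given
sudo docker exec -it -e DISPLAY=$DISPLAY open bash
# inside the container:
code .               # or: python read.py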
SOLUTION 2
See the solution linked here; in short:
xhost +local:`docker inspect --format='{{ .Config.Hostname }}' $containerId`
docker start $containerId
There are more useful solutions there.
--------
Another thing I have faced: even after the above steps, it sometimes doesn't work. In that case I simply change the display ID in the docker container to match the local machine's, as follows:
First, figure out the local machine's display ID:
echo $DISPLAY
Then, in the docker container, set the same value:
export DISPLAY=:ID   # replace :ID with the value reported by the host
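(A concrete example of the step above, with :10.0 standing in for whatever the host actually reports, and 'open' being the container name from SOLUTION 1:)
# on the host (machine A)
echo $DISPLAY                               # e.g. prints :10.0
# inside the container (sudo docker exec -it open bash)
export DISPLAY=:10.0
python read.py                              # should now find the display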

Unable to find Jupyter notebook via Docker

I am very new to docker and Jupyter notebook. I pulled the image from Docker, and it was able to direct me to the relevant Jupyter notebook. The problem is that whatever plots I make in the notebook, I am not able to find the files on my system. A file with the name settings.cmnd should be created on my system. I am using Windows 10 Home. I am using the following command:
docker run -it -v "//c/Users/AB/project":"//c/program files/Docker Toolbox" -p 8888:8888/tcp CONTAINER NAME
It is running fine, as I am able to access the Jupyter notebook, but the file is still missing on my system.
Here, the folder in which I want to save the file is project.
Kindly help.
I did not find an image called electronioncollider/pythiatutorial, so I'm assuming you meant electronioncollider/pythia-eic-tutorial.
The default working directory for that image is /code, so the command on Windows should look like:
docker run --rm -v //c/Users/AB/project://code -p 8888:8888 electronioncollider/pythia-eic-tutorial:latest
The working directory can be changed with -w, so the following should work as well:
docker run --rm -w //whatever -v //c/Users/AB/project://whatever -p 8888:8888 electronioncollider/pythia-eic-tutorial:latest
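(A hedged sanity check, not from the original answer: before blaming the image, it can help to confirm that the Windows folder is visible inside a container at all. busybox is used here only because it is tiny; any image with a shell would do.)
docker run --rm -v //c/Users/AB/project://code busybox ls -la //code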
Edit:
The electronioncollider/pythia-eic-tutorial:latest image has only one variant, built for linux/amd64, which means it is meant to run on 64-bit Linux on an Intel or AMD processor.
You're not running it on Windows, but on a Linux VM that runs on your Windows host. Docker can access C:\Users\AB\project because it's mounted inside the VM as /c/Users/AB/project (although most likely it's actually C:\Users that's mounted as /c/Users). Therein lies the problem: the Windows and Linux permission models are incompatible, so the Windows directory is mounted with fixed permissions that allow all Linux users access. Docker then mounts that directory inside the container with the same permissions. Unfortunately, Jupyter wants some of the files it creates to have a very specific set of permissions (for security reasons). Since the permissions are fixed to a specific value, Jupyter cannot change them and breaks.
There are two possible solutions:
Get inside whatever VM Docker is running in, change to a directory that is not mounted from Windows, and run the container from there using the command from the tutorial/README:
docker run --rm -u `id -u $USER` -v $PWD:$PWD -w $PWD -p 8888:8888 electronioncollider/pythia-eic-tutorial:latest
and the files will appear in the directory that the command is run from.
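(A hedged sketch of that first option, assuming Docker Toolbox's default machine name, "default":)
docker-machine ssh default           # drop into the Linux VM that actually runs Docker
mkdir -p ~/pythia && cd ~/pythia     # a directory that is NOT mounted from Windows
docker run --rm -u `id -u $USER` -v $PWD:$PWD -w $PWD -p 8888:8888 electronioncollider/pythia-eic-tutorial:latest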
Use the modified image I created:
docker run --rm -v //c/Users/AB/project://code -p 8888:8888 forinil/pythia-eic-tutorial:latest
You can find the image on Docker Hub here. The source code is available on GitHub here.
Edit:
Due to changes in my version of the image, the proper command for it would be:
docker run -it --rm -v //c/Users/AB/project://code --entrypoint rivet forinil/pythia-eic-tutorial
I released a new version, so if you run docker pull forinil/pythia-eic-tutorial:latest, you'll be able to use both the command above and:
docker run -it --rm -v //c/Users/AB/project://code forinil/pythia-eic-tutorial rivet
That being said, I did not receive any permission errors while testing either the old or the new version of the image.
I hope you understand that due to how Docker Toolbox works, you won't be able to use aliases the way the tutorial says you would on Linux.
For one thing, you'll only have access to files inside the directory C:\Users\AB\project; for another, file paths inside the container will be different from those outside it, e.g. C:\Users\AB\project\notebooks\pythiaRivet.ipynb will be available inside the container as /code/notebooks/pythiaRivet.ipynb.
Note on asking questions:
You've been banned from asking questions because your questions are low quality. Please read the guidelines before asking any more.

Connecting Spyder to Remote Jupyter Notebook in a Docker Container

I have been trying to connect Spyder to a docker container running on a remote server and failing time and again. In short, what I am trying to achieve is: Spyder on my local machine connecting to a Jupyter kernel running inside a Docker container on the remote server.
Currently I am launching the docker container on the remote machine through ssh with
docker run --runtime=nvidia -it --rm --shm-size=2g -v /home/timo/storage:/storage -v /etc/passwd:/etc/passwd -v /etc/group:/etc/group --ulimit memlock=-1 -p 8888:8888 --ipc=host ufoym/deepo:all-jupyter
so I am forwarding on port 8888. Then inside the docker container I am running
jupyter notebook --no-browser --ip=0.0.0.0 --port=8888 --allow-root --notebook-dir='/storage'
OK, now for the Spyder part. As per the instructions here, I go to ~/.local/share/jupyter/runtime, where I find the following files:
kernel-ada17ae4-e8c3-4e17-9f8f-1c029c56b4f0.json nbserver-11-open.html nbserver-21-open.html notebook_cookie_secret
kernel-e81bc397-05b5-4710-89b6-2aa2adab5f9c.json nbserver-11.json nbserver-21.json
Not knowing which one to take, I copy them all to my local machine.
I now go to Consoles -> Connect to an Existing Kernel, which gives me the "Connect to an Existing Kernel" window. I fill it out using my actual remote IP address and, for the connection info, the first of the json files. I hit enter and Spyder goes dark and crashes.
This happens regardless of which connection info file I choose. So, my questions are:
1: Am I doing all of this correctly? I have found lots of instructions for how to connect to remote servers, but not so far for specifically connecting to a Jupyter notebook in a Docker container on a remote server.
2: If yes, then what else can I do to troubleshoot the issues I am encountering?
I should also note that I have no problems connecting to the Jupyter Notebook through the browser on my local machine. It's just that I would prefer to be working with Spyder as my IDE.
Many thanks in advance!
This isn't a solution so much as a workaround, but sshfs might be of help.
Use sshfs to mount the remote machine's home directory on a local directory; then your local copy of Spyder can edit the files as if they were local.
sshfs remotehost.com:/home/user/ ./remote-host/
It typically takes about half a second to upload the changes to an AWS host when I hit save in Spyder, which is an acceptable delay for me. When it's time to run the code, ssh into the remote machine and run the code from an IPython shell. It's not elegant, but it does work.
I'm not expecting this to be the best answer, but maybe you can use it as a stopgap solution.
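(A short sketch of that workaround, with an example host name and paths; the code itself still runs on the remote machine, only the editing happens locally:)
mkdir -p ./remote-host
sshfs remotehost.com:/home/user/ ./remote-host/   # now edit the mounted files in the local Spyder
ssh remotehost.com                                # in another terminal, log in to the remote machine
ipython                                           # ...and run the edited code there
fusermount -u ./remote-host                       # detach the mount when done (umount ./remote-host on macOS)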
I had the same problem as you. I got it working, maybe a bit clumsily, as I am totally new to docker. Here are my steps and notes on where we differ; hope this helps:
Launch the docker container on the remote machine:
docker run --gpus all --rm -ti --net=host -v /my_storage/data:/home/data -v /my_storage/JSON:/root/.local/share/jupyter/runtime repo/tensorflow:20.03-tf2-py3
I use a second volume mount in order to get the kernel.json file to my local computer. I couldn't manage to access it directly from docker via ssh, as it lives in the /root/ folder of the docker container, with root-only access. If you know how to read it from there directly, I'll be happy to learn. My workaround is:
On the remote machine, create a JSON/ directory and map it to the jupyter --runtime-dir inside the container. Once the kernel is created, access the kernel-xxx.json file through this volume mount, copy it to the local machine, and chmod it.
Launch the IPython kernel in the container:
ipython kernel
You are launching a Jupyter notebook server. I suspect this is the reason for your problem. I am not sure whether Spyder works with notebooks, but it does work with IPython kernels, and probably works even better with spyder-kernels.
Copy the kernel.json file from /remote_machine/JSON to the local machine and chmod it so it is readable.
Launch Spyder and use the local kernel.json and the ssh settings. This part is the same as yours.
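(The steps above condensed into one hedged sketch; the host name "remote-server" and the kernel file name are placeholders:)
# on the remote machine
mkdir -p /my_storage/JSON
docker run --gpus all --rm -ti --net=host \
  -v /my_storage/data:/home/data \
  -v /my_storage/JSON:/root/.local/share/jupyter/runtime \
  repo/tensorflow:20.03-tf2-py3
# inside the container
ipython kernel                                    # prints the name of the kernel-xxx.json it creates
# back on the remote machine
sudo chmod a+r /my_storage/JSON/kernel-xxx.json
# on the local machine
scp remote-server:/my_storage/JSON/kernel-xxx.json .
# then in Spyder: Consoles -> Connect to an existing kernel -> this json file plus the ssh settings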
Not enough reputation to add a comment, but to chime in on @asim's solution: I was able to have my locally installed Spyder connect to a kernel running in a container on a remote machine. There is a bit of manual work, but I am okay with this since I can get much more done with Spyder than with other IDEs.
docker run --rm -it --net=host -v /project_directory_remote_machine:/container_project_directory image_id bash
From the container:
python -m spyder_kernels.console --matplotlib='inline' --ip=127.0.0.1 -f=/container_project_directory/connection_file.json
From the remote machine, chmod connection_file.json so it can be opened, then open it and copy/paste its content into a file on the local machine :) Use that json file to connect to the remote kernel, following the steps in the sources below:
https://medium.com/@halmubarak/connecting-spyder-ide-to-a-remote-ipython-kernel-25a322f2b2be
https://mazzine.medium.com/how-to-connect-your-spyder-ide-to-an-external-ipython-kernel-with-ssh-putty-tunnel-e1c679e44154

Input file not found in docker command on windows

Complete docker noob here. I installed Docker Desktop on Windows and am trying to follow the commands in this link to set up the OSRM backend on my machine. I've downloaded the dataset for India (india-latest.osm.pbf) to D:/docker
and am running the commands from that location.
docker run -t -v "${PWD}:/data" osrm/osrm-backend osrm-extract -p /opt/car.lua /data/india-latest.osm.pbf
fails with
[error] Input file /data/india-latest.osm.pbf not found!
I just don't understand WHY it doesn't work. According to the OSRM documentation of the docker command:
The file /data/india-latest.osm.pbf inside the container is referring
to "${PWD}/india-latest.osm.pbf" on the host.
But that's not the case; I am running from D:/docker, so it should find india-latest.osm.pbf without a problem. This is really confusing to me, even though it must be something basic.
It was due to a bug in Docker (https://github.com/docker/for-win/issues/1712): when you change your Windows password, commands that access the host filesystem silently fail until you reauthenticate.
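(A hedged way to confirm and clear this, not from the original answer: re-enter the Windows credentials under Docker's shared-drive settings (Settings -> Shared Drives, or Resources -> File Sharing in newer releases), then check that the mount is no longer empty; busybox is used only as a small throwaway image.)
docker run --rm -v "${PWD}:/data" busybox ls -la /data    # india-latest.osm.pbf should be listed here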
