How to run a Jupyter notebook in a particular folder in Docker

I have set up Jupyter Notebook on the correct port in Docker. Every time, I need to upload data into the notebook to do analysis. Is there any way I can point my Jupyter file location at a particular folder, keeping in mind that I'm using Docker?

You need to mount your folder as a volume in the Docker container. For example, if you use the jupyter/all-spark-notebook image, you can run:
docker run -it --rm -p 8888:8888 -p 4040:4040 -v your-path:/home/jovyan/workspace jupyter/all-spark-notebook
Replace your-path with the path that contains your notebooks.
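For instance, assuming your notebooks live in ~/notebooks on the host (a hypothetical path for illustration), the command becomes:
docker run -it --rm -p 8888:8888 -p 4040:4040 -v ~/notebooks:/home/jovyan/workspace jupyter/all-spark-notebook
Anything saved under /home/jovyan/workspace in the notebook UI is then written to ~/notebooks on the host, so it survives container restarts.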

Related

Docker: error response from daemon: invalid mode: /tf

I'm new to using Docker, and my objective is to bind mount a file path on my host machine (shown in the directory below) into a Docker container so I can:
Run a Jupyter Notebook instance without losing the data every time I end my terminal session
Link my Jupyter Notebook to the same path where my training data resides
I have looked at many threads on the topic, to little avail. I am using Linux Mint and run the command shown below:
sudo docker run -it --rm --gpus all -v "$(pwd):/media/hossamantarkorin/Laptop Data II/1- Educational/ML Training/Incident Detection/I75_I95 RITIS":"/tf" -p 8888:8888 tensorflow/tensorflow:2.3.0rc1-gpu-jupyter
What am I doing wrong here?
Thanks,
Hossam
This usually happens when Docker is not running.
Try sudo service docker start before entering your command.
I just wanted to provide an update on this. The -v flag takes host-path:container-path[:mode], so the original command's "$(pwd):/media/...":"/tf" gives Docker three colon-separated fields, and it reads /tf as a mount mode, hence invalid mode: /tf. The easiest way to work in your local directory is to:
Change directory to where you want to work.
Run your Docker container while bind mounting your pwd:
sudo docker run -it --rm --gpus all -v "$(pwd):/tf" -p 8888:8888 tensorflow/tensorflow:2.3.0rc1-gpu-jupyter
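If you would rather mount the long data directory directly instead of changing into it first, the long-form --mount flag sidesteps the colon-parsing pitfall, because its fields are comma-separated. A sketch with the same image and the path from the question:
sudo docker run -it --rm --gpus all --mount "type=bind,source=/media/hossamantarkorin/Laptop Data II/1- Educational/ML Training/Incident Detection/I75_I95 RITIS,target=/tf" -p 8888:8888 tensorflow/tensorflow:2.3.0rc1-gpu-jupyter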

How to save and edit a Jupyter notebook in a host directory using the official TensorFlow Docker container?

I want to use the official TensorFlow Docker images to create and edit a Jupyter notebook stored on the host.
I'm a little confused about which switches I need to provide. To run a TensorFlow script on the host, the docs suggest:
docker run -it --rm -v $PWD:/tmp -w /tmp tensorflow/tensorflow python ./script.py
...and to run the Jupyter service:
docker run -it -p 8888:8888 tensorflow/tensorflow:nightly-py3-jupyter
When I try merging the switches to run Jupyter and mount the host volume:
docker run -it --rm -v $PWD:/tmp -w /tmp -p 8888:8888 tensorflow/tensorflow:nightly-py3-jupyter
...it's still accessing notebooks stored in the container, not the host.
Notebooks are stored inside the container under /tf, so mounting your host directory there will do the trick:
docker run -it --rm -v $PWD:/tf -p 8888:8888 tensorflow/tensorflow:nightly-py3-jupyter
The first command you mentioned is used to run a TensorFlow program developed on the host machine, not a notebook.
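To double-check the mount from inside Jupyter, you can run a shell escape in a notebook cell (a quick sanity check, not part of the original answer):
!ls /tf
If your host files are listed, any notebook you create or edit under /tf is saved back to $PWD on the host.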

Show volume files in the GUI of Docker Jupyter notebook

I run Jupyter Notebook with Docker and am trying to mount a local directory onto the intended Docker volume, but I am unable to see my files in the Jupyter notebook. The Docker command is
sudo nvidia-docker create -v ~/tf/src -it -p 8888:8888 -e PASSWORD=password --name container_name gcr.io/tensorflow/tensorflow:latest-gpu
and the GUI of the Jupyter notebook looks like this: [screenshot of the Jupyter file listing omitted]
However, ~/tf/src does not show up in the Jupyter GUI.
What is needed for the files to show up in Jupyter? Am I initializing the container incorrectly for this?
The way you mount your volume is incorrect, I think: -v ~/tf/src names only one path. It should be
-v /host/directory:/container/directory
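Applied to the command in the question, a corrected invocation might look like this (a sketch mounting onto /notebooks, the folder this image's Jupyter server lists):
sudo nvidia-docker create -v ~/tf/src:/notebooks -it -p 8888:8888 -e PASSWORD=password --name container_name gcr.io/tensorflow/tensorflow:latest-gpu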
Ferdi D's answer targets only files inside the interpreter, not files inside the Jupyter GUI, which makes things a little confusing. I address the title, Show volume files in the GUI of Docker Jupyter notebook, more generally by showing the files inside the Jupyter notebook.
Files inside the interpreter
The -v flag gets you the files in the interpreter or the notebook, but not necessarily in the Jupyter GUI, for which you run
$ docker run --rm -it -p 6780:8888 -v "$PWD":/home/jovyan/ jupyter/r-notebook
because the mount point depends on the distribution and hence on its path. Here, you ask for your current directory to be mounted at Jupyter's path /home/jovyan.
Files inside the Jupyter GUI
To get the files into the Jupyter GUI:
OS X
If you mount anything other than /home/jovyan with the current Jupyter version, the files will not appear in the Jupyter GUI, so use
$ docker run --rm -it -p 6780:8888 -v "$PWD":/home/jovyan/ jupyter/r-notebook
Some other distros
$ docker run --rm -it -p 6780:8888 -v "$PWD":/tmp jupyter/r-notebook
More generally
To check whether /home/jovyan/ or /tmp is the right target, you can run getwd() in R to see your working directory.
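For example, in a cell of the r-notebook image (a quick check; the output assumes the image's default working directory):
getwd()
# [1] "/home/jovyan"
If getwd() matches the container path you mounted onto, your files should appear in the GUI.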
Posting this as an answer since the location seems to have changed and the accepted answer doesn't spell out in full how to get your local directory to show up in TensorFlow's Jupyter (substitute an appropriate <localdir> and <dockerdir>):
docker run --runtime=nvidia -it --name tensorflow -p 8888:8888 -v ~/<localdir>:/tf/<dockerdir> tensorflow/tensorflow:nightly-jupyter
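On Docker 19.03 and newer with the NVIDIA Container Toolkit installed, the --gpus flag can replace the legacy --runtime=nvidia; an equivalent sketch:
docker run --gpus all -it --name tensorflow -p 8888:8888 -v ~/<localdir>:/tf/<dockerdir> tensorflow/tensorflow:nightly-jupyter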
Karl L's solution follows below; it was moved here from the question for everyone to judge and to make the question easier to read.
Solution
sudo nvidia-docker create -v /Users/user/tf/src:/notebooks -it -p 8888:8888 -e PASSWORD=password --name container_name gcr.io/tensorflow/tensorflow:latest-gpu
As @fendi-d pointed out, I was mounting my volume incorrectly.
I was also pointed to the incorrect mount directory, and I found the correct one in the TensorFlow Dockerfile:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/tools/dockerfiles/dockerfiles/gpu.Dockerfile
This configures the Jupyter notebook and then copies sample notebooks to /notebooks:
# Set up our notebook config.
COPY jupyter_notebook_config.py /root/.jupyter/
# Copy sample notebooks.
COPY notebooks /notebooks
After I ran with the correct mount path, Jupyter showed my files located in /Users/user/tf/src.

Access data using Docker Jupyter notebooks with GPU support on Google Cloud

I have followed this link:
https://blog.paperspace.com/jupyter-notebook-with-a-gpu-the-easy-way/
Now my data in the cloud is in the folder ~/projects/tf-example/data; how do I access this data in my Jupyter notebook?
I saw an option to mount a volume using a Docker command, but I am still not sure how to access it from the notebook.
Got it. I followed this link:
https://www.dataquest.io/blog/docker-data-science/
and it worked.
Use this command; let's say your files are locally in ~/projects:
sudo nvidia-docker run --rm --name tf-notebook -p 8888:8888 -p 6006:6006 -v ~/projects:/notebooks/ gcr.io/tensorflow/tensorflow:latest-gpu jupyter notebook --allow-root
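With ~/projects mounted at /notebooks, the data from ~/projects/tf-example/data shows up inside the container at /notebooks/tf-example/data, so the notebook can read it from that path. A quick check from a notebook cell:
!ls /notebooks/tf-example/data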

How to access Docker (with Spark) file systems

Suppose I am running CentOS. I installed Docker, then ran the image.
Suppose I use this image:
https://github.com/jupyter/docker-stacks/tree/master/pyspark-notebook
Then I run
docker run -it --rm -p 8888:8888 jupyter/pyspark-notebook
Now, I can open the browser at localhost:8888, create a new Jupyter notebook, type code, run it, and so on.
However, how can I access the files I created and, for example, commit them to GitHub? Furthermore, if I already have some code on GitHub, how can I pull it and access it from Docker?
Thank you very much,
You need to mount a volume:
docker run -it --rm -p 8888:8888 -v /opt/pyspark-notebook:/home/jovyan jupyter/pyspark-notebook
You could simply have executed !pwd in a new notebook to find which folder the work was being stored in, and then mounted that as a volume. When you run it as above, the files are available on your host in /opt/pyspark-notebook.
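Since this is a bind mount, you can also run git on the host side directly against /opt/pyspark-notebook; for example (the file name here is a placeholder):
cd /opt/pyspark-notebook
git init
git add my-notebook.ipynb
git commit -m "Add notebook from the Docker Jupyter container"
To pull existing code from GitHub, clone it into that folder on the host, and it appears in the Jupyter file listing (the repository URL is a placeholder):
git clone https://github.com/<user>/<repo>.git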
