Deep Learning

If you are involved in deep learning at Imperial, you will know how difficult it is to install and configure all the drivers, libraries and packages on your computer.

To promote a standard deep learning environment, we recommend adopting Docker together with the Deepo image.

Deepo contains most of the popular deep learning frameworks: Theano, TensorFlow, Sonnet, PyTorch, Keras, Lasagne, MXNet, CNTK, Chainer, Caffe and Torch.

Quick Start

Step 1. Install Docker and nvidia-docker.

Step 2. Obtain the Deepo image.

Get the image from Docker Hub (recommended)
docker pull ufoym/deepo
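Once the pull completes, you can confirm the image is available locally (this assumes the Docker daemon is running and you pulled the default tag as above):

```shell
# List the locally cached Deepo image; the pull above should have created an entry
docker images ufoym/deepo
```

If the list is empty, the pull did not complete; re-run docker pull and check your network connection.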

Usage

Now you can try this command:

nvidia-docker run --rm ufoym/deepo nvidia-smi

This should print the output of nvidia-smi, confirming that Deepo can use the GPU from inside a Docker container. If it does not work, search the issues section of the nvidia-docker GitHub repository; many solutions are already documented there.

To get an interactive shell in a container that will not be automatically deleted after you exit, run:

nvidia-docker run -it ufoym/deepo bash
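From that interactive shell, you can sanity-check that the bundled frameworks are importable (a quick sketch; the exact versions printed will vary with the image tag you pulled):

```shell
# Run these inside the Deepo container's bash shell
python -c "import torch; print(torch.__version__)"
python -c "import tensorflow as tf; print(tf.__version__)"
python -c "import keras; print(keras.__version__)"
```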

If you want to share your data and configurations between the host (your machine or VM) and the container in which you are using Deepo, use the -v option, e.g.

nvidia-docker run -it -v /host/data:/data -v /host/config:/config ufoym/deepo bash

This will make /host/data on the host visible as /data in the container, and /host/config as /config. Such isolation reduces the chances of your containerized experiments overwriting or using the wrong data.
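The round trip can be sketched as follows (/host/data is a placeholder path from the example above; substitute a directory that exists on your machine):

```shell
# On the host: create a directory and a sample file to share
mkdir -p /host/data
echo "hello from the host" > /host/data/sample.txt

# Start a container with the directory mounted at /data
nvidia-docker run -it -v /host/data:/data ufoym/deepo bash

# Inside the container: the host file is visible under /data
cat /data/sample.txt

# Files written under /data from inside the container
# persist in /host/data on the host after you exit
echo "results" > /data/output.txt
```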