I think it’s important to point out that Universe never seemed to take off in the AI community. This may be due to the lag introduced by VNC. Universe also appears to depend on a much older version of OpenAI Gym, which leads me to believe it may no longer be actively supported. A good post that discusses these issues and an attempt to solve them can be found here.

In this tutorial I will walk through how to install OpenAI’s Gym and Universe. The Universe starter agent requires TensorFlow and OpenCV, and I highly recommend you follow this tutorial on how to set up an isolated Python environment, install OpenCV, and gain optimizations from building TensorFlow from source.

Install Docker

Docker is an open-source project that automates the deployment of applications inside software containers. It is also used by Open AI’s Universe.

Start by:

sudo apt-get install \
apt-transport-https \
ca-certificates \
curl \
software-properties-common

For Ubuntu 14.04:

sudo apt-get install \
linux-image-extra-$(uname -r) \
linux-image-extra-virtual

Followed by:

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add - && \
sudo add-apt-repository \
"deb [arch=amd64] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) \
stable"
And to finish:

sudo apt-get update && \
sudo apt-get install docker-ce

And test installation by:

sudo service docker start && \
sudo docker run hello-world

You should see a Hello from Docker! message informing you that your installation appears to be working correctly.
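If you want to double-check the install from the command line before running containers, a small sketch like the following reports whether the docker CLI made it onto your PATH (the exact version string will vary with your install):

```shell
# Report whether the docker CLI is available; degrade gracefully if it is not.
if command -v docker >/dev/null 2>&1; then
    echo "docker CLI found: $(docker --version)"
else
    echo "docker CLI not found - the installation may not have completed"
fi
```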

To avoid having to prefix every docker command with sudo, add your user to the docker group:

sudo groupadd docker && \
sudo usermod -aG docker $USER
sudo reboot
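After rebooting, you can confirm the group change took effect. This is a quick sketch that checks whether the current user is listed in the docker group (the change only applies to sessions started after the reboot or a fresh login):

```shell
# Check whether the current user belongs to the "docker" group.
if id -nG | grep -qw docker; then
    echo "current user is in the docker group"
else
    echo "current user is NOT in the docker group yet"
fi
```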

Install OpenAI’s Gym & Universe

We can now get started with installing OpenAI’s Gym and Universe.

pip install universe
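Once pip finishes, a quick sanity check can catch a broken install early. This is only a sketch: it just verifies that the packages import, using whichever python happens to be on your PATH:

```shell
# Sanity-check that gym and universe can be imported; prints a hint if not.
PY=$(command -v python || command -v python3)
"$PY" - <<'EOF'
for pkg in ("gym", "universe"):
    try:
        __import__(pkg)
        print(pkg + " imported OK")
    except ImportError:
        print(pkg + " is NOT importable - check your pip environment")
EOF
```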

We can also clone OpenAI’s starter agent, which will train an agent using the A3C algorithm.

cd ~ && git clone https://github.com/openai/universe-starter-agent.git && \
cd ~/universe-starter-agent && \
python train.py --num-workers 4 --env-id PongDeterministic-v0 --log-dir /tmp/vncpong --visualise