
Chrome-Dino-Reinforcement-Learning


Blog post: https://blog.paperspace.com/dino-run/

A deep convolutional neural network that learns to play Google Chrome's offline Dino Run game by learning action patterns from visual input, using a model-free reinforcement learning algorithm.

NOTE: This is a basic implementation with some limitations. Please refer to https://github.com/Paperspace/DinoRunTutorial, where I've used a GPU VM for better results, with scores up to 4000.

Refer to the Jupyter notebook for the detailed implementation:
https://github.com/ravi72munde/Chrome-Dino-Reinforcement-Learning/blob/master/Reinforcement%20Learning%20Dino%20Run.ipynb
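
The notebook contains the full implementation. For orientation, below is a minimal Keras sketch of the kind of convolutional Q-network such an agent can use; the input shape (stacked 80x80 grayscale frames), layer sizes, and two-action output (do nothing / jump) are illustrative assumptions, not necessarily the notebook's exact architecture.

# Minimal sketch of a convolutional Q-network in Keras (illustrative only).
# It maps a stack of preprocessed game frames to one Q-value per action.
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense
from keras.optimizers import Adam

IMG_ROWS, IMG_COLS, FRAME_STACK = 80, 80, 4   # assumed preprocessing output
NUM_ACTIONS = 2                                # 0 = do nothing, 1 = jump

def build_q_network():
    model = Sequential()
    model.add(Conv2D(32, (8, 8), strides=(4, 4), activation='relu',
                     input_shape=(IMG_ROWS, IMG_COLS, FRAME_STACK)))
    model.add(Conv2D(64, (4, 4), strides=(2, 2), activation='relu'))
    model.add(Conv2D(64, (3, 3), strides=(1, 1), activation='relu'))
    model.add(Flatten())
    model.add(Dense(512, activation='relu'))
    model.add(Dense(NUM_ACTIONS))              # linear outputs = Q-values
    model.compile(loss='mse', optimizer=Adam(lr=1e-4))
    return model

In a typical DQN-style loop, the action with the highest predicted Q-value is taken, with occasional random actions for exploration.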

Installation

Start by cloning the repository:

$ git clone https://github.com/ravi72munde/Chrome-Dino-Reinforcement-Learning.git

Dependencies can be installed with pip install, or with conda install in an Anaconda environment.
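
For example, with pip (the package names below are the usual PyPI names and are an assumption here, unpinned; Keras also needs a backend such as TensorFlow; ChromeDriver is a separate download, not a pip package):

$ pip install selenium opencv-python Pillow keras tensorflow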

Dependencies

  • Python 3.6
  • Selenium
  • OpenCV
  • Pillow (PIL)
  • Keras
  • ChromeDriver (Chromium driver for Selenium)
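
Selenium and ChromeDriver connect the learner to the browser. Below is a minimal, illustrative sketch (not the notebook's exact code) of opening the offline game, sending a jump, and grabbing a downscaled grayscale frame for the network; the local chromedriver path, the chrome://dino URL, and the 80x80 frame size are assumptions for this example.

# Illustrative sketch: drive the Dino Run game with Selenium (Selenium 3-style API)
# and capture frames with OpenCV. Paths and sizes are assumptions.
import base64
import cv2
import numpy as np
from selenium import webdriver
from selenium.webdriver.common.keys import Keys

driver = webdriver.Chrome(executable_path='./chromedriver')  # assumed driver location
driver.get('chrome://dino')                                  # Chrome's offline game

body = driver.find_element_by_tag_name('body')
body.send_keys(Keys.SPACE)  # space starts the game and makes the dino jump

def grab_frame():
    # Screenshot the page and convert it to a small grayscale frame for the network.
    png = base64.b64decode(driver.get_screenshot_as_base64())
    img = cv2.imdecode(np.frombuffer(png, dtype=np.uint8), cv2.IMREAD_GRAYSCALE)
    return cv2.resize(img, (80, 80))

frame = grab_frame()

Frames like this can then be stacked and fed to the Q-network sketched above to pick the next action.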

[gameplay GIF]

Sample Gameplay

https://youtu.be/0oOOqGFmlDs
