nric/CartpoleSimpleDQN
This is a simple implementation of a Deep Q Network (DQN) learning agent, tested on OpenAI Gym's CartPole environment.
Jupyter Notebook
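For orientation, below is a minimal sketch of the kind of agent the notebook implements: a small Q-network trained with experience replay and an epsilon-greedy policy on CartPole. The notebook's actual framework, network architecture, and hyperparameters are not shown here, so this PyTorch version with illustrative values is an assumption, not the repository's exact code.

```python
import random
from collections import deque

import gym
import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim

# Illustrative hyperparameters (not taken from the notebook)
GAMMA = 0.99
EPS_START, EPS_END, EPS_DECAY = 1.0, 0.01, 0.995
BATCH_SIZE = 64
MEMORY_SIZE = 10_000
EPISODES = 300

# Assumes the classic gym API (gym < 0.26): reset() returns obs, step() returns a 4-tuple.
env = gym.make("CartPole-v1")
n_obs = env.observation_space.shape[0]
n_actions = env.action_space.n

# Small fully connected Q-network: state -> one Q-value per action
q_net = nn.Sequential(nn.Linear(n_obs, 64), nn.ReLU(),
                      nn.Linear(64, 64), nn.ReLU(),
                      nn.Linear(64, n_actions))
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
memory = deque(maxlen=MEMORY_SIZE)
epsilon = EPS_START

def act(state):
    """Epsilon-greedy action selection."""
    if random.random() < epsilon:
        return env.action_space.sample()
    with torch.no_grad():
        return q_net(torch.as_tensor(state, dtype=torch.float32)).argmax().item()

def learn():
    """One gradient step on a random minibatch sampled from the replay buffer."""
    if len(memory) < BATCH_SIZE:
        return
    batch = random.sample(memory, BATCH_SIZE)
    states, actions, rewards, next_states, dones = map(np.array, zip(*batch))
    states = torch.as_tensor(states, dtype=torch.float32)
    actions = torch.as_tensor(actions, dtype=torch.int64).unsqueeze(1)
    rewards = torch.as_tensor(rewards, dtype=torch.float32)
    next_states = torch.as_tensor(next_states, dtype=torch.float32)
    dones = torch.as_tensor(dones, dtype=torch.float32)

    # Q-learning target: r + gamma * max_a' Q(s', a'), zeroed at terminal states
    q_values = q_net(states).gather(1, actions).squeeze(1)
    with torch.no_grad():
        targets = rewards + GAMMA * q_net(next_states).max(1).values * (1 - dones)
    loss = nn.functional.mse_loss(q_values, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

for episode in range(EPISODES):
    state = env.reset()
    total_reward, done = 0.0, False
    while not done:
        action = act(state)
        next_state, reward, done, _ = env.step(action)
        memory.append((state, action, reward, next_state, float(done)))
        state = next_state
        total_reward += reward
        learn()
    # Decay exploration once per episode
    epsilon = max(EPS_END, epsilon * EPS_DECAY)
    print(f"episode {episode}  return {total_reward:.0f}  epsilon {epsilon:.2f}")
```

This sketch omits a separate target network, which many "simple DQN" examples also skip; adding one (a periodically synced copy of `q_net` used to compute the targets) is the usual next step for more stable training.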