thibo73800/metacar

Import metacar environment to python

HsnAtwi opened this issue · 2 comments

Hello,
I'm working on a Python project with reinforcement learning, and I found the Metacar project very useful.

The thing is that I'm still new to this domain, and I don't know JavaScript.
So how would I import this environment into my Python IDE?

Any help would be great.

I think the proper way would be to wrap the JavaScript environment in Python code, and at the same time make the wrapper compatible with the OpenAI Gym API.
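To illustrate what such a wrapper might look like, here is a minimal sketch of a Gym-style `reset()`/`step()` interface on the Python side. Everything here is an assumption for illustration: metacar ships no Python bridge, so the `DummyBridge` class, the `"reset"`/`"step"` message names, the action list, and the payload shapes are all hypothetical stand-ins for a real Python-to-JavaScript channel (e.g. a WebSocket connection to a browser tab running metacar), which would also need code written on the JS side.

```python
class DummyBridge:
    """Hypothetical stand-in for a Python<->JavaScript bridge
    (e.g. a WebSocket link to a browser running metacar).
    It just returns canned responses so the wrapper is runnable."""

    def call(self, method, payload=None):
        if method == "reset":
            # Pretend the JS side sent back an initial observation.
            return {"lidar": [[0.0] * 5 for _ in range(5)], "speed": 0.0}
        if method == "step":
            # Pretend the JS side advanced one tick and replied.
            return {
                "state": {"lidar": [[0.0] * 5 for _ in range(5)], "speed": 1.0},
                "reward": 0.0,
                "done": False,
            }
        raise ValueError(f"unknown method: {method}")


class MetacarEnv:
    """Gym-style wrapper: exposes reset()/step() like gym.Env and
    delegates the actual simulation to the JS side via the bridge."""

    # Assumed discrete action set; the real environment's actions may differ.
    ACTIONS = ["forward", "left", "right", "brake", "wait"]

    def __init__(self, bridge):
        self.bridge = bridge

    def reset(self):
        """Ask the JS environment to reset and return the first observation."""
        return self.bridge.call("reset")

    def step(self, action):
        """Send a discrete action index, return (state, reward, done, info)."""
        if not 0 <= action < len(self.ACTIONS):
            raise ValueError(f"invalid action index: {action}")
        result = self.bridge.call("step", {"action": self.ACTIONS[action]})
        return result["state"], result["reward"], result["done"], {}


env = MetacarEnv(DummyBridge())
obs = env.reset()
state, reward, done, info = env.step(0)
```

With this shape in place, any Gym-compatible RL agent loop (`obs = env.reset()`, then repeated `env.step(action)`) can drive the environment without the Python side knowing anything about JavaScript; only the bridge implementation would change.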