Import metacar environment to python
HsnAtwi opened this issue · 2 comments
HsnAtwi commented
Hello,
I'm working on a Python project with reinforcement learning, and I found the Metacar project very useful.
The thing is, I'm still new to this domain and I don't know JavaScript.
So how would I import this environment into my Python IDE?
Any help would be great.
AI-Guru commented
I think the proper way would be to wrap the JavaScript environment in Python code, while making it compatible with the OpenAI Gym interface.
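A minimal sketch of what such a wrapper could look like, assuming the Gym-style `reset`/`step` API. The JavaScript bridge is stubbed out here (in practice it might drive the browser-based Metacar page, e.g. via Selenium); the class name, action list, and `_js_call` helper are all hypothetical, and a real implementation would subclass `gym.Env`:

```python
import random

class MetacarEnvSketch:
    """Exposes a Gym-style reset/step interface over a stubbed JS backend."""

    # Hypothetical discrete action set (e.g. forward, left, right, brake, no-op)
    ACTIONS = [0, 1, 2, 3, 4]

    def __init__(self):
        self._steps = 0

    def _js_call(self, command):
        # Placeholder for the JavaScript bridge; a real wrapper would
        # execute `command` in the browser and parse the result.
        # Here we return a fake 5-dimensional observation.
        return [random.random() for _ in range(5)]

    def reset(self):
        self._steps = 0
        return self._js_call("reset")

    def step(self, action):
        assert action in self.ACTIONS
        obs = self._js_call(f"step({action})")
        reward = 0.0          # the real env would report reward from JS
        self._steps += 1
        done = self._steps >= 100   # arbitrary episode cutoff for the stub
        return obs, reward, done, {}

env = MetacarEnvSketch()
obs = env.reset()
obs, reward, done, info = env.step(0)
```

Once the bridge is real, any Gym-compatible RL library could train against it without knowing JavaScript is underneath.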
AI-Guru commented
Current WIP: https://github.com/AI-Guru/gym-metacar