PPO doesn't optimize MountainCarContinuous-v0
aliamiri1380 opened this issue
aliamiri1380 commented
Hi,
the PPO implementation works well overall, but unfortunately it fails to learn on MountainCarContinuous-v0.
Do you know of any solution to this problem?
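For context, this is likely the environment rather than the implementation: MountainCarContinuous-v0 has a sparse reward (per the Gymnasium docs, a penalty of 0.1 × action² every step, and +100 only when the car reaches the goal), so a policy that explores vigorously without ever reaching the goal scores worse than one that does nothing, and PPO's on-policy updates tend to collapse toward near-zero actions. A minimal sketch of that return structure (the reward formula here is taken from the environment's documentation and is an assumption about the exact version being used):

```python
# Sketch of the MountainCarContinuous-v0 return structure (sparse reward):
# each step costs 0.1 * action**2, and only reaching the goal pays +100.
def episode_return(actions, reached_goal):
    """Return of one episode given the per-step actions taken."""
    action_penalty = sum(0.1 * a * a for a in actions)
    goal_bonus = 100.0 if reached_goal else 0.0
    return goal_bonus - action_penalty

# A lazy policy (actions near 0) that never reaches the goal...
lazy = episode_return([0.0] * 999, reached_goal=False)

# ...outscores a vigorous policy that also never reaches the goal,
# which is why naive exploration gets punished into inaction.
vigorous = episode_return([1.0] * 999, reached_goal=False)

print(lazy, vigorous)  # lazy is higher until the goal is actually reached
```

Common workarounds reported for this environment include raising the entropy bonus, normalizing observations/rewards, or using stronger exploration noise, though which of these applies depends on the specific implementation here.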