ppo-LunarLander-v2 / DeepRL-Unit_1-LunarLander / _stable_baselines3_version
osman93: The initial commit for the LunarLander task using PPO.
9db50fa
1.6.2
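The final line, `1.6.2`, is the content of the `_stable_baselines3_version` file, which records the stable-baselines3 release the model was trained with. A minimal sketch of reading that pin from a local checkout of the repo (the write step is only to make the example self-contained; in practice the file already exists):

```python
from pathlib import Path

# Recreate the repo's version-pin file so this sketch runs standalone.
Path("_stable_baselines3_version").write_text("1.6.2\n")

# Read back the pinned stable-baselines3 version, as a consumer of the
# repo might before loading the saved PPO policy.
pinned = Path("_stable_baselines3_version").read_text().strip()
print(pinned)  # → 1.6.2
```

Matching the locally installed `stable_baselines3.__version__` against this pin helps avoid loading a checkpoint with an incompatible library release.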