# ppo-Sokoban-v0
A Sokoban-v0 model trained with PPO (Proximal Policy Optimization), uploaded by zhangpaipai (commit 1f588f6).