This repository contains the implementation of a Gymnasium environment for the Flappy Bird game. The implementation of the game's logic and graphics was based on the flappy-bird-gym project by @Talendar.
The "FlappyBird-v0" environment, yields simple numerical information about the game's state as observations or RGB-arrays (images) representing the game's screen.
- the last pipe's horizontal position
- the last top pipe's vertical position
- the last bottom pipe's vertical position
- the next pipe's horizontal position
- the next top pipe's vertical position
- the next bottom pipe's vertical position
- the next next pipe's horizontal position
- the next next top pipe's vertical position
- the next next bottom pipe's vertical position
- player's vertical position
- player's vertical velocity
- player's rotation

or, alternatively, an RGB-array (image) representing the game's screen.
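If you just want to confirm what the environment exposes, the standard Gymnasium `observation_space` attribute can be printed; the snippet below is only a minimal sketch and assumes the default numerical observations rather than the RGB-array alternative.

```python
# Minimal sketch: inspect the observation space of the default (numerical)
# observations. The space should describe the 12 values listed above.
import flappy_bird_gymnasium  # importing the package registers "FlappyBird-v0"
import gymnasium

env = gymnasium.make("FlappyBird-v0")
print(env.observation_space)
env.close()
```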
The action space consists of two discrete actions:

- 0 - do nothing
- 1 - flap
The reward function is:

- +0.1 - every frame it stays alive
- +1.0 - successfully passing a pipe
- -1.0 - dying
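As a concrete illustration of the action values and the reward signal, the sketch below tallies one episode's total reward; the "flap every 15 frames" rule is an arbitrary illustration, not an agent recommended by the package.

```python
# Sketch: tally the reward signal of one episode using the two actions
# (0 = do nothing, 1 = flap). The fixed flapping interval is arbitrary.
import flappy_bird_gymnasium  # importing the package registers "FlappyBird-v0"
import gymnasium

env = gymnasium.make("FlappyBird-v0")
obs, _ = env.reset()

total_reward = 0.0
frame = 0
while True:
    action = 1 if frame % 15 == 0 else 0  # flap occasionally, otherwise do nothing
    obs, reward, terminated, truncated, info = env.step(action)
    total_reward += reward  # +0.1 per frame alive, +1.0 per pipe passed, -1.0 on death
    frame += 1
    if terminated or truncated:
        break

print(f"Episode lasted {frame} frames, total reward {total_reward:.1f}")
env.close()
```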
To install `flappy-bird-gymnasium`, simply run the following command:

```bash
$ pip install flappy-bird-gymnasium
```
Like with other Gymnasium environments, it's very easy to use `flappy-bird-gymnasium`. Simply import the package and create the environment with the `make` function. Take a look at the sample code below:
```python
import flappy_bird_gymnasium
import gymnasium

env = gymnasium.make("FlappyBird-v0", render_mode="human")

obs, _ = env.reset()
while True:
    # Next action:
    # (feed the observation to your agent here)
    action = env.action_space.sample()

    # Processing:
    obs, reward, terminated, _, info = env.step(action)

    # Checking if the player is still alive
    if terminated:
        break

env.close()
```
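Note that the example above uses a random agent (`env.action_space.sample()`). To use your own agent, replace that call with the action (0 or 1) your agent picks from `obs`.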
To play the game (human mode), run the following command:

```bash
$ flappy_bird_gymnasium
```

To see a random agent playing, add an argument to the command:

```bash
$ flappy_bird_gymnasium --mode random
```

To see a Deep Q Network agent playing, add an argument to the command:

```bash
$ flappy_bird_gymnasium --mode dqn
```