Offline MARL framework - OffPyMARL

🚧 This repo is not ready for release; benchmarking is ongoing. 🚧

OffPyMARL provides unofficial and benchmarked PyTorch implementations of selected offline MARL algorithms, built on the PyMARL codebase.

We also implement multi-task versions of BC, QMIX+CQL, and MATD3+BC to tackle the population-invariance issue.

Installation

conda create -n offpymarl python=3.10 -y
conda activate offpymarl
pip install -r requirements.txt
bash install_sc2.sh # if you have not installed StarCraft II on your machine
bash install_smac.sh
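
As an optional sanity check, you can verify that SMAC imports correctly (this assumes the smac package installed by install_smac.sh exposes the standard StarCraft2Env entry point):

python -c "from smac.env import StarCraft2Env; print('SMAC OK')"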

Collect Data

python src/main.py --collect --config=<alg> --env-config=sc2_collect --map_name=<map_name> --offline_data_quality=<quality> --save_replay_buffer=<whether_to_save_replay> --num_episodes_collected=<num_episodes_per_collection> --stop_winrate=<stop_winrate> --seed=<seed>

offline_data_quality takes one of 'random', 'medium', 'expert', or 'full'.

If save_replay_buffer is set, 'medium_replay' and 'expert_replay' offline data will also be generated, provided the offline_data_quality parameter is set accordingly.
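
For example, a concrete collection run might look as follows; the algorithm config qmix and the map 3m are illustrative assumptions, so consult the config files shipped with the repo for the exact names:

python src/main.py --collect --config=qmix --env-config=sc2_collect --map_name=3m --offline_data_quality=medium --save_replay_buffer=False --num_episodes_collected=2000 --stop_winrate=0.5 --seed=1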

Offline Dataset

We provide small-scale datasets (fewer than 4k episodes) on Google Drive for a quick start. After placing the full dataset in the dataset folder, you can run experiments using our predefined task sets.

Additionally, we now support the use of OG-MARL datasets. To integrate them with the (Off)PyMARL pipeline, we transform them into H5 files, as demonstrated in src/transform_data.ipynb (please refer to that notebook for details).
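
For orientation, the snippet below is a minimal sketch of writing episode data to an H5 file with h5py; all field names and shapes are illustrative assumptions, and the authoritative schema is the one produced by src/transform_data.ipynb:

import h5py
import numpy as np

# Illustrative episode fields (assumed names, not the repo's actual schema):
# obs has shape (T, n_agents, obs_dim), actions (T, n_agents),
# rewards and terminated (T,).
episode = {
    "obs": np.zeros((50, 3, 42), dtype=np.float32),
    "actions": np.zeros((50, 3), dtype=np.int64),
    "rewards": np.zeros((50,), dtype=np.float32),
    "terminated": np.zeros((50,), dtype=np.bool_),
}

with h5py.File("dataset/example.h5", "w") as f:
    grp = f.create_group("episode_0")      # one group per episode
    for key, arr in episode.items():
        grp.create_dataset(key, data=arr)  # one dataset per field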

Benchmarking on OG-MARL is currently in progress...

Offline Training

python src/main.py --offline --config=<alg_name> --env-config=sc2_offline --map_name=<sc2_map>  --offline_data_quality=<data_quality> --seed=<seed> --t_max=40000 --test_interval=250 --log_interval=250 --runner_log_interval=250 --learner_log_interval=250 --save_model_interval=100001 
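
For example, with placeholder algorithm and map names (qmix_cql and 3m are assumptions here; pick an algorithm config and SMAC map actually available in the repo):

python src/main.py --offline --config=qmix_cql --env-config=sc2_offline --map_name=3m --offline_data_quality=medium --seed=1 --t_max=40000 --test_interval=250 --log_interval=250 --runner_log_interval=250 --learner_log_interval=250 --save_model_interval=100001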

See test/offline_train for more information.

Citing OffPyMARL

If you use OffPyMARL in your work, please cite it with the following BibTeX entry:

@misc{OffPyMARL,
  author = {Ziqian Zhang},
  title = {OffPyMARL: Benchmarked Implementations of Offline Reinforcement Learning Algorithms},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/zzq-bot/offline-marl-framwork-offpymarl}},
}

Acknowledgements

We thank ODIS for providing the data-collection code and EPyMARL for providing the MADDPG-related code.