How do we port an existing multi-agent learning algorithm such as IDDPG or IPPO?
kailashg26 opened this issue · 4 comments
Hello,
Thank you for this very interesting work. I was trying to understand how to run the IDDPG algorithm with the MACAD environment. Do you know how to customize it? Could you also let me know which multi-agent algorithms you already support, particularly independent learners?
Thanks
Hi @kailashg26 ,
Thank you for your interest in this work!
You can use any suitable multi-agent RL algorithm, including IDDPG, by importing the MACAD-Gym environments like any other RL env. Please see the MACAD-Agents repository for multi-agent training code, which has full examples for multi-agent PPO and multi-agent IMPALA, including the independent learners you are looking for.
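Roughly, plugging an independent learner into MACAD-Gym looks like the sketch below. This is only a minimal illustration: it assumes the `HomoNcomIndePOIntrxMASS3CTWN3-v0` env ID from the README and the dict-per-actor reset/step interface (with a `"__all__"` key in the done dict), and it uses random action sampling where your IDDPG/IPPO policies would go.

```python
import gym
import macad_gym  # noqa: F401  -- importing registers the MACAD-Gym envs with Gym

# Env ID as listed in the MACAD-Gym README; any registered MACAD-Gym env works the same way.
env = gym.make("HomoNcomIndePOIntrxMASS3CTWN3-v0")

obs = env.reset()            # dict: {actor_id: observation}
done = {"__all__": False}
while not done["__all__"]:
    # Independent learners: each actor's own policy would pick its action here.
    # Random sampling is just a stand-in.
    actions = env.action_space.sample()          # dict: {actor_id: action}
    obs, reward, done, info = env.step(actions)  # all returns are per-actor dicts
env.close()
```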
Customization:
You can customize the existing multi-agent environments or create your own!
- To customize existing environments, edit the environment configuration to change the environment or actor parameters. For example, to customize the UrbanSignalIntersection3Car environment, you can edit/modify its configuration values (see the sketch at the end of this reply).
- To create your own environments, follow the Wiki page on Extending MACAD-Gym with new learning environments.

Consider creating a Pull Request to integrate your environment once it is ready, or if you need input as you develop it.
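As a rough sketch only (the key names and values below are placeholders, not the actual UrbanSignalIntersection3Car defaults; check the environment's source for the real configuration), a customized or brand-new environment essentially boils down to a configuration dict handed to MultiCarlaEnv:

```python
from macad_gym.carla.multi_env import MultiCarlaEnv

# Placeholder config: the exact keys and defaults differ per environment.
MY_CONFIG = {
    "env": {
        "server_map": "/Game/Carla/Maps/Town03",  # which CARLA map to load
        "render": False,                          # run the server headless
        "x_res": 168,                             # observation image width
        "y_res": 168,                             # observation image height
        "discrete_actions": True,
    },
    "actors": {
        "car1": {
            "type": "vehicle_4W",
            "early_terminate_on_collision": True,
            "reward_function": "corl2017",
        },
        # Add or edit actors here to change the number/behavior of agents.
    },
}


class MyIntersectionEnv(MultiCarlaEnv):
    """A custom MACAD-Gym environment: MultiCarlaEnv plus a config dict."""

    def __init__(self):
        super().__init__(MY_CONFIG)
```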
Thanks for the help!
I was able to set up the CARLA simulator. However, when I run the code, I get this error:
(macad-gym) kailash@SEAS15986:~/Desktop/macad-agents/src$ python -m macad_agents.rllib.ppo_multiagent_shared_weights
/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:526: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:527: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:528: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:529: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:530: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/tensorflow/python/framework/dtypes.py:535: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/gymnasium/core.py:27: UserWarning: WARN: Gymnasium minimally supports python 3.6 as the python foundation not longer supports the version, please update your version to 3.7+
"Gymnasium minimally supports python 3.6 as the python foundation not longer supports the version, please update your version to 3.7+"
Traceback (most recent call last):
File "/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"main", mod_spec)
File "/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/kailash/Desktop/macad-agents/src/macad_agents/rllib/ppo_multiagent_shared_weights.py", line 11, in
from macad_agents.rllib.models import register_mnih15_shared_weights_net
File "/home/kailash/Desktop/macad-agents/src/macad_agents/rllib/models.py", line 10, in
from ray.rllib.models.catalog import ModelCatalog
File "/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/ray/rllib/init.py", line 7, in
from ray.rllib.env.base_env import BaseEnv
File "/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/ray/rllib/env/init.py", line 1, in
from ray.rllib.env.base_env import BaseEnv
File "/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/ray/rllib/env/base_env.py", line 6, in
from ray.rllib.utils.annotations import Deprecated, DeveloperAPI, PublicAPI
File "/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/ray/rllib/utils/init.py", line 8, in
from ray.rllib.utils.filter import Filter
File "/home/kailash/miniconda3/envs/macad-gym/lib/python3.6/site-packages/ray/rllib/utils/filter.py", line 5, in
import tree # pip install dm_tree
ModuleNotFoundError: No module named 'tree'
I tried to install the tree package, but it always says that the build has failed. My Python version is 3.6.7. Please let me know if I'm missing anything.
Hi @kailashg26 ,
Good to see you have set up CARLA Sim with macad-agents. From the error log you shared, it looks like your conda Python environment does not have the prerequisites set up properly.
I tried to install the tree package, but it always says that the build has failed.
Did you install the dm-tree package or the tree package? Ray/RLlib requires the former (dm-tree). It is listed as one of Ray's dependencies via setup.py, but given your error log, you likely don't have it installed in the conda Python environment you are using.
Please try the following:
- Just to be sure, in case you installed it by mistake, uninstall tree: pip uninstall tree
- Install dm-tree: pip install dm-tree
Then try again. Hopefully that is the only package that was missing from your Python environment.
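As an optional sanity check after installing (dm-tree installs under the module name `tree`, and `tree.flatten` is part of its public API):

```python
# Run inside the macad-gym conda environment to confirm dm-tree is importable.
import tree  # provided by the dm-tree package, *not* the "tree" package

print(tree.flatten({"a": [1, 2], "b": 3}))  # expected output: [1, 2, 3]
```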
If you run into other ModuleNotFoundErrors, please consider setting up the conda environment using this spec, or better, use a Docker container based on these instructions.
Hi @kailashg26, I hope the above resolved your issue. Closing this. Please feel free to re-open or start a new discussion if you have follow-ups.