
XRoute Environment

XRoute Environment, where X stands for self-learning and Route for detailed routing, is a novel reinforcement learning environment for training agents to order and route all nets in various challenging testcases efficiently and effectively, and for presenting the routing results in a variety of dashboards.

Quickstart

Installation

To interact with the XRoute environment, you first need to download the simulator:

Operating System: Ubuntu 22.04
Download Link: Download

Then, put the simulator in the third_party/openroad folder.

You may also need to run the following commands to install the libraries OpenROAD needs to start up properly.

cd third_party/openroad
chmod +x DependencyInstaller.sh
source ./DependencyInstaller.sh
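
If you want to confirm the simulator is in place before launching anything, a quick check like the one below can help. The binary name openroad is an assumption here; adjust the path to match the file you actually downloaded.

import os

# Assumed location and name of the downloaded simulator binary.
SIMULATOR = os.path.join("third_party", "openroad", "openroad")

if not os.path.isfile(SIMULATOR):
    raise SystemExit(f"Simulator not found at {SIMULATOR}; download it and place it there first.")
if not os.access(SIMULATOR, os.X_OK):
    raise SystemExit(f"{SIMULATOR} is not executable; run chmod +x on it.")
print("Simulator binary found and executable.")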

Agent Introduction

Two baseline agents are provided under baseline/:

  • DQN
  • PPO
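
Both baselines talk to the simulator through the ports described under Launch Mode below. The sketch that follows shows the kind of minimal interface such an agent exposes; the class and method names are illustrative only, not the actual baseline API (see baseline/DQN and baseline/PPO for the real implementations).

import random

class RandomOrderingAgent:
    """Toy agent: picks the next net to route uniformly at random.

    Illustrative only; the DQN and PPO baselines learn this ordering instead.
    """

    def __init__(self, num_nets):
        self.remaining = list(range(num_nets))

    def select_action(self, observation):
        # The real baselines derive the choice from the observation;
        # this toy agent ignores it and samples a remaining net index.
        net = random.choice(self.remaining)
        self.remaining.remove(net)
        return net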

Launch Mode

You can launch the simulator in one of the following modes:

Training Mode

In this mode, the simulator is launched first; the agent then connects to it and controls it to train the model.

# 1. Launch the simulator
cd examples && python3 launch_training.py

# 2. Launch an agent (DQN shown; use the commented line for PPO)
cd baseline/DQN && python3 train_DQN.py cpu
# cd baseline/PPO && python3 train_PPO.py cpu

After executing the commands above, the simulator listens on port 6667 for the environment reset command and then interacts with the agent via port 5556.
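
For orientation, here is a hypothetical sketch of that hand-off using plain TCP sockets. The message payloads and the termination check are placeholders, not the simulator's real wire protocol; consult baseline/DQN/train_DQN.py for the actual encoding.

import socket

RESET_PORT = 6667  # the simulator waits here for the environment-reset command
STEP_PORT = 5556   # observations and actions are exchanged here

def send_reset(host="localhost"):
    # Placeholder reset command; the real payload is defined by the simulator.
    with socket.create_connection((host, RESET_PORT)) as sock:
        sock.sendall(b"RESET\n")
        return sock.recv(4096)

def run_episode(agent, host="localhost"):
    send_reset(host)
    with socket.create_connection((host, STEP_PORT)) as sock:
        while True:
            observation = sock.recv(4096)       # placeholder observation blob
            if not observation:                 # connection closed: episode is over
                break
            action = agent.select_action(observation)
            sock.sendall(str(action).encode())  # placeholder action encoding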

Evaluation Mode

In this mode, the agent is launched first; the simulator then connects to the agent to request actions.

# 1. Launch an agent (DQN shown; use the commented line for PPO)
cd baseline/DQN && python3 test_DQN.py cpu 5556
# cd baseline/PPO && python3 test_PPO.py cpu 5556

# 2. Launch the simulator, passing the agent's port
cd examples && python3 launch_evaluation.py 5556
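
In other words, during evaluation the agent acts as a small action server on the port given on the command line (5556 above). Below is a hypothetical sketch of that role, again with placeholder payloads rather than the real message format used by test_DQN.py and test_PPO.py.

import socket

def serve_actions(agent, port=5556):
    # Listen for the simulator, then answer each query with an action.
    with socket.create_server(("0.0.0.0", port)) as server:
        conn, _ = server.accept()
        with conn:
            while True:
                query = conn.recv(4096)             # placeholder observation/query blob
                if not query:                       # simulator closed the connection
                    break
                action = agent.select_action(query)
                conn.sendall(str(action).encode())  # placeholder action encoding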

TODO List

  • Auto-download ISPD testcases
  • Support distributed routing on one server

Acknowledgement

The routing simulator in the XRoute environment is mainly based on OpenROAD's TritonRoute. Thanks for their wonderful work!