- Multiple network types (FCNN, HELM, MPDRNN)
- Hyperparameter tuning for optimal performance
- Dataset conversion and integration
- Detailed configuration options via JSON files
- Python 3.10 or higher
- Optional: CUDA-enabled GPU (for PyTorch with CUDA support); see the quick environment check below
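If you want to verify these prerequisites before proceeding, a quick check like the following can be run. It uses only the standard library plus torch, which is installed later from the package list below:

```python
import sys

# Verify the interpreter meets the stated minimum version.
assert sys.version_info >= (3, 10), f"Python 3.10+ required, found {sys.version}"

try:
    import torch  # installed later via requirements.txt
    # Reports whether a CUDA-capable GPU and a CUDA-enabled PyTorch build are both present.
    print("CUDA available:", torch.cuda.is_available())
except ImportError:
    print("PyTorch is not installed yet; see the package list below.")
```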
The datasets used in our articles can be found and downloaded from the following sources:
- UCI Machine Learning Repository: UCI Repository
- Google Drive: Download here
If you download datasets from the UCI Machine Learning Repository, you must convert them to the appropriate format using the provided convert_datasets.py script. The datasets on the Google Drive link are already converted.
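For orientation only: the exact output format is defined by convert_datasets.py itself, but assuming the target layout is a per-dataset folder holding a .npz archive (as in the directory example later in this guide), a conversion from a raw UCI file might look roughly like this sketch, where the file name, column layout, and .npz keys are placeholders:

```python
import numpy as np
import pandas as pd

# Hypothetical illustration only -- the real conversion is done by the
# provided convert_datasets.py. File name, column layout, and the .npz
# keys below are placeholders.
raw = pd.read_csv("connect-4.data", header=None)   # raw UCI file (placeholder name)
features = raw.iloc[:, :-1].to_numpy()             # every column except the last
labels = raw.iloc[:, -1].to_numpy()                # last column as the class label

np.savez("connect4.npz", data=features, labels=labels)
```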
Make sure you have the following packages installed:
colorama~=0.4.6
colorlog~=6.8.2
jsonschema~=4.23.0
matplotlib~=3.8.1
numpy~=1.26.4
openpyxl~=3.1.2
pandas~=2.1.0
ray~=2.32.0
seaborn~=0.13.0
scipy~=1.12.0
scikit-learn~=1.4.0
torch~=2.2.1+cu121
torchvision~=0.17.1+cu121
tqdm~=4.66.2
torchinfo~=1.8.0
You can install the listed packages with the following command:
pip install -r requirements.txt
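Note that the CUDA builds of torch and torchvision (the +cu121 variants listed above) are not hosted on PyPI. If pip cannot resolve them from requirements.txt, installing them from the official PyTorch wheel index is a common workaround (adjust the versions to match your setup):

pip install torch==2.2.1 torchvision==0.17.1 --index-url https://download.pytorch.org/whl/cu121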
Begin by cloning or downloading this repository to your local machine.
Open the data_paths.py file. You will find the following dictionary:
root_mapping = {
    'ricsi': {
        "STORAGE_ROOT": "D:/storage/Journal2",
        "DATASET_ROOT": "D:/storage/Journal2/datasets",
        "PROJECT_ROOT": "C:/Users/ricsi/Documents/research/Multi_Phase_Deep_Random_Neural_Network",
    }
}
Replace the username (in this case 'ricsi') with your own. You can find your username by running the whoami command in your terminal.
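If you prefer to check this from Python (for example on Windows, where whoami prints DOMAIN\username), the standard library returns the bare login name directly:

```python
import getpass

# Prints the bare login name (e.g. "ricsi"), which matches the key format
# used in the root_mapping dictionary.
print(getpass.getuser())
```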
- STORAGE_ROOT: adjust this path to the location where you want to save project outputs and other data generated while running the Python files.
- DATASET_ROOT: modify this path to point to the directory where your datasets are stored. This folder should contain all datasets necessary for the project and should be laid out like this:
  - D:\storage\Journal2\datasets
    - connect4
      - connect4.npz
      - data.txt
- PROJECT_ROOT: update this path to the directory where the Python and JSON files of the project are located.
Run the data_paths.py script. This will create all the required folders based on the paths specified in the configuration.
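As a rough, hypothetical illustration of what "create all the required folders" means (the actual data_paths.py may do more than this), the core logic amounts to something like:

```python
import os

# Hypothetical simplification: iterate over the configured roots and make
# sure each directory exists. Paths are the example values from above.
roots = {
    "STORAGE_ROOT": "D:/storage/Journal2",
    "DATASET_ROOT": "D:/storage/Journal2/datasets",
}

for name, path in roots.items():
    os.makedirs(path, exist_ok=True)   # create the directory tree if missing
    print(f"{name} is ready at {path}")
```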
Obtain the necessary datasets and place them into the DATASET_ROOT directory as specified in your updated configuration.
Before running the Python scripts, configure your settings by preparing the following JSON configuration files (a minimal example of loading one is shown after the list):
- Configuration for the FCNN (FCNN_config.json)
- Configuration for the HELM (HELM_config.json)
- Configuration for the IPMPDRNN (IPMPDRNN_config.json)
- Configuration for the MPDRNN (MPDRNN_config.json)
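The exact keys in these files are project-specific; since jsonschema is among the required packages, the configurations are presumably validated against JSON schemas. A minimal, generic example of loading and inspecting one of them (the file name is taken from the list above, the keys are whatever the project defines):

```python
import json

# Hypothetical example of inspecting one of the configuration files;
# the actual keys are defined by the project's JSON schemas.
with open("MPDRNN_config.json", "r", encoding="utf-8") as f:
    config = json.load(f)

print(json.dumps(config, indent=2))  # review the loaded settings
```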
Once your configuration files are set up, run the Python scripts to train and test.
- Optional: it is advisable to run hyperparameter tuning, although the configuration files already contain the best settings.
- There is a separate hyperparameter tuning script for each of the available networks.
- After tuning, you may execute the training and evaluation for the desired network.
- To train and evaluate the FCNN network, run execute_test.py in the fcnn folder.
- The FCNN can also be trained (train_fcnn.py) and evaluated (eval_fcnn.py) individually.
- To train and evaluate the HELM network, run helm.py in the helm folder.
- To train and evaluate the MPDRNN network, run mpdrnn.py in the mpdrnn folder.
For detailed insights, check out our research paper.