This repository contains a PyTorch implementation of H-Mem from the paper "H-Mem: Harnessing synaptic plasticity with Hebbian Memory Networks", trained on the bAbI question-answering tasks.

The implementation follows the original code and the details given in the paper. All hyperparameters can be found in the `config.yaml` file.
Run the following command:

```shell
python run_babi.py --data_dir=tasks_1-20_v1-2/en-10k --task_id=1
```
You have to set the `data_dir` and `task_id` parameters to specify the dataset directory (1k or 10k) and the task ID, respectively.
You can also run the "Memory-dependent memorization" variant by setting the `read_before_write` parameter in the `config.yaml` file.
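As an illustration, enabling that variant might look like the sketch below; only `read_before_write` is named in this README, so the comment and any surrounding keys are assumptions about the config layout:

```yaml
# config.yaml (sketch; only read_before_write is documented here)
read_before_write: true   # enable the "Memory-dependent memorization" variant
```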
- H-Mem: Harnessing synaptic plasticity with Hebbian Memory Networks
- Parts of the code are adapted from https://github.com/thaihungle/SAM
- Parts of the code are adapted from https://github.com/anantzoid/Recurrent-Entity-Networks-pytorch