This is our TensorFlow implementation for the paper:
Dual-interest Factorization-heads Attention for Sequential Recommendation
The code is tested on a Linux desktop with TensorFlow 1.12.3 and Python 3.6.8.
The data preprocessing script is reco_utils/dataset/sequential_reviews.py,
which can be executed via:
python examples/00_quick_start/sequential.py --is_preprocessing True
To train our model on the Amazon dataset (with default hyper-parameters):
python examples/00_quick_start/sequential.py
The implementation of self-attention is modified from Microsoft's TensorFlow framework.
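For orientation, the standard multi-head self-attention that such implementations build on can be sketched in plain NumPy. This is an illustrative sketch only, not code from the repository or from the paper's dual-interest factorization-heads variant; all function and variable names here are hypothetical:

```python
import numpy as np

def multi_head_self_attention(x, w_q, w_k, w_v, num_heads):
    # Illustrative sketch (not repository code).
    # x: (seq_len, d_model); w_q, w_k, w_v: (d_model, d_model) projections.
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project, then split the model dimension into heads: (heads, seq, d_head).
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention scores per head: (heads, seq, seq).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    # Row-wise softmax (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values, then merge heads back: (seq_len, d_model).
    out = weights @ v
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)
```

In a sequential-recommendation setting, x would be the embedded item sequence of one user, and the output keeps the same (seq_len, d_model) shape.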