FLIP: Fine-grained Alignment between ID-based Models and Pretrained Language Models for CTR Prediction
This is the PyTorch implementation of FLIP, proposed in the paper FLIP: Fine-grained Alignment between ID-based Models and Pretrained Language Models for CTR Prediction (RecSys'24).
Install the dependencies first:
pip install -r requirements.txt
To pretrain FLIP, please run
python run_script.py
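For background, the pretraining stage aligns the ID-based model with the PLM; the paper frames this as joint masked language/tabular modeling, where masked tokens and features are reconstructed with help from the other modality, together with instance-level alignment. The sketch below shows only a minimal instance-level contrastive loss between the two encoders' outputs; all class names, dimensions, and the projection heads are illustrative assumptions, not the code in run_script.py.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContrastiveAlignment(nn.Module):
    # Hypothetical sketch: align ID-based and PLM embeddings with a
    # symmetric InfoNCE loss over in-batch pairs (not the repo's exact code).
    def __init__(self, id_dim=64, text_dim=768, proj_dim=128, temperature=0.07):
        super().__init__()
        self.id_proj = nn.Linear(id_dim, proj_dim)      # assumed projection head
        self.text_proj = nn.Linear(text_dim, proj_dim)  # assumed projection head
        self.temperature = temperature

    def forward(self, id_emb, text_emb):
        # Map both modalities into a shared space and L2-normalize.
        z_id = F.normalize(self.id_proj(id_emb), dim=-1)
        z_txt = F.normalize(self.text_proj(text_emb), dim=-1)
        # Pairwise similarities; the i-th ID sample matches the i-th text sample.
        logits = z_id @ z_txt.t() / self.temperature
        labels = torch.arange(logits.size(0), device=logits.device)
        return (F.cross_entropy(logits, labels)
                + F.cross_entropy(logits.t(), labels)) / 2
```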
To finetune the ID-based model, please run
python finetune_ctr.py
To finetune the PLM-based model, please run
python finetune_nlp.py
To finetune both the ID-based model and the PLM-based model jointly (FLIP), please run
python finetune_all.py
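In the joint setting, both backbones are trained together on the CTR objective. One plausible fusion, shown as a hedged sketch below, is a learnable weighted sum of the two models' logits trained with binary cross-entropy; the sub-models, the fusion weight, and the training step are assumptions for illustration and may differ from finetune_all.py.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointCTRModel(nn.Module):
    # Hypothetical fusion of an ID-based CTR model and a PLM-based classifier.
    def __init__(self, id_model, plm_model):
        super().__init__()
        self.id_model = id_model    # assumed: tabular ids -> click logit
        self.plm_model = plm_model  # assumed: tokenized text -> click logit
        self.alpha = nn.Parameter(torch.tensor(0.0))  # learnable mixing weight

    def forward(self, id_inputs, text_inputs):
        w = torch.sigmoid(self.alpha)  # constrain the mixing weight to (0, 1)
        return w * self.id_model(id_inputs) + (1 - w) * self.plm_model(text_inputs)

# Assumed training step: binary cross-entropy on the click label.
# logit = model(id_inputs, text_inputs)
# loss = F.binary_cross_entropy_with_logits(logit, labels.float())
```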
We also provide scripts for the baselines.
To run the ID-based model baseline:
python ctr_base.py
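For reference, a minimal ID-based CTR model is an embedding table over feature ids followed by an MLP, as in the sketch below; the field counts, layer sizes, and architecture are illustrative assumptions, and ctr_base.py may implement stronger standard models.

```python
import torch
import torch.nn as nn

class EmbedMLP(nn.Module):
    # Hypothetical minimal ID-based CTR baseline (not the repo's exact model).
    def __init__(self, num_ids, num_fields, emb_dim=16):
        super().__init__()
        self.embedding = nn.Embedding(num_ids, emb_dim)
        self.mlp = nn.Sequential(
            nn.Linear(num_fields * emb_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, ids):                 # ids: (batch, num_fields) feature ids
        x = self.embedding(ids).flatten(1)  # concatenate all field embeddings
        return self.mlp(x).squeeze(-1)      # click logit per example
```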
To run the PLM-based model baselines:
python ctr_bert.py
python ptab.py
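For reference, PLM-based baselines textualize each sample's tabular fields into a sentence and score it with a language model. The sketch below assumes a BERT backbone loaded through Hugging Face transformers and a hypothetical prompt template; the actual templates and models live in ctr_bert.py and ptab.py.

```python
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertCTR(nn.Module):
    # Hypothetical sketch of a BERT-based CTR classifier.
    def __init__(self, model_name="bert-base-uncased"):  # assumed backbone
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        self.head = nn.Linear(self.bert.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] token representation
        return self.head(cls).squeeze(-1)  # click logit

# Assumed usage with a hypothetical textualization of one sample:
# tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# batch = tokenizer("user id is 123, item genre is comedy", return_tensors="pt")
# logit = BertCTR()(batch["input_ids"], batch["attention_mask"])
```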