xiaxin1998/DHCN

Data_preprocess

whatever0228 opened this issue · 6 comments

Excuse me, I am currently planning to add extra information from the dataset (such as timestamps) to the session-based recommendation task to improve recommendation accuracy, and I plan to cite and compare against your DHCN model. Following what you describe in the paper, I downloaded the raw Tmall data from the IJCAI-15 competition, and it contains the information I want to use. Could you tell me how you preprocess the raw Tmall data? If you could send me the preprocessing code, I would be very grateful. Thank you.

Thanks for your interest.
You can refer to the preprocessed file here: https://github.com/CCIIPLab/GCE-GNN/blob/master/datasets/process_tmall.py.
We use the same preprocessing method as GCE-GNN.
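
For reference, a minimal sketch of loading the files that script produces (this assumes, as in our repo, that train.txt and test.txt are pickled (sessions, labels) tuples; adjust the paths to your layout):

```python
import pickle

# Load the preprocessed Tmall splits produced by process_tmall.py.
# Assumption: each file is a pickled (sessions, labels) tuple, where
# `sessions` is a list of item-ID lists and `labels` holds the next
# item for each session.
with open('datasets/tmall/train.txt', 'rb') as f:
    train_sessions, train_labels = pickle.load(f)
with open('datasets/tmall/test.txt', 'rb') as f:
    test_sessions, test_labels = pickle.load(f)

print(len(train_sessions), 'training sessions,', len(test_sessions), 'test sessions')
```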

Thanks for your quick reply. I have also studied the GCE-GNN model. The preprocessing script they provide on GitHub targets the raw Diginetica dataset (they provide the same code for all three datasets, Diginetica, Tmall, and Nowplaying, but it can only process the raw Diginetica data). That is why I wanted to ask how you processed the Tmall dataset (in case you did not directly use GCE-GNN's preprocessed data). Thanks a lot.

My raw Tmall data is a csv file called 'dataset15.csv', and I use the provided Tmall preprocessing file to generate the train and test datasets. The Tmall preprocessing file is different from those of the other two datasets in GCE-GNN. Could you double-check the provided process_tmall.py file? It is the right file for preprocessing Tmall.

Oh, my raw dataset was downloaded from the IJCAI-15 competition data address given in the GCE-GNN paper: https://tianchi.aliyun.com/dataset/dataDetail?dataId=42
The data I downloaded includes csv files such as 'user_log_format1.csv'. Could you tell me where to download the 'dataset15.csv' file and the other related data, or send it to me if that is convenient for you? Thank you very much!
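
In case it helps to show what I am trying to do, here is a rough sketch of how I imagined sessions could be built from 'user_log_format1.csv'. The column names (user_id, item_id, time_stamp, action_type) follow the Tianchi data description, but the sessionization rule is only my guess, not your actual preprocessing:

```python
import pandas as pd

# Rough sketch only: build per-user, per-day "sessions" from the IJCAI-15 log.
# Column names follow the Tianchi description of user_log_format1.csv;
# the grouping rule below is a guess, not the DHCN/GCE-GNN preprocessing.
log = pd.read_csv('user_log_format1.csv',
                  usecols=['user_id', 'item_id', 'time_stamp', 'action_type'])

clicks = log[log['action_type'] == 0]          # 0 = click per the Tianchi description
clicks = clicks.sort_values(['user_id', 'time_stamp'])

# Treat each (user, day) pair as one session; time_stamp is in 'mmdd' form.
sessions = clicks.groupby(['user_id', 'time_stamp'])['item_id'].apply(list)
sessions = [s for s in sessions if len(s) > 1]  # drop single-click sessions

print(len(sessions), 'candidate sessions')
```

Is this roughly the kind of session construction that produces 'dataset15.csv'?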

Thanks a lot!