Issues
What is the logic behind vec1*vec2 in the attention?
#54 opened by lijian10086 - 0
Can not reproduce the DIN result in README.md
#92 opened by liyangliu - 1
My implementation of mini-batch aware regularization, feedback welcome
#82 opened by bound2020 - 2
Question about the results of reproducing DIN
#57 opened by hit-joseph - 3
self.j in the DIN model
#78 opened by xiaopp123 - 1
A few questions about the code and the model diagram in the paper
#90 opened by LXXiaogege - 1
Why does the attention normalize weights with softmax? This is inconsistent with the original paper
#91 opened by wenmin-wu - 3
AUC is unexpectedly high (0.877) on simulated random data
#66 opened by LHFCOD - 0
Training is too slow; is there a multi-GPU version?
#89 opened by lljjgg - 0
Implementation of the wide part in wide_deep
#88 opened by mikudehuane - 4
Why is the user vector not concatenated in the DIN model?
#64 opened by zhiweishen - 0
Padding hist_i with 0 in input.py seems problematic
#85 opened by miaotianmiaodi - 0
Question about the BN implementation in the code
#83 opened - 0
About a runnable Electronics dataset
#84 opened by BelindaYanyiwuhua - 0
Question about the input
#81 opened by huozijie1990 - 1
why use sigmoid in Attention
#80 opened by huaweiboy - 0
Question about attention_multi_items?
#79 opened by Peterisfar - 3
Questions about how the DIN model handles dense features
#47 opened by eckolemon - 4
Applying the DIN model in real-world business
#44 opened by andrew-zzz - 1
Is the model trained for a single ad? (Is the model training for an AD?)
#58 opened by manji460 - 1
How to preprocess the MovieLens dataset?
#75 opened by Qian-Hao - 2
Serious bug in the code: label leakage
#76 opened by starspringcloud - 2
Would it be more appropriate not to pad the variable-length hist?
#77 opened by DSXiangLi - 1
Does DIN's input only include the item, the category, and the historical behavior items?
#48 opened by waterbeach - 2
Low AUC after fixed BN bugs
#63 opened by liuhu-bigeye - 3
Different AUC implementation with the paper
#71 opened by liuhu-bigeye - 1
Questions about Wide and deep model
#45 opened by czisok - 1
About the attention implementation
#50 opened by ZhengYukai666 - 1
The code is very poorly written, P8 big shot
#67 opened by weiyu322 - 1
Question about the MovieLens dataset
#68 opened by yuanzhiyu - 0
Why Scale
#72 opened by LittleMisss - 2
Does running this require >=10G of GPU memory?
#46 opened by Slayerxxx - 0
The saved model is very large; what could be the problem?
#59 opened by cuiweizuishuai - 2
Question about hist in the DataInput class
#56 opened by Slayerxxx - 2
When using other models (e.g. LR) as baselines, do they also need the historical ID features as input?
#43 opened by tinkle1129 - 0
A question about the DeepFM code
#53 opened by zhangbei121 - 0
About sequence construction in practical applications
#49 opened by caojiajia - 0