I don't see the 'mask_feat_list' argument used anywhere in the embedding_lookup function.
Jeriousman opened this issue · 1 comment
Jeriousman commented
Describe the question
I think I get an error because of this. Here is the full traceback:
dnn_input_emb_list = embedding_lookup(X, model.embedding_dict, model.feature_index, model.sparse_feature_columns,
                                      mask_feat_list=model.item_features, to_list=True)
Traceback (most recent call last):
File "<ipython-input-201-f7445a26a1d5>", line 2, in <module>
mask_feat_list= model.item_features, to_list=True)
File "/home/hojun/anaconda3/envs/ai/lib/python3.6/site-packages/deepctr_torch/inputs.py", line 206, in embedding_lookup
emb = sparse_embedding_dict[embedding_name](input_tensor)
File "/home/hojun/anaconda3/envs/ai/lib/python3.6/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
result = self.forward(*input, **kwargs)
File "/home/hojun/anaconda3/envs/ai/lib/python3.6/site-packages/torch/nn/modules/sparse.py", line 126, in forward
self.norm_type, self.scale_grad_by_freq, self.sparse)
File "/home/hojun/anaconda3/envs/ai/lib/python3.6/site-packages/torch/nn/functional.py", line 1852, in embedding
return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
IndexError: index out of range in self
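For context, torch.nn.Embedding raises this exact error whenever an input id is greater than or equal to its num_embeddings, independent of deepctr-torch. A minimal sketch that reproduces it:

import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=10, embedding_dim=4)  # valid ids are 0..9
emb(torch.tensor([[12]]))  # raises IndexError: index out of range in self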
As you can see, the function accepts mask_feat_list in its signature but never uses it anywhere in the body. Is it right that I get this error because of that?
from collections import defaultdict
from itertools import chain

import numpy as np


def embedding_lookup(X, sparse_embedding_dict, sparse_input_dict, sparse_feature_columns, return_feat_list=(),
                     mask_feat_list=(), to_list=False):
    """
    Args:
        X: input Tensor [batch_size x hidden_dim]
        sparse_embedding_dict: nn.ModuleDict, {embedding_name: nn.Embedding}
        sparse_input_dict: OrderedDict, {feature_name: (start, start + dimension)}
        sparse_feature_columns: list, sparse features
        return_feat_list: list, names of features to be returned; default () -> return all features
        mask_feat_list: list, names of features to be masked in the hash transform
    Return:
        group_embedding_dict: defaultdict(list)
    """
    group_embedding_dict = defaultdict(list)
    for fc in sparse_feature_columns:
        feature_name = fc.name
        embedding_name = fc.embedding_name
        if len(return_feat_list) == 0 or feature_name in return_feat_list:
            # TODO: add hash function
            # if fc.use_hash:
            #     raise NotImplementedError("hash function is not implemented in this version!")
            lookup_idx = np.array(sparse_input_dict[feature_name])
            input_tensor = X[:, lookup_idx[0]:lookup_idx[1]].long()
            emb = sparse_embedding_dict[embedding_name](input_tensor)
            group_embedding_dict[fc.group_name].append(emb)
    if to_list:
        return list(chain.from_iterable(group_embedding_dict.values()))
    return group_embedding_dict
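One way to narrow this down is to check, per sparse feature, whether the raw ids in X exceed the size of the corresponding embedding table. A hedged diagnostic sketch reusing the model attribute names from the call above (model.feature_index maps each feature name to its (start, end) column slice in X, per the docstring):

# Assumes the same `model` and tensor `X` as in the embedding_lookup call above.
for fc in model.sparse_feature_columns:
    start, end = model.feature_index[fc.name]
    ids = X[:, start:end].long()
    table = model.embedding_dict[fc.embedding_name]
    if ids.min().item() < 0 or ids.max().item() >= table.num_embeddings:
        print('%s: ids span [%d, %d] but its table has only %d rows'
              % (fc.name, ids.min().item(), ids.max().item(), table.num_embeddings))

If any feature is reported here, the likely fix is to build that feature's embedding with a vocabulary size of at least max_id + 1, rather than anything involving mask_feat_list.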
zanshuxun commented
Please provide more information:
To Reproduce
Steps to reproduce the behavior:
- Go to '...'
- Click on '....'
- Scroll down to '....'
- See error
Operating environment:
- python version [e.g. 3.5, 3.6]
- torch version [e.g. 1.6.0, 1.7.0]
- deepctr-torch version [e.g. 0.2.7]