lucidrains/tab-transformer-pytorch

No Category Shared Embedding?

LLYX opened this issue · 3 comments

LLYX commented

I noticed that this implementation does not seem to include the shared embedding that the paper describes for all values belonging to the same category (c_phi_i), unless I missed it. If it's indeed missing, do you have plans to add it?

Thanks for this implementation!

ohh do you mean one category having multiple values as input?

LLYX commented

In the paper, they mention that every value in a given column shares a per-column embedding (c_phi_i) alongside its own value embedding, so it works basically like an absolute positional embedding, I think?
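
For illustration, here is a minimal sketch of what such a column-shared embedding could look like in PyTorch. This is not the repo's actual implementation; the class name, dimension arguments, and the choice to concatenate the shared vector with the value embedding are assumptions made for this example.

```python
import torch
import torch.nn as nn

class CategoricalEmbedderWithSharedColumnEmbedding(nn.Module):
    """Hypothetical sketch of a column-shared embedding (c_phi_i): one learned
    vector per column, concatenated with the per-value embedding of that column."""
    def __init__(self, categories, dim_value, dim_shared):
        super().__init__()
        self.num_columns = len(categories)
        # one embedding table covering all values of all columns (offset per column)
        self.value_embed = nn.Embedding(sum(categories), dim_value)
        # c_phi_i: one learned vector per column, shared by every value in that column
        self.shared_column_embed = nn.Parameter(torch.randn(self.num_columns, dim_shared))
        # offsets so each column indexes its own slice of the value table
        offsets = torch.tensor([0] + list(categories[:-1])).cumsum(dim=0)
        self.register_buffer('offsets', offsets)

    def forward(self, x_categ):
        # x_categ: (batch, num_columns) integer class indices, one per column
        value_emb = self.value_embed(x_categ + self.offsets)  # (b, n, dim_value)
        shared = self.shared_column_embed.unsqueeze(0).expand(x_categ.shape[0], -1, -1)  # (b, n, dim_shared)
        return torch.cat((value_emb, shared), dim=-1)  # (b, n, dim_value + dim_shared)

# usage
embedder = CategoricalEmbedderWithSharedColumnEmbedding(categories=(10, 5, 6), dim_value=24, dim_shared=8)
tokens = embedder(torch.tensor([[3, 1, 5], [0, 4, 2]]))
print(tokens.shape)  # torch.Size([2, 3, 32])
```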