lucidrains/point-transformer-pytorch

How should I understand the paper's claim that self-attention is invariant to cardinality?

swzaaaaaaa opened this issue · 0 comments

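One common reading (a minimal NumPy sketch, not code from this repository — `attention_pool` is a hypothetical helper): because the softmax weights sum to 1, attention aggregation is a weighted *average* over the set, so the output does not scale with the number of elements. Duplicating every point (doubling the set's cardinality) leaves the result unchanged:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(q, k, v):
    """Aggregate a set of n (key, value) pairs with scaled dot-product attention."""
    scores = k @ q / np.sqrt(len(q))  # (n,)
    w = softmax(scores)               # weights sum to 1 -> a weighted average
    return w @ v                      # (d,)

rng = np.random.default_rng(0)
q = rng.normal(size=4)
k = rng.normal(size=(8, 4))
v = rng.normal(size=(8, 4))

out = attention_pool(q, k, v)

# Double the cardinality by duplicating every element of the set:
out_dup = attention_pool(q, np.concatenate([k, k]), np.concatenate([v, v]))

# Each duplicated score gets half the softmax weight, so the weighted
# average -- and hence the output -- is unchanged.
assert np.allclose(out, out_dup)
```

The same normalization also makes the aggregation invariant to the ordering of the set, which is why attention is a natural fit for point clouds.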