ma-xu/pointMLP-pytorch

Questions about paper

liller opened this issue · 3 comments

liller commented

Thanks a lot for your contribution to this field, @ma-xu.
Here are some questions; many thanks in advance for your reply.

  1. The paper states:
    "Since PointMLP only leverages MLPs, it is naturally invariant to permutation, which perfectly fits the characteristic of point clouds."
    Why are MLPs said to be naturally invariant to permutation? In the PointNet network, the authors also apply an MLP structure, but they additionally use a T-Net module to achieve permutation invariance.
  2. When the `__getitem__` method is called to load data, does shuffling the points affect the form of the 3D features and the neighbour features of each centre point?
  3. During training, the test accuracy (both instance accuracy and class accuracy) is higher than the training accuracy in the early stage. Is this normal, and could you explain the reason for it?
ma-xu commented

Hi @liller Thanks for your kind words.

  1. MLP-based methods are invariant to the permutation of points: the shared MLP is applied to each point independently, so changing the order of the inputs gives the same result once the per-point features are aggregated by a symmetric function such as max pooling. This naturally fits point clouds, which are unordered data (see the sketch after this list).
  2. No. All methods compute neighbours from pairwise distances, not from the input order, because point clouds are unordered (also illustrated in the sketch below).
  3. It is a common situation; you can even observe it on ImageNet. Here are some explanations from others (in Chinese, which you can translate into English): https://www.zhihu.com/question/270731692
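A minimal sketch of points 1 and 2 above (this is not the PointMLP code; the random point cloud, the small MLP, and the `knn_indices` helper are illustrative assumptions): a shared MLP applied per point followed by a symmetric pooling gives the same global feature after the points are shuffled, and the kNN neighbour set of a given point, computed from pairwise distances, is also unchanged.

```python
import torch

torch.manual_seed(0)

points = torch.randn(1, 1024, 3)                     # (batch, num_points, xyz)
perm = torch.randperm(points.shape[1])
shuffled = points[:, perm, :]                        # same cloud, different point order

# 1) Shared per-point MLP + max pooling (a symmetric function) -> permutation invariant
mlp = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.ReLU(), torch.nn.Linear(64, 128)
)
global_feat = mlp(points).max(dim=1).values           # (1, 128)
global_feat_shuffled = mlp(shuffled).max(dim=1).values
print(torch.allclose(global_feat, global_feat_shuffled))   # True

# 2) kNN from pairwise distances: the neighbour *set* does not depend on input order
def knn_indices(xyz, k):
    dist = torch.cdist(xyz, xyz)                      # (B, N, N) pairwise distances
    return dist.topk(k, largest=False).indices        # (B, N, k) nearest-neighbour indices

k = 16
idx_orig = knn_indices(points, k)[0]                  # neighbours in the original order
idx_shuf = knn_indices(shuffled, k)[0]                # neighbours after shuffling

# Compare the neighbours of the same physical point (index 0 before shuffling).
center_new_pos = (perm == 0).nonzero().item()         # where point 0 ended up
set_orig = set(idx_orig[0].tolist())
set_shuf = {perm[j].item() for j in idx_shuf[center_new_pos].tolist()}  # map back to original ids
print(set_orig == set_shuf)                           # True
```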
liller commented

Got it and thanks for your explanation.