Possible error with MaskNet?
Closed this issue · 4 comments
Hi, thank you for your great work @vinits5! I'm looking forward to testing MaskNet on my own RGBD data.
I just encountered the following issue when running test_masknet.py in Google Colab:
learning3d/examples/test_masknet.py
Line 21 in 45bebb7
Error raised in pointnet2 module in utils!
Either don't use pointnet2_utils or retry it's setup.
INFO - 2021-05-17 01:41:24,679 - ppfnet - Using early fusion, feature dim = 96
INFO - 2021-05-17 01:41:24,680 - ppfnet - Feature extraction using features xyz, dxyz, ppf
Traceback (most recent call last):
File "test_masknet.py", line 21, in <module>
from learning3d.models import MaskNet
File "/content/drive/MyDrive/learning3d/models/__init__.py", line 16, in <module>
from .deepgmr import DeepGMR
File "/content/drive/MyDrive/learning3d/models/deepgmr.py", line 162
'r': template_features - source_features,
^
SyntaxError: invalid syntax
I'm not sure whether this syntax error was already there, or whether it appeared because I modified parts of test_masknet.py so that I can feed a .ply file directly (instead of using the .hdf5 format):
Line 161 in 45bebb7
A comma is missing between lines 161 and 162. I opened a pull request for this.
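For reference, the SyntaxError above is just a dict literal missing the comma between two entries; here is a minimal reproduction with made-up values (hypothetical names, not the actual deepgmr.py code):

```python
template_features = 1.0
source_features = 0.5

# Without the comma after the first entry, Python raises SyntaxError at the
# start of the second entry -- exactly the "^" position in the traceback:
# feats = {
#     'p': source_features
#     'r': template_features - source_features,
# }

# With the comma restored, the dict parses and evaluates fine:
feats = {
    'p': source_features,
    'r': template_features - source_features,
}
print(feats['r'])  # 0.5
```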
OK, there are also some errors in the import paths:
learning3d/examples/test_masknet.py
Line 21 in 45bebb7
This line should be "from models import MaskNet", and likewise for "data_utils":
Line 10 in 45bebb7
Also, the ".." prefix in front of ops and utils should be removed from the imports in all the model files.
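The reason the ".." imports break is that relative imports only work when the module is loaded as part of its package, not when a file is run as a plain script. A self-contained sketch (my own toy package, not the actual learning3d layout) that reproduces both behaviors:

```python
import os
import subprocess
import sys
import tempfile

# Build a throwaway package: pkg/ops.py plus pkg/models/deepgmr.py, where the
# inner module uses a relative import, mirroring "from ..ops import ...".
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "pkg", "models"))
    open(os.path.join(root, "pkg", "__init__.py"), "w").close()
    open(os.path.join(root, "pkg", "models", "__init__.py"), "w").close()
    with open(os.path.join(root, "pkg", "ops.py"), "w") as f:
        f.write("VALUE = 42\n")
    with open(os.path.join(root, "pkg", "models", "deepgmr.py"), "w") as f:
        f.write("from ..ops import VALUE\nprint(VALUE)\n")

    script = os.path.join(root, "pkg", "models", "deepgmr.py")
    # Running the file directly fails: "attempted relative import with no
    # known parent package".
    direct = subprocess.run([sys.executable, script],
                            capture_output=True, text=True)
    # Running it as a module from the package root succeeds.
    as_module = subprocess.run([sys.executable, "-m", "pkg.models.deepgmr"],
                               capture_output=True, text=True, cwd=root)

print("direct run succeeded:", direct.returncode == 0)
print("module run succeeded:", as_module.returncode == 0)
```

So dropping the ".." (or running the scripts with "python -m" from the repo root) are the two ways around the error.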
So, I tested how far MaskNet can go on a random object's point clouds (one fragment point cloud and one reference point cloud, neither of which was seen during training), and there is no difference between template and masked_template.... Both point clouds have 2200 points.
import numpy as np
import open3d as o3d
import torch

def test_one_epoch(args, model):
    model.eval()
    # Load the fragment (source) point cloud directly from a .ply file.
    source_ply = o3d.io.read_point_cloud('/content/drive/MyDrive/learning3d/ply/Object1_1.ply')
    source = np.asarray(source_ply.points).reshape(1, -1, 3)
    # Load the reference mesh and sample it to get the template point cloud.
    mesh = o3d.io.read_triangle_mesh('/content/drive/MyDrive/learning3d/ply/Object1_0.ply')
    template_ply = mesh.sample_points_uniformly(number_of_points=2200)  # sampling
    template = np.asarray(template_ply.points).reshape(1, -1, 3)
    template = torch.from_numpy(template).float().to(args.device)
    source = torch.from_numpy(source).float().to(args.device)
    # igt = igt.to(args.device)  # [source] = [igt]*[template]
    # gt_mask = gt_mask.to(args.device)
    masked_template, predicted_mask = model(template, source)
    print(predicted_mask.cpu().data.numpy().argmax())
    # np.save('predicted_mask', predicted_mask.cpu().data.numpy().argmax())
    # Evaluate mask based on classification metrics.
    # accuracy, precision, recall, fscore = evaluate_mask(gt_mask, predicted_mask, predicted_mask_idx=model.mask_idx)
    # Different ways to visualize results.
    display_results(template.detach().cpu().numpy()[0], source.detach().cpu().numpy()[0], masked_template.detach().cpu().numpy()[0])
OK, I understood the problem. The initial pose of my fragment point cloud deviates too much from the reference point cloud.
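Given that diagnosis, one possible workaround is to give the network a rough pre-alignment before calling model(template, source). A minimal sketch using PCA (coarse_align is my own hypothetical helper, not part of learning3d; the eigenvector sign ambiguity is deliberately left unhandled):

```python
import numpy as np

def coarse_align(source, template):
    """Center the source cloud and rotate its principal axes onto the
    template's, as a rough initialization before MaskNet.
    Sketch only: eigenvector signs are not disambiguated, so the result
    can still be flipped along an axis."""
    src_centered = source - source.mean(axis=0)
    tmp_centered = template - template.mean(axis=0)
    # Eigenvectors of the 3x3 covariance matrices give each cloud's
    # principal axes (columns, in ascending eigenvalue order).
    _, src_axes = np.linalg.eigh(np.cov(src_centered.T))
    _, tmp_axes = np.linalg.eigh(np.cov(tmp_centered.T))
    R = tmp_axes @ src_axes.T  # maps source axes onto template axes
    # Rotate the centered source and move it to the template's centroid.
    return src_centered @ R.T + template.mean(axis=0)
```

A heavier but more robust alternative would be a global registration step (e.g. Open3D's RANSAC-based registration on FPFH features) before the refinement.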