luoao-kddi/SCP

Suspected bug, do you know how to solve it?

Closed this issue · 8 comments

Thanks for the great open-source work.
I get this error during evaluation. Do you know how to solve it?

Traceback (most recent call last):
  File "/home/rain/SCP-main/encode_mullevel.py", line 260, in <module>
    main(args)
  File "/home/rain/SCP-main/encode_mullevel.py", line 198, in main
    for i, (cur_file, batch) in enumerate(zip(test_files, test_loader)):
  File "/home/rain/anaconda3/envs/scp/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 633, in __next__
    data = self._next_data()
  File "/home/rain/anaconda3/envs/scp/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 677, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/rain/anaconda3/envs/scp/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/rain/anaconda3/envs/scp/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/rain/SCP-main/dataloaders/encode_dataset_ehem_mullevel.py", line 32, in __getitem__
    npy_paths, whole_pc, chamfer, bin_num, z_offset, psnr = self.preproc(ori_file)
  File "/home/rain/SCP-main/dataloaders/encode_dataset_ehem_mullevel.py", line 152, in preproc
    bin_num, chamfer = np.load(preproc_path + '_meta.npy')
ValueError: too many values to unpack (expected 2)

I appreciate your interest in this work. I think this is a bug I neglected.
Please change line 152 of dataloaders/encode_dataset_ehem_mullevel.py to

bin_num, chamfer, _ = np.load(preproc_path + '_meta.npy')
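For anyone hitting the same error, here is the failure in isolation; the file name and stored values below are made up for illustration, not SCP's actual meta contents:

```python
import numpy as np

# Hypothetical meta file; the three stored values stand in for the
# bin_num, chamfer, and extra entry that '_meta.npy' contains.
np.save("demo_meta.npy", np.array([64.0, 0.0027, 0.0]))

meta = np.load("demo_meta.npy")
# Unpacking three values into two targets raises
# "ValueError: too many values to unpack (expected 2)".
# The fix adds a throwaway target for the extra value:
bin_num, chamfer, _ = meta
print(bin_num, chamfer)  # 64.0 0.0027
```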

Thanks for your reply; that solved the problem.
But I'm sorry, it seems I have run into a new problem. Do you know how to solve it?
Looking forward to your reply.

Traceback (most recent call last):
  File "/home/rain/SCP-main/encode_mullevel.py", line 260, in <module>
    main(args)
  File "/home/rain/SCP-main/encode_mullevel.py", line 198, in main
    for i, (cur_file, batch) in enumerate(zip(test_files, test_loader)):
  File "/home/rain/anaconda3/envs/scp/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 633, in __next__
    data = self._next_data()
  File "/home/rain/anaconda3/envs/scp/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 677, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/rain/anaconda3/envs/scp/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/rain/anaconda3/envs/scp/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/rain/SCP-main/dataloaders/encode_dataset_ehem_mullevel.py", line 32, in __getitem__
    npy_paths, whole_pc, chamfer, bin_num, z_offset, psnr = self.preproc(ori_file)
ValueError: not enough values to unpack (expected 6, got 5)

You can remove z_offset from that line. The error is caused by a mismatch in the returned variables:
z_offset is not used for the spherical and cylindrical coordinates, so it is safe to ignore it.
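The same mismatch can be reproduced in miniature; the stand-in preproc below is purely illustrative, not SCP's actual implementation:

```python
# Hypothetical stand-in: in the spherical/cylindrical branches,
# preproc returns five values because z_offset is never computed.
def preproc(ori_file):
    npy_paths, whole_pc, chamfer, bin_num, psnr = ["a.npy"], None, 0.0027, 64, 0.0
    return npy_paths, whole_pc, chamfer, bin_num, psnr

# Six targets against five returned values raises
# "ValueError: not enough values to unpack (expected 6, got 5)".
# Dropping z_offset on the caller side matches the five returns:
npy_paths, whole_pc, chamfer, bin_num, psnr = preproc("scan.bin")
print(bin_num, psnr)  # 64 0.0
```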

Thank you very much for your reply, which solved the problem.
However, the generated PSNR value is [0.0]. Do you know what the reason is?
I also want to ask: how do I use the generated bin file to decompress the LiDAR point cloud?

Looking forward to your reply.

outputfile : outputs/kitti/2024-05-10/16-19-21/test_output/epoch=7-step=268472/17000033_spher_45_13124_.bin
time(s) : 4.639948129653931
pt num : 113897
oct num : 331640
total binsize : 783368
bit per oct : 2.3621034857073937
bit per pixel : 6.87786333266021
0.0 6.87786333266021 0.0026900929897611223 4.639948129653931
0.0 6.87786333266021 0.0026900929897611223 4.639948129653931
bpps: [6.87786333266021]
chamfer_dist: [0.0026900929897611223]
PSNR: [0.0]

The PSNR can be generated by data_preproc/psnr_test.py. One example is

python data_preproc/psnr_test.py --type lidar --ori_dir data/kitti/your_ori_data --out_dir data/kitti/your_generated_data
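I haven't inspected psnr_test.py itself, but as a sanity check, the standard point-to-point (D1) PSNR can be sketched as follows; the function name, test points, and peak value are all illustrative:

```python
import numpy as np

def d1_psnr(ori, rec, peak):
    # For each original point, take the squared distance to its nearest
    # reconstructed point (brute force is fine for a tiny example),
    # then convert the mean squared error to PSNR against a peak value.
    d2 = ((ori[:, None, :] - rec[None, :, :]) ** 2).sum(axis=-1)
    mse = d2.min(axis=1).mean()
    return 10.0 * np.log10(peak ** 2 / mse)

ori = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
rec = np.array([[0.0, 0.0, 0.1], [1.0, 0.0, 0.0]])
print(round(d1_psnr(ori, rec, peak=1.0), 2))  # 23.01
```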

Thank you very much for your reply and for telling me how to calculate the PSNR.

However, I see that the reconstructed point cloud needs to be given in the command. Can you tell me how to obtain the reconstructed point cloud? Thank you very much.

I'm looking forward to hearing back from you.

One thing I need to mention is that SCP is a lossless compression method.
The distortion is caused only by quantization. Therefore, you can use the preprocessed data (generated by data_preproc/test_gene.py) as the reconstructed data for the PSNR calculation.
The xxx_quant.ply files are produced when you generate the data with data_preproc/test_gene.py.
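To make the "distortion comes only from quantization" point concrete, here is a sketch of uniform coordinate quantization; the step size q is illustrative, not SCP's actual setting:

```python
import numpy as np

# Lossless codec: the decoded cloud equals the quantized input,
# so a xxx_quant.ply-style file serves as the reconstruction.
q = 0.01  # illustrative quantization step
ori = np.array([[0.1234, -0.5678, 2.3456]])
rec = np.round(ori / q) * q  # snap coordinates to the grid
print(np.allclose(rec, [[0.12, -0.57, 2.35]]))  # True
```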

Oh, thanks for your reply