Error about quantizing 3DGS checkpoint
Aeson-Hsu opened this issue · 1 comments
Hi, I get an error when I run `vectree.py`:
================== Print Info ==================
Input_feats_shape: torch.Size([1554770, 62])
VQ_feats_shape: torch.Size([1554770, 27])
SH_degree: 2
Quantization_ratio: 0.6
Add_important_score: True
Codebook_size: 8192
================================================
IS_percent: tensor(0.7985)
100%|██████████| 1000/1000 [01:00<00:00, 16.41it/s]
=============== Start vector quantize ===============
100%|██████████| 190/190 [00:01<00:00, 186.61it/s]
updating: ../vectree/output/bicycle/extreme_saving/ (stored 0%)
updating: ../vectree/output/bicycle/extreme_saving/metadata.npz (deflated 12%)
updating: ../vectree/output/bicycle/extreme_saving/non_vq_feats.npz (deflated 0%)
updating: ../vectree/output/bicycle/extreme_saving/xyz.npz (deflated 0%)
updating: ../vectree/output/bicycle/extreme_saving/non_vq_mask.npz (deflated 0%)
updating: ../vectree/output/bicycle/extreme_saving/other_attribute.npz (deflated 0%)
updating: ../vectree/output/bicycle/extreme_saving/codebook.npz (deflated 0%)
updating: ../vectree/output/bicycle/extreme_saving/vq_indexs.npz (deflated 0%)
Size = 70.69165706634521 MB
==================== Load saved data & Dequantize ====================
Traceback (most recent call last):
File "/home/zxq/MachineLearning/SLAM/3DGS/LightGaussian/vectree/vectree.py", line 224, in &lt;module&gt;
vq.dequantize()
File "/home/zxq/MachineLearning/SLAM/3DGS/LightGaussian/vectree/vectree.py", line 215, in dequantize
write_ply_data(dequantized_feats.cpu().numpy(), self.ply_path, self.sh_dim)
File "/home/zxq/MachineLearning/SLAM/3DGS/LightGaussian/vectree/utils.py", line 101, in write_ply_data
elements[:] = list(map(tuple, feats))
ValueError: could not assign tuple of length 62 to structure with 41 fields.
I found that the shape of `feats` is [1554770, 62], while `elements` expects only 41 fields per row, which causes the error. The dtype of `elements` is defined by `dtype_full`:

dtype_full = [(attribute, 'f4') for attribute in construct_list_of_attributes()]

So I'd like to know why the dimensions of `dtype_full` and `feats` are inconsistent.
Thanks in advance!
Hi, thank you for your interest in our work :)
This inconsistency is likely caused by an incorrect SH degree setting. If you are using the default SH degree for 3DGS, you might want to add the argument `--sh_degree 3` when running `python vectree/vectree.py`.
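For what it's worth, the two numbers in the error match the standard 3DGS per-splat attribute layout exactly: with SH degree 2 a splat has 41 attributes, while a default degree-3 checkpoint has 62. A minimal sketch of the count, assuming the usual xyz / normals / f_dc / f_rest / opacity / scale / rotation layout (`num_attributes` is an illustrative helper, not part of the repo):

```python
def num_attributes(sh_degree: int) -> int:
    """Count per-splat PLY attributes in the standard 3DGS layout."""
    f_dc = 3                                      # degree-0 SH (one RGB triple)
    f_rest = 3 * ((sh_degree + 1) ** 2 - 1)       # higher-order SH coefficients
    # xyz(3) + normals(3) + f_dc + f_rest + opacity(1) + scale(3) + rotation(4)
    return 3 + 3 + f_dc + f_rest + 1 + 3 + 4

print(num_attributes(2))  # 41 -- what construct_list_of_attributes() built here
print(num_attributes(3))  # 62 -- what the checkpoint's feats actually contain
```

So a checkpoint trained with the 3DGS default (degree 3) produces 62-wide `feats`, while running vectree with `SH_degree: 2` makes `dtype_full` describe only 41 fields, hence the assignment failure.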