Visualization
hardikdava opened this issue · 5 comments
Thank you for your amazing work. I have 2 questions regarding visualization.
- Is there any way to visualize a `.parquet` file with a web-based viewer, or locally on a CPU-only machine?
- Is it possible to convert `.parquet` to another known point cloud or mesh format?
A web visualizer based on Vulkan is planned, but not available yet. I'm still trying to implement the radix sort in Taichi, but I've run into some problems with int64 support.
The parquet file is just a pandas table, and the columns are very straightforward. You can take the x, y, z, and alpha columns to build any point cloud file. However, most of the points in the point cloud are almost transparent; I've tried to build a mesh from the point cloud, but the result was not very good. Maybe the Marching Cubes method would still work, but it requires a lot more code.
@wanmeihuali Thanks for your reply. I was able to extract a point cloud from the `.parquet` file. Is there any way to get a textured mesh output, e.g. by converting the feature vectors to meaningful RGB values?
@hardikdava
Currently, there isn't an efficient way to obtain a textured mesh output. The difficulties stem from two aspects:
You might have noticed that the alpha value in the parquet represents the opacity of each point. Algorithms like the one used here, akin to NeRF, assume all 3D objects are semi-transparent and render them through volume rendering. The issue arising from this is that, unlike traditional point clouds where points are distributed on the surface of objects, the point cloud generated by this algorithm contains numerous near-transparent points distributed near the surface of objects. The color stacking of these points accurately portrays the surface of the object. Conventional point-cloud-to-mesh algorithms struggle with this scenario, often not taking into account the opacity of the points at all. My previous experiments in this area haven't been very successful. You might need a customized point-cloud-to-mesh conversion algorithm.
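One direction for such an opacity-aware conversion, hinted at by the Marching Cubes remark above, is to splat each point's alpha into a voxel density grid and run marching cubes on that grid instead of meshing the raw points. A rough numpy-only sketch (grid resolution and the idea of using alpha as density are my assumptions, not the project's method):

```python
import numpy as np

def alpha_density_grid(xyz: np.ndarray, alpha: np.ndarray, resolution: int = 64) -> np.ndarray:
    """Accumulate per-point opacity into a dense voxel grid.

    xyz: (N, 3) point positions; alpha: (N,) opacities in [0, 1].
    The returned grid could then be fed to a marching-cubes implementation
    (e.g. skimage.measure.marching_cubes) with a density threshold.
    """
    mins = xyz.min(axis=0)
    maxs = xyz.max(axis=0)
    # Map positions into voxel indices [0, resolution - 1].
    scale = (resolution - 1) / np.maximum(maxs - mins, 1e-9)
    idx = ((xyz - mins) * scale).astype(int)
    grid = np.zeros((resolution,) * 3, dtype=np.float32)
    # np.add.at accumulates correctly even when several points share a voxel.
    np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), alpha)
    return grid
```

Near-transparent points contribute little density, so a surface extracted at a reasonable threshold ignores them, which is exactly what conventional point-cloud-to-mesh algorithms fail to do.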
Even if you manage to extract a mesh from this point cloud, obtaining the corresponding colors remains imprecise. This is because the current algorithm employs spherical harmonic functions to represent colors, meaning that the color of the same point changes depending on the camera angle. For approximate color values, I would suggest referring to the r_sh0, g_sh0, and b_sh0 columns, taking the sigmoid of each, and then mapping the result to an integer between 0 and 255. Though this color estimation may not be entirely accurate, it shouldn't be too far off.
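The sigmoid-and-scale recipe described above can be sketched in a few lines (the `r_sh0`/`g_sh0`/`b_sh0` column names follow the comment; verify them against your parquet's actual schema):

```python
import numpy as np
import pandas as pd

def sh0_to_rgb(df: pd.DataFrame) -> np.ndarray:
    """Approximate per-point RGB from the 0th-order SH coefficients.

    Returns an (N, 3) uint8 array. This is only a view-independent
    approximation: higher-order SH terms (which vary with camera angle)
    are ignored.
    """
    sh0 = df[["r_sh0", "g_sh0", "b_sh0"]].to_numpy()
    rgb = 1.0 / (1.0 + np.exp(-sh0))                 # sigmoid -> [0, 1]
    return np.clip((rgb * 255.0).round(), 0, 255).astype(np.uint8)

# Usage (path is illustrative):
# colors = sh0_to_rgb(pd.read_parquet("scene.parquet"))
```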
Is it possible to use this WebGL-based viewer for visualizing the data?
https://antimatter15.com/splat/
@hardikdava It should be possible, but you might need to modify the color computation by applying a sigmoid after the SH function.