[Dataset] [ASE] - Plotting 3D colored depth map
AnarchistKnight opened this issue · 6 comments
Good day.
I tried to reconstruct an ASE scene and failed. I suspect I am missing something when computing the camera pose matrix or when converting depth to z. My workflow is as follows:
- Scene 8 from the ASE dataset is used.
- The RGB and depth images are rectified, following the example in projectaria_tools/core/examples/dataprovider_quickstart_tutorial.ipynb. I end up with 512x512 images and they look right: the left is the original, the right is the undistorted version.
- Depth is converted to z using the function below (a usage sketch follows it).
import cv2
import numpy as np

# CORRECTION_PARAM scales the stored depth units (millimeters) to meters.
CORRECTION_PARAM = 1000.0

# Rectified pinhole intrinsics: fx = 150, fy = 150, cx = 256, cy = 256
def convert_depth_to_z(fx, fy, cx, cy, depth_image_path):
    depth_image = cv2.imread(depth_image_path, cv2.IMREAD_UNCHANGED)
    constant_x = 1 / fx
    constant_y = 1 / fy
    # Normalized image-plane coordinates for each column (x) and each row (y).
    vs = np.array([(v - cx) * constant_x for v in range(0, depth_image.shape[1])])
    us = np.array([(u - cy) * constant_y for u in range(0, depth_image.shape[0])])
    # Convert ray-length depth d to z-depth: z = d / sqrt(1 + x'^2 + y'^2).
    z_image = np.square(depth_image / CORRECTION_PARAM)
    scale = 1 + np.square(vs[np.newaxis, :]) + np.square(us[:, np.newaxis])
    z_image = np.sqrt(z_image / scale) * CORRECTION_PARAM
    return z_image.astype(np.uint16)
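For illustration, here is a minimal sketch of how the z image from convert_depth_to_z can be back-projected into camera-frame points with the same intrinsics; the helper name backproject_to_camera and the file path are hypothetical and not part of the original script.

import numpy as np

def backproject_to_camera(z_image, fx, fy, cx, cy):
    # Turn a z-depth image (uint16, millimeters) into (N, 3) camera-frame points
    # in meters, plus the validity mask of the pixels that were kept.
    # Hypothetical helper, not from the original post.
    z = z_image.astype(np.float64) / 1000.0                  # millimeters -> meters
    us, vs = np.meshgrid(np.arange(z.shape[1]), np.arange(z.shape[0]))
    x = (us - cx) / fx * z                                   # X = (u - cx) / fx * Z
    y = (vs - cy) / fy * z                                   # Y = (v - cy) / fy * Z
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    valid = points[:, 2] > 0                                 # drop empty depth pixels
    return points[valid], valid

# z_img = convert_depth_to_z(150, 150, 256, 256, "depth0.png")  # example path only
# pts_cam, valid = backproject_to_camera(z_img, 150, 150, 256, 256)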
- The camera_to_world matrix is computed using this function
import numpy as np
from scipy.spatial.transform import Rotation

def to_homo(rotation_matrix):
    # Embed a 3x3 rotation into a 4x4 homogeneous matrix (helper not shown in the original post; assumed).
    matrix = np.eye(4)
    matrix[:3, :3] = rotation_matrix
    return matrix

def c2w(tx, ty, tz, qx, qy, qz, qw):
    # scipy expects quaternions in (x, y, z, w) order.
    quaternion = [qx, qy, qz, qw]
    rotation = Rotation.from_quat(quaternion)
    rotation_matrix = rotation.as_matrix()
    camera_pose_matrix = to_homo(rotation_matrix)
    camera_pose_matrix[0][3] = tx
    camera_pose_matrix[1][3] = ty
    camera_pose_matrix[2][3] = tz
    return camera_pose_matrix
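As a side note, here is a minimal sketch of applying such a camera-to-world matrix to the back-projected points (continuing the hypothetical pts_cam from the sketch above; transform_points is also hypothetical).

import numpy as np

def transform_points(T_world_camera, pts_cam):
    # Apply a 4x4 camera-to-world matrix to (N, 3) camera-frame points.
    pts_homo = np.hstack([pts_cam, np.ones((pts_cam.shape[0], 1))])  # (N, 4)
    return (T_world_camera @ pts_homo.T).T[:, :3]                    # back to (N, 3)

# T = c2w(tx, ty, tz, qx, qy, qz, qw)  # pose of one frame from the trajectory
# pts_world = transform_points(T, pts_cam)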
- I used open3d to try to reconstruct the scene (a sketch of this step follows the list). The result for a single frame at index 0 looks correct.
- The result for a single frame at index 18 looks wrong: the camera trajectory goes into the wall.
- The reconstruction for multiple frames 82-92 looks a bit corrupted; different frames do not align with each other.
- The reconstruction of the whole sequence is a mess.
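For completeness, a minimal sketch of the open3d step under the same assumptions (make_point_cloud is hypothetical; the color pairing assumes the rectified RGB image is aligned pixel-for-pixel with the depth image).

import cv2
import numpy as np
import open3d as o3d

def make_point_cloud(pts_world, valid, rgb_image_path):
    # Pair each kept depth pixel with its color from the rectified RGB image.
    rgb = cv2.cvtColor(cv2.imread(rgb_image_path), cv2.COLOR_BGR2RGB)
    colors = rgb.reshape(-1, 3)[valid].astype(np.float64) / 255.0
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(pts_world)
    pcd.colors = o3d.utility.Vector3dVector(colors)
    return pcd

# clouds = [make_point_cloud(p, m, path) for p, m, path in per_frame_data]
# o3d.visualization.draw_geometries(clouds)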
Solved. I hope this issue helps people in the future.
I noticed the following code here:
def build_camera_frustum(transform_world_device):
    points = (
        np.array(
            [[0, 0, 0], [0.5, 0.5, 1], [-0.5, 0.5, 1], [-0.5, -0.5, 1], [0.5, -0.5, 1]]
        )
        * 0.6
    )
    transform_world_rgb = transform_world_device @ T_device_RGB
    points_transformed = transform_world_rgb @ points.transpose()
    return go.Mesh3d(
        x=points_transformed[0, :],
        y=points_transformed[1, :],
        z=points_transformed[2, :],
        i=[0, 0, 0, 0, 1, 1],
        j=[1, 2, 3, 4, 2, 3],
        k=[2, 3, 4, 1, 3, 4],
        showscale=False,
        visible=False,
        colorscale="jet",
        intensity=points[:, 2],
        opacity=1.0,
        hoverinfo="none",
    )
I printed out T_device_RGB and got:
SE3 (quaternion(w,x,y,z), translation (x,y,z)) (x1)
[[0.9431408157221322, 0.3288729608045007, 0.02765579379254213, 0.03953649504415866, -0.004247214591779491, -0.01209833440626544, -0.005054953042539318]]
Then I made the following modification:
def get_default_device_matrix():
    # T_device_RGB printed above: quaternion (w, x, y, z), then translation (x, y, z).
    qw = 0.9431408157221322
    qx = 0.3288729608045007
    qy = 0.02765579379254213
    qz = 0.03953649504415866
    tx = -0.004247214591779491
    ty = -0.01209833440626544
    tz = -0.005054953042539318
    return c2w(tx, ty, tz, qx, qy, qz, qw)

# Compose T_world_rgb = T_world_device (tx...qw from the trajectory) @ T_device_RGB.
c2w_matrix = c2w(tx, ty, tz, qx, qy, qz, qw) @ get_default_device_matrix()
Finally, it works.
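Putting everything together, a minimal per-frame sketch under the assumptions above (backproject_to_camera, transform_points, and make_point_cloud are the hypothetical helpers from the earlier snippets; the frame list is illustrative).

import open3d as o3d

def reconstruct_frame(depth_path, rgb_path, pose):
    # pose: (tx, ty, tz, qx, qy, qz, qw) of the device in the world frame for this frame.
    tx, ty, tz, qx, qy, qz, qw = pose
    T_world_rgb = c2w(tx, ty, tz, qx, qy, qz, qw) @ get_default_device_matrix()
    z_img = convert_depth_to_z(150, 150, 256, 256, depth_path)
    pts_cam, valid = backproject_to_camera(z_img, 150, 150, 256, 256)
    pts_world = transform_points(T_world_rgb, pts_cam)
    return make_point_cloud(pts_world, valid, rgb_path)

# clouds = [reconstruct_frame(d, r, p) for d, r, p in frames]  # frames: your own list
# o3d.visualization.draw_geometries(clouds)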
@AnarchistKnight Happy to hear that you found a solution to your problem, and thank you for sharing your code so that anyone who runs into the same problem can rely on it.
@AnarchistKnight Thanks for kindly sharing this information! Could you please share your full script or the code for undistortion? I am also working on this but have been getting poor results. I suspect it might be because I used a different method for undistortion (#11). Thank you very much!
search "undistort an image" in this and you would get the answer
solved?
@AnarchistKnight Yes, thank you!