JDAI-CV/DSD-SATN

SMPL mesh to world coord

sunwonlikeyou opened this issue · 4 comments

As far as I know, the provided SMPL dataset is defined in camera coordinates,
but I'd like to see the SMPL mesh in world coordinates,
so I converted:

root_rotation = smpl_pose_parameter[:3]   # axis-angle global orientation
root_rotation = inverse(extrinsic_parameter) @ root_rotation
smpl_pose_parameter[:3] = root_rotation
smpl_mesh, smpl_joint = smpl.layer['female'](smpl_pose_parameter, smpl_shape_parameter)
projected_2d_vertices = projection_matrix @ smpl_mesh

[screenshot: rendered mesh vs. original RGB]

But the rotation was different from the original RGB image.
How do I convert the root rotation?

Actually, the root_rotation we provide already accounts for the camera pose, so you don't need to multiply by the extrinsic matrix.
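In other words, the mesh from the SMPL layer is already expressed in camera coordinates, so only the intrinsic matrix is needed to get pixel coordinates. A minimal sketch of that projection, assuming vertices of shape (N, 3) and an intrinsic matrix K (names here are illustrative, not from the repo):

```python
import numpy as np

def project_camera_space(vertices, K):
    """Project camera-space vertices with intrinsics only (no extrinsics)."""
    p = (K @ vertices.T).T          # (N, 3) homogeneous image coordinates
    return p[:, :2] / p[:, 2:3]     # perspective divide -> (N, 2) pixels

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
pts = project_camera_space(np.array([[0.0, 0.0, 2.0]]), K)
# a point on the optical axis lands on the principal point (320, 240)
```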

Thank you for your reply.
Then, as I understand it:

mosh_data = np.load('annots_train_bilinearf.npz', allow_pickle=True)['params'][()]
cam = mosh_data[:3]
theta = mosh_data[3:75]
beta = mosh_data[75:]

smpl_mesh, smpl_joint = smpl.layer['female'](theta, beta)
projection_matrix = intrinsic @ extrinsic
projected_2d_vertices = projection_matrix @ smpl_mesh

Is this right? I didn't do any conversion, but I got this result:
[screenshot: mesh rendered in the wrong place]

How do I get a globally oriented SMPL mesh?
And what is cam?

Please don't multiply by any camera matrix, such as the projection_matrix you used.
The rendered 3D mesh on the right is simply captured by a camera placed in the wrong position. Please adjust your rendering camera parameters, especially the translation.
If you shoot the 3D mesh from a camera behind it and exchange the x and y axes, you will get the correct rendering.
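One possible reading of that suggestion, applied directly to the vertices rather than to the rendering camera (the helper name and exact axis signs are assumptions, so verify against your renderer's convention):

```python
import numpy as np

def back_camera_swap_xy(vertices):
    """Viewing from behind flips x and z; then swap the x and y axes."""
    v = vertices.copy()
    v[:, 0] *= -1.0               # mirror x for the back-facing camera
    v[:, 2] *= -1.0               # mirror z as well (180-degree turn about y)
    v[:, [0, 1]] = v[:, [1, 0]]   # exchange the x and y axes
    return v

out = back_camera_swap_xy(np.array([[1.0, 2.0, 3.0]]))
```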

The cam is (scale, trans_x, trans_y). These are the weak-perspective camera parameters we predict to describe the mesh's location in the image.

Well, I should use 'annot.npz', not 'annots_train_bilinearf.npz'.
It has the global rotation. Thank you!
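As a closing note: if one did need to move the axis-angle root orientation between coordinate frames, the first snippet above has a likely bug, since a rotation matrix cannot be multiplied directly against an axis-angle 3-vector. The composition has to happen in rotation-matrix space. A sketch using SciPy (the helper name is hypothetical):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def to_world_orientation(root_axis_angle, extrinsic_R):
    """Undo the camera rotation on an axis-angle global orientation."""
    R_cam = R.from_rotvec(root_axis_angle).as_matrix()   # axis-angle -> matrix
    R_world = extrinsic_R.T @ R_cam                      # compose in matrix space
    return R.from_matrix(R_world).as_rotvec()            # matrix -> axis-angle

# with an identity extrinsic, the orientation is unchanged
out = to_world_orientation(np.array([0.1, -0.2, 0.3]), np.eye(3))
```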