man_poses and learning behavior
anonymous-pusher opened this issue · 1 comments
Hello and thank you for this amazing work.
I am trying to reproduce the results by training the model from scratch. I followed the instructions for data preparation, but I am missing some things. The model needs realistic poses so it can assign them zero distance from the manifold; these are used as man_poses and are supposed to be in amass_dir according to the experiment config file. However, I did not see in the instructions a way of generating manifold poses from the raw data, only the part where noisy poses are generated into data_dir.
Also, I noticed that the validation step is commented out in the code. Did you validate during training of your model? I'm just looking for ways to make sure that I followed the correct steps to reproduce the results, so any hints about how the learning behaves during training would be helpful, especially for the eikonal term.
Thanks
however, I did not see in the instructions a way of generating manifold poses from the raw data, only the part where noisy poses are generated into data_dir.
For clean data preparation, you only need to convert the AMASS poses into quaternions:
import numpy as np
import torch
from pytorch3d.transforms import axis_angle_to_quaternion

# load the body pose (21 joints, axis-angle) and convert it to quaternions
pose_seq = np.load(AMASS_SEQ_FILE)['pose_body'][:, :63]
pose_seq = torch.from_numpy(pose_seq.reshape(len(pose_seq), 21, 3))
pose_seq = axis_angle_to_quaternion(pose_seq).detach().numpy()
print('done for....{}, pose_shape...{}'.format(AMASS_SEQ_FILE, len(pose_seq)))
np.savez(MAN_Poses_file, pose=pose_seq)
where AMASS_SEQ_FILE is one of the files stored in <sampled_pose_dir> after Step 2.1.
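In case it helps, here is a minimal sketch of running that conversion over all sequence files; the sampled_pose_dir and amass_dir variables are placeholders for your own paths, and reusing the original file names for the outputs is just an assumption:

import os
import numpy as np
import torch
from pytorch3d.transforms import axis_angle_to_quaternion

sampled_pose_dir = '/path/to/sampled_pose_dir'  # output of Step 2.1 (placeholder path)
amass_dir = '/path/to/amass_dir'                # man_poses location from the exp config (placeholder path)
os.makedirs(amass_dir, exist_ok=True)

for seq in sorted(os.listdir(sampled_pose_dir)):
    if not seq.endswith('.npz'):
        continue
    # convert each sampled AMASS sequence from axis-angle to quaternions
    pose_seq = np.load(os.path.join(sampled_pose_dir, seq))['pose_body'][:, :63]
    pose_seq = torch.from_numpy(pose_seq.reshape(len(pose_seq), 21, 3))
    pose_seq = axis_angle_to_quaternion(pose_seq).detach().numpy()
    np.savez(os.path.join(amass_dir, seq), pose=pose_seq)
    print('done for....{}, pose_shape...{}'.format(seq, len(pose_seq)))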
Also, I noticed that the validation step is commented out in the code. Did you validate during training of your model?
Since our goal was to overfit to the AMASS training poses, we did not need validation.