About noisy poses
HospitableHost opened this issue · 2 comments
Are you sure that the code for adding noise to the poses is `sampled_pose = sampled_pose + sigma * np.random.rand(21, 4) * sampled_pose`?
Why the `* sampled_pose` at the end?
Hi,
We create noisy poses to generate poses that are not part of the manifold. This can be done in several ways:
- Just create random rotations and compute their distance to the manifold using nearest neighbour. This mostly generates very bad poses that do not resemble valid poses and have large distances.
- Add noise to AMASS poses. For this, you can do either (see the sketch after this list):
2.1 `sampled_pose = sampled_pose + sigma * np.random.rand(21, 4)`
2.2 `sampled_pose = sampled_pose + sigma * np.random.rand(21, 4) * sampled_pose`
Ideally, to learn a more fine-detailed manifold, we also wanted to generate samples that lie very close to the manifold. In the second case, the added noise is also scaled by the original pose, so it generates poses that stay closer to the sampled poses. In practice we found that, given the huge number of samples and the large size of AMASS, both 2.1 and 2.2 produce similar distributions, but we used 2.2 for our data generation.
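For reference, a minimal sketch of the two options, assuming `sampled_pose` is a (21, 4) array of per-joint quaternions sampled from AMASS and `sigma` is a scalar noise level (the concrete values below are placeholders, not taken from the repo):

```python
import numpy as np

sigma = 0.1                           # illustrative noise scale, not a value from the issue
sampled_pose = np.random.rand(21, 4)  # stand-in for a pose sampled from AMASS (21 joints, quaternions)

# Option 2.1: additive uniform noise, independent of the pose values
noisy_21 = sampled_pose + sigma * np.random.rand(21, 4)

# Option 2.2 (used for data generation): noise scaled element-wise by the
# original pose, so each component is perturbed in proportion to its magnitude
noisy_22 = sampled_pose + sigma * np.random.rand(21, 4) * sampled_pose
```

Because `np.random.rand` draws from [0, 1), the scaling by `sampled_pose` in 2.2 ties the perturbation of each quaternion component to that component's own value, which is why those samples tend to stay closer to the original poses.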
Okay, thank you very much for your reply!