una-dinosauria/human-motion-prediction

How to visualise a motion from the dataset

MetaDev opened this issue · 4 comments

I simply want to visualise a motion from the dataset without having to train a model.
From the flow of your script it is not obvious how to do this. Could you elaborate?
Thanks.

Hi @MetaDev,

You can use the forward_kinematics.py script almost as-is; the only change is to read the poses from the .txt ground-truth files instead of from samples.h5.

Instead of

# numpy implementation
with h5py.File( 'samples.h5', 'r' ) as h5f:
expmap_gt = h5f['expmap/gt/walking_0'][:]
expmap_pred = h5f['expmap/preds/walking_0'][:]

Do something like

  # Read the ground truth from the .txt files instead of samples.h5
  import numpy as np
  import data_utils

  data_dir = "./data/h3.6m/dataset"
  test_subject_ids = [5]
  actions = ["walking"]
  one_hot = False
  test_set, _ = data_utils.load_data( data_dir, test_subject_ids, actions, one_hot )

  # Keys are (subject, action, subaction, 'even'); each action has two
  # subactions, and 'even' is the sequence downsampled to every other frame
  subject = 5
  subaction = 1
  expmap_gt = test_set[(subject, 'walking', subaction, 'even')]
  # print( expmap_gt, expmap_gt.shape )
  expmap_pred = np.zeros_like( expmap_gt )

This will visualize the ground truth for walking.
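
For reference, the rest of main() in forward_kinematics.py can stay as it is. Below is a minimal sketch of the forward-kinematics and animation step, assuming _some_variables(), fkl() and viz.Ax3DPose behave as in this repo's forward_kinematics.py and viz.py (treat the exact call signatures as assumptions if your copy differs):

  # Sketch: convert exponential maps to 3d joint positions and animate them.
  # _some_variables, fkl and viz.Ax3DPose are the helpers already used by
  # this repo's forward_kinematics.py and viz.py.
  import numpy as np
  import matplotlib.pyplot as plt
  from mpl_toolkits.mplot3d import Axes3D  # registers the 3d projection
  import viz
  from forward_kinematics import _some_variables, fkl

  parent, offset, rotInd, expmapInd = _some_variables()

  # Forward kinematics, frame by frame: 96 values = 32 joints x 3 coords
  nframes = expmap_gt.shape[0]
  xyz_gt = np.zeros((nframes, 96))
  for i in range( nframes ):
    xyz_gt[i, :] = fkl( expmap_gt[i, :], parent, offset, rotInd, expmapInd )

  # Animate the ground truth
  fig = plt.figure()
  ax = fig.add_subplot(111, projection='3d')
  ob = viz.Ax3DPose(ax)
  for i in range( nframes ):
    ob.update( xyz_gt[i, :] )
    plt.show(block=False)
    fig.canvas.draw()
    plt.pause(0.01)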

If you want to see subjects other than 5, you will have to obtain another offset (bone-lengths) array for _some_variables() from the MATLAB code that ships with Human3.6M; a hypothetical sketch of the swap follows.
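
Nothing in the repo does this for you; just to illustrate how the swap could look, here is a hypothetical sketch, assuming you export the subject-specific bone offsets from the MATLAB tools to a CSV yourself (the file name and its 32x3 shape are assumptions):

  # Hypothetical: override the hard-coded bone lengths with ones exported
  # for another subject. "offsets_S6.csv" is an assumed file you would
  # create yourself from the Human3.6M MATLAB code.
  import numpy as np
  from forward_kinematics import _some_variables

  parent, offset, rotInd, expmapInd = _some_variables()
  offset = np.loadtxt( "offsets_S6.csv", delimiter="," ).reshape(-1, 3)
  # ...then pass this `offset` to fkl as usual.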

Hope that helps!

That's perfect, thank you for the explanation.

Hi! Do you have the image? It doesn't seem to have been attached.

Re: 28 joints. I believe some of them are repeated. Try plotting each joint's index as text next to it and you'll see what I mean; a quick sketch follows.
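
A minimal sketch of that debugging trick, assuming `xyz` holds a single frame of fkl output (96 values, i.e. 32 joints x 3 coordinates):

  # Label each joint with its index to spot the repeated ones.
  import matplotlib.pyplot as plt
  from mpl_toolkits.mplot3d import Axes3D  # registers the 3d projection

  joints = xyz.reshape(-1, 3)  # (32, 3) for a 96-dim fkl frame
  fig = plt.figure()
  ax = fig.add_subplot(111, projection='3d')
  ax.scatter( joints[:, 0], joints[:, 1], joints[:, 2] )
  for j, (x, y, z) in enumerate( joints ):
    ax.text( x, y, z, str(j) )
  plt.show()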