muelea/shapy

SHAPY predicts hugely different shape parameters for two very similar images.

ntrnghia opened this issue · 1 comment

Hi,

I have 2 images, 24.png and 24t.png in this Google Drive link:
https://drive.google.com/drive/folders/17HfdAsXvj87JIfFFGwoymDDj1K1w5vm4?usp=sharing
As you can see, these two images are really similar.

I ran both images through the SHAPY regressor and got 24.npz and 24t.npz.
When I compute the L1 distance between the betas parameters of 24.npz and 24t.npz, I get:

import numpy as np

data1 = np.load('24t.npz')['betas']
data2 = np.load('24.npz')['betas']
print(np.sum(np.abs(data1 - data2)))  # L1 distance between the two beta vectors
# 9.660576
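For reference, a per-component comparison makes it clearer which betas drive the gap; a minimal sketch, assuming each file stores a flat or (1, N) betas array as loaded above:

import numpy as np

betas1 = np.load('24t.npz')['betas'].ravel()
betas2 = np.load('24.npz')['betas'].ravel()
# Print each beta side by side with its absolute difference; in
# SMPL-style models the leading betas roughly capture overall body
# size, so large gaps there usually mean visibly different shapes.
for i, (b1, b2) in enumerate(zip(betas1, betas2)):
    print(f"beta[{i}]: {b1:+.3f} vs {b2:+.3f}  |diff| = {abs(b1 - b2):.3f}")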

I also got virtual measurements from them:
Processing: 24.npz
Virtual measurements: mass 71.00 kg, height 1.75 m, chest 0.99 m, waist 0.81 m, hips 1.00 m
Processing: 24t.npz
Virtual measurements: mass 54.81 kg, height 1.62 m, chest 0.87 m, waist 0.75 m, hips 0.94 m
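To put the gap in relative terms, the measurements above can be compared directly; a minimal sketch with the values hard-coded from the output above (not SHAPY code):

m24 = {'mass': 71.00, 'height': 1.75, 'chest': 0.99, 'waist': 0.81, 'hips': 1.00}
m24t = {'mass': 54.81, 'height': 1.62, 'chest': 0.87, 'waist': 0.75, 'hips': 0.94}
# Relative difference of each measurement, taking 24.npz as the reference.
for key, ref in m24.items():
    print(f"{key}: {abs(ref - m24t[key]) / ref * 100:.1f}% difference")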

As you can see, both the betas parameters and the virtual measurements are hugely different.

So I don't know whether this is caused by a bug in the SHAPY code or whether it is simply SHAPY's expected output. Comparing the two meshes and the two rendered images from the virtual-measurement step, the mesh from 24 looks male while the mesh from 24t looks female, so I suspect this is a bug.
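One quick sanity check here is to list what each .npz actually contains, in case a field other than betas (for example a gender or model flag, if SHAPY stores one) differs between the two results; this sketch only assumes the files are standard NumPy archives:

import numpy as np

# List the arrays stored in each SHAPY output file.
for fname in ('24.npz', '24t.npz'):
    with np.load(fname, allow_pickle=True) as data:
        print(fname, '->', sorted(data.files))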

hi @ntrnghia
I'm trying to figure out how to run the SHAPY model on my own RGB images, without needing any additional files such as keypoints or vertices.

I would greatly appreciate it if you could explain how to run the code with my RGB images only, in order to get the body measurements.

Thank you very much!