HarshayuGirase/Human-Path-Prediction

Question regarding Y-Net's ETH/UCY experiments

Closed this issue · 13 comments

Hi,

First of all, thank you for this amazing work!

I am currently trying to reproduce the results for ETH/UCY, but cannot seem to achieve this. I believe the reason is that some of my inputs differ from yours, e.g. the ETH/UCY dataset in image coordinates, the segmentation maps, the reference images, etc.
Could you provide the inputs you used for ETH/UCY and, if possible, the corresponding training configuration?

Thank you

Are there any updates regarding this issue? Or do you not plan to release the ETH/UCY data used for Y-Net?

Thank you

Hi,

Sorry for the late response! I believe @DeepOzean has added the ETH/UCY semantic maps, homography matrices, and pixel-coordinate data to https://drive.google.com/file/d/1u4hTk_BZGq1929IxMPLCrDzoG3wsZnsa/view. To transform the image coordinates back into world coordinates (meters), the image2world function in utils/image_utils should help.
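In case it helps, the transform is just a homography applied to homogeneous pixel coordinates. A minimal sketch (the helper name here is illustrative, not the repo's exact code, and the axis ordering should be double-checked against image2world):

```python
import numpy as np

def pixels_to_world(traj_px, H):
    """Map a pixel-coordinate trajectory to world coordinates (meters).

    traj_px: array of shape (N, 2) with (x, y) pixel positions.
    H: 3x3 homography matrix, e.g. loaded from eth_H.txt.
    """
    # Lift to homogeneous coordinates: (x, y) -> (x, y, 1)
    ones = np.ones((traj_px.shape[0], 1))
    traj_h = np.concatenate([traj_px, ones], axis=1)  # (N, 3)
    # Apply the homography row-wise, then de-homogenize
    world_h = traj_h @ H.T                            # (N, 3)
    return world_h[:, :2] / world_h[:, 2:]            # (N, 2), in meters

# Hypothetical usage:
# H = np.loadtxt("eth_H.txt")
# traj_m = pixels_to_world(traj_px, H)
```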

Thank you for the data! @HarshayuGirase @DeepOzean
I will retrain my models to see if I can get similar results.

One quick question. Do you use augment_eth_ucy_social during training? If so, when do you use it?

```python
def augment_eth_ucy_social(train_batches, train_scenes, train_masks, train_images):
```

Best

Hi,

I just double checked -- we did augment the data during training. It should be used similarly to the augment_data(...) function that is called for the other datasets.
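Roughly, the call slots into the data-preparation step before training, something like the following (the surrounding variable names are illustrative, and I'm assuming the function returns the augmented versions of its four inputs):

```python
# Illustrative only: where the ETH/UCY augmentation call would sit.
# augment_eth_ucy_social is assumed to return rotated/flipped copies of the
# training batches together with the matching scene names, masks, and images.
train_batches, train_scenes, train_masks, train_images = augment_eth_ucy_social(
    train_batches, train_scenes, train_masks, train_images
)
# ...then build the dataloaders and proceed with training as usual.
```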

Hi,

Thank you for the dataset. I was able to reproduce similar numbers!

Hi, @sff1019 @HarshayuGirase
I'm also trying to reproduce the results of Y-Net on the ETH/UCY dataset, and I would like to ask you for a little help!

How can I apply the image2world function to the augmented scenes? It seems like I need new homography matrices for the augmented (rotated, flipped) scenes, and I'm not sure how to obtain them.

I would appreciate your help!

Thank you

@letme-hj All the matrices are shared via Google Drive (the link in the README). Using those should do the trick!

@sff1019 Thank you so much for the reply!!

Just to make sure: do you apply the same homography matrices to the rotated and flipped data as well?
For example, eth_H.txt for eth_rot90, eth_rot270, eth_fliplr_rot90, etc., with no special transformation of the homography matrix?

Thank you!

@letme-hj Yes, I think that should be fine! I only used those homography matrices and didn't modify much of the code (I just fixed the final evaluation part where ADE/FDE is calculated, for my own convenience), but still got very close or identical numbers.
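In case it helps, that evaluation boils down to mapping both the predicted and ground-truth pixel trajectories through the homography and measuring displacements in meters. A rough sketch (function names are illustrative, not the exact code from my fork):

```python
import numpy as np

def to_world(traj_px, H):
    """Pixel -> world coordinates (meters) via the homography, as image2world does."""
    traj_h = np.concatenate([traj_px, np.ones((len(traj_px), 1))], axis=1)
    world_h = traj_h @ H.T
    return world_h[:, :2] / world_h[:, 2:]

def ade_fde(pred_px, gt_px, H):
    """ADE/FDE in meters for one predicted trajectory vs. ground truth.

    pred_px, gt_px: (T, 2) pixel-coordinate trajectories.
    H: 3x3 homography (e.g. eth_H.txt), reused unchanged for the
       rotated/flipped scenes as discussed above.
    """
    err = np.linalg.norm(to_world(pred_px, H) - to_world(gt_px, H), axis=-1)
    return err.mean(), err[-1]  # ADE = mean displacement, FDE = final displacement
```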

@sff1019 wow that should make it much easier for me!!
Thank you very much for sharing!! :)


Hi @sff1019, would you mind sharing your training environment for Y-Net on ETH/UCY? For example, what kind of GPU(s) did you use, and how long did training take? Thank you so much!

Hi @sff1019,

I'm currently also in the process of verifying the results reported in the paper, unfortunately without much success so far. At this point, I'm a little stuck.
Do you perhaps still have the training script and/or the model weights from your runs? If so, would you mind sharing them? Anything would be highly appreciated!

Thanks in advance!

Hi @sff1019,can you share how you modify code bout ETH/CUY ? Or can I email you ?