Real-time Data Access
haoyulee95 opened this issue · 8 comments
Hello Team Aria,
I'm currently working on a project where I need to extract coordinates from eye gaze data, and I'm wondering if there's an API available for this. My understanding is that MPS is primarily for processing uploaded recordings, so I'm curious whether there's another API specifically designed for extracting eye gaze coordinates.
I hope you have a nice day!
Hey @haoyulee95
Can you please elaborate on your request?
Are you looking for 2D eye gaze coordinates on RGB images?
Hello @vijay609
Yes. I'm sorry I didn't make that clear.
Are you asking whether you can access the eye gaze model to extract gaze outputs yourself, without sending the data to MPS?
@haoyulee95 This page would be a first good point to understand how to read and process eyegaze data https://facebookresearch.github.io/projectaria_tools/docs/data_utilities/core_code_snippets/eye_gaze_code
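For reference, here is a minimal sketch of loading the MPS eye gaze output with projectaria_tools, along the lines of the snippets on that page. The CSV path is a placeholder, and the exact file name may differ depending on your MPS output version:

```python
from projectaria_tools.core import mps

# Path to the eye gaze CSV produced by MPS (placeholder path)
gaze_path = "/path/to/mps_output/eye_gaze/general_eye_gaze.csv"

# Load the gaze records; each entry holds yaw/pitch angles in the
# Central Pupil Frame (CPF) plus a capture timestamp
eye_gazes = mps.read_eyegaze(gaze_path)

gaze = eye_gazes[0]
print(gaze.tracking_timestamp, gaze.yaw, gaze.pitch)

# Convert the gaze direction into a 3D point in the CPF at a fixed depth (meters)
gaze_point_cpf = mps.get_eyegaze_point_at_depth(gaze.yaw, gaze.pitch, 1.0)
print(gaze_point_cpf)
```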
@haoyulee95
This notebook shows how to get the estimated 2D gaze coordinates on RGB image
https://github.com/facebookresearch/projectaria_tools/blob/main/core/examples/mps_quickstart_tutorial.ipynb
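For anyone following along, here is a condensed sketch of the projection step from that notebook, assuming you have a VRS recording plus the matching MPS eye gaze CSV (both paths are placeholders):

```python
from projectaria_tools.core import data_provider, mps
from projectaria_tools.core.stream_id import StreamId
from projectaria_tools.core.mps.utils import (
    get_gaze_vector_reprojection,
    get_nearest_eye_gaze,
)

vrs_path = "/path/to/recording.vrs"          # placeholder
gaze_path = "/path/to/general_eye_gaze.csv"  # placeholder

provider = data_provider.create_vrs_data_provider(vrs_path)
eye_gazes = mps.read_eyegaze(gaze_path)

# RGB camera stream and its calibration
rgb_stream_id = StreamId("214-1")
rgb_stream_label = provider.get_label_from_stream_id(rgb_stream_id)
device_calibration = provider.get_device_calibration()
rgb_camera_calibration = device_calibration.get_camera_calib(rgb_stream_label)

# Take one RGB frame and find the gaze sample closest in time to it
image_data, record = provider.get_image_data_by_index(rgb_stream_id, 0)
gaze = get_nearest_eye_gaze(eye_gazes, record.capture_timestamp_ns)

# Project the gaze direction onto the RGB image at an assumed depth of 1 m
gaze_pixel = get_gaze_vector_reprojection(
    gaze,
    rgb_stream_label,
    device_calibration,
    rgb_camera_calibration,
    1.0,
)
print(gaze_pixel)  # 2D pixel coordinates, or None if it falls outside the image
```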
Thanks @SeaOtocinclus @gsomas,
Yes, I have successfully run the tutorial, and it's functioning as expected. However, the data I've been working with has already been processed by MPS, which is provided by Meta. My issue is how to extract this information directly from the glasses without that prior processing step.
@gsomas
Thank you for your response. Yes, that's the issue I'm working to resolve.
@haoyulee95
The glasses currently only capture the eye images. When the user uploads the VRS to Meta (via MPS), it is processed by our AI models to generate gaze information, which is then sent back to the user.
We do not yet have a solution to extract eye gaze without uploading the data to Meta.
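In other words, the only gaze-related data available locally is the raw eye-tracking camera stream inside the VRS file. As a rough sketch (assuming "211-1" is the eye-tracking camera stream ID and using a placeholder path), you can read those images with projectaria_tools, but turning them into gaze angles still requires MPS today:

```python
from projectaria_tools.core import data_provider
from projectaria_tools.core.stream_id import StreamId

provider = data_provider.create_vrs_data_provider("/path/to/recording.vrs")  # placeholder

# "211-1" is the Aria eye-tracking camera stream
eye_stream_id = StreamId("211-1")
num_frames = provider.get_num_data(eye_stream_id)

# Grab the first eye image as a numpy array; these are just the raw
# pixels that MPS consumes, with no gaze angles attached
image_data, record = provider.get_image_data_by_index(eye_stream_id, 0)
eye_image = image_data.to_numpy_array()
print(num_frames, eye_image.shape, record.capture_timestamp_ns)
```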
@vijay609,
Thank you so much for the reply.
Have a great day!