Eye tracking in VR

(This project is in progress)

Honours thesis regarding eye-tracking applications:

  1. Desktop-based application with Tobii device

  2. AR application on a Meta Quest headset

Progress

Week 2

Unity packages:

https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.1/manual/index.html

MRTK-Unity: https://github.com/microsoft/MixedRealityToolkit-Unity. MRTK2 eye tracking only works on HoloLens; MRTK3 works on other devices (e.g. Meta Quest) but requires more research into the compatibility of its eye-tracking toolkits.

Related work on eye-tracking

Using Eyetracking to Control my Game

How To Track Eye Movement In Augmented Reality (Part 1 Face Mesh And BlendShapes)

Unity3d with AR Foundation - How To Setup and Implement AR Eye Tracking?

MRTK on Oculus Quest 2

Add Eye Tracking Features With Oculus Integration For Unity

Week 3

VR setup in Unity

Oculus Integration SDK

Oculus XR plugin

Task: Follow the video and code the app

Source code

Week 4

Cast the VR screen to a PC using SideQuest (download link)

Look into reducing the rays' jittery movement by not moving the ray when the change in eye gaze is minimal

Turn the ray into a dot for better selection
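A minimal sketch combining both ideas (an angular deadzone to suppress jitter, plus a dot reticle at the hit point), assuming a `gazeSource` transform driven by the eye-gaze API; all class and field names are illustrative, not the project's actual code:

```csharp
using UnityEngine;

// Sketch: only update the gaze direction when it changes by more than a
// small angular threshold, and draw a dot at the hit point instead of a
// full ray. "gazeSource" stands in for whatever transform the eye-gaze
// API drives.
public class StabilizedGazePointer : MonoBehaviour
{
    [SerializeField] private Transform gazeSource;   // eye-gaze pose
    [SerializeField] private Transform dotReticle;   // small sphere/quad
    [SerializeField] private float deadzoneDegrees = 1.5f;
    [SerializeField] private float maxDistance = 20f;

    private Vector3 stableDirection;

    private void Start()
    {
        stableDirection = gazeSource.forward;
    }

    private void Update()
    {
        // Ignore micro-movements below the deadzone.
        if (Vector3.Angle(stableDirection, gazeSource.forward) > deadzoneDegrees)
            stableDirection = gazeSource.forward;

        // Place the dot reticle at the first hit point, or at max distance.
        Ray ray = new Ray(gazeSource.position, stableDirection);
        dotReticle.position = Physics.Raycast(ray, out RaycastHit hit, maxDistance)
            ? hit.point
            : ray.GetPoint(maxDistance);
    }
}
```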

Week 5

eyetracking.-.Made.with.Clipchamp.1.mp4

The Oculus eye-gaze API does not work well with sideways eye gaze. To decrease the offset between the ray and the object, the user needs to face the object directly.

Week 6

Read abstracts from research papers:

| Title | DOI | Keywords | Source |
| --- | --- | --- | --- |
| Prospective on Eye-Tracking-based Studies in Immersive Virtual Reality | 10.1109/CSCWD49262.2021.9437692 | tracking gaze points to manipulate the VR environment | https://ieeexplore.ieee.org/abstract/document/9437692 |
| A comparative study of eye tracking and hand controller for aiming tasks in virtual reality | 10.1145/3317956.3318153 | gaze aiming compared to a controller in an "aim and shoot" task; qualitative data was gathered | https://dl.acm.org/doi/abs/10.1145/3317956.3318153 |
| A Fitts' Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces | 10.1145/3544548.3581423 | gaze for target pre-selection; Gaze&Finger and Gaze&Handray techniques | https://dl.acm.org/doi/abs/10.1145/3544548.3581423 |
| Comparison of Eye-Based and Controller-Based Selection in Virtual Reality | 10.1080/10447318.2020.1826190 | Fitts' modelling of eye-based selection | https://www.tandfonline.com/doi/abs/10.1080/10447318.2020.1826190 |
| Pinch, Click, or Dwell: Comparing Different Selection Techniques for Eye-Gaze-Based Pointing in Virtual Reality | 10.1145/3448018.3457998 | subjects pointed with (eye-)gaze and selected/activated targets by pinching, clicking a button, or dwelling | https://dl.acm.org/doi/abs/10.1145/3448018.3457998 |
| Evaluating ray casting and two gaze-based pointing techniques for object selection in virtual reality | 10.1145/3281505.3283382 | interaction techniques: ray casting, dwell time, and gaze trigger in a simple object-selection task | https://dl.acm.org/doi/abs/10.1145/3281505.3283382 |
| Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection | 10.1145/3332165.3347921 | | https://dl.acm.org/doi/abs/10.1145/3332165.3347921 |
| EyePointing: A Gaze-Based Selection Technique | 10.1145/3317956.3318153 | combines the MAGIC pointing technique with the referential mid-air pointing gesture to select objects at a distance | https://dl.acm.org/doi/abs/10.1145/3317956.3318153 |

Similar eye gaze project

Week 7

Find research papers relating eye tracking to menu layouts

Excel sheet Details doc

Week 8 (reading break)

Week 9 - 10

  1. Fine-tune the eye-tracking app

  2. Come up with research ideas. Link

Week 11

Develop a test application that controls moving objects with controllers

movingObjController.mp4

Develop a test application that controls moving objects with eye gazing and hand pinching

Demonstration

handpinching.mp4
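A minimal sketch of the gaze-plus-pinch selection shown in the video, assuming the Oculus Integration `OVRHand` component for pinch detection and a transform driven by the eye-gaze API; the `Selectable` script is an illustrative stand-in for the project's own object script:

```csharp
using UnityEngine;

// Sketch: select whatever the eye-gaze ray points at when the index
// finger pinches. OVRHand comes from the Oculus Integration SDK.
public class GazePinchSelector : MonoBehaviour
{
    [SerializeField] private Transform gazeSource; // pose from eye-gaze API
    [SerializeField] private OVRHand hand;         // tracked hand
    [SerializeField] private float maxDistance = 20f;

    private bool wasPinching;

    private void Update()
    {
        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Trigger once on the pinch's rising edge, not every frame.
        if (isPinching && !wasPinching &&
            Physics.Raycast(gazeSource.position, gazeSource.forward,
                            out RaycastHit hit, maxDistance))
        {
            var target = hit.collider.GetComponent<Selectable>();
            if (target != null)
                target.OnSelected();
        }
        wasPinching = isPinching;
    }
}

// Illustrative target script attached to the moving objects.
public class Selectable : MonoBehaviour
{
    public void OnSelected() => Debug.Log($"Selected {name}");
}
```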

Week 12 (December 21, 2023)

Migrated the application to a new repository to support more advanced functionality

New repository

New features completed:

  • Implemented ray casting from the head: as the user moves their head, a ray cast from the head produces a list of the objects it collides with (see the sketch after this list)
  • Integrated it with the existing eye-gaze ray casting
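A minimal sketch of the head-ray idea, assuming the centre-eye camera transform stands in for the head pose; class and field names are illustrative. The same pattern applies to the eye-gaze ray, with only the source transform differing:

```csharp
using UnityEngine;

// Sketch: cast a ray from the head every frame and collect all objects
// it passes through, not just the nearest one.
public class HeadRaycaster : MonoBehaviour
{
    [SerializeField] private Transform head;       // e.g. CenterEyeAnchor
    [SerializeField] private float maxDistance = 20f;

    private void Update()
    {
        // RaycastAll returns every collider along the ray.
        RaycastHit[] hits = Physics.RaycastAll(
            head.position, head.forward, maxDistance);

        foreach (RaycastHit hit in hits)
        {
            // e.g. tint the hit object red, as in the demo video below
            Debug.Log($"Head ray hit {hit.collider.name}");
        }
    }
}
```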

Demonstration

Video description: the upper ray (yellow) is the head control; the lower rays (purple) are the eye control. Both ray types turn red when they hit an object.

demo.mp4

Week 13 - 17

Winter break

Week 18 (February 1, 2024)

  • Implemented a UI canvas for configuration options (see the sketch after this list):
    • Technique:
      • Eye raycasting
      • Head raycasting
      • Head + Eye raycasting
      • Hand controllers
    • Object speed: 5, 10, 15
    • Object size: 10x10, 20x20, 25x25
    • Number of selections: 5, 10, 15
  • Configured the game based on the user's options
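A minimal sketch of how the configuration could be modelled in code; the enum and field names mirror the options listed above but are assumptions, and the wiring to the UI dropdowns is omitted:

```csharp
using UnityEngine;

// Sketch: data model for the configuration canvas.
public enum SelectionTechnique
{
    EyeRaycasting,
    HeadRaycasting,
    HeadPlusEyeRaycasting,
    HandControllers
}

[System.Serializable]
public class GameConfig
{
    public SelectionTechnique technique = SelectionTechnique.EyeRaycasting;
    public float objectSpeed = 5f;                      // 5, 10, or 15
    public Vector2 objectSize = new Vector2(10f, 10f);  // 10x10, 20x20, or 25x25
    public int selectionCount = 5;                      // 5, 10, or 15 per run
}
```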

Week 19 (February 8, 2024)

  • Researched how to use facial expressions and eye blinking to make selections (besides hand pinching)

Oculus SDK
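A minimal sketch of right-eye-blink selection using the `OVRFaceExpressions` component from the Oculus SDK; the exact blendshape names and thresholds here are assumptions worth verifying against the SDK version in use:

```csharp
using UnityEngine;

// Sketch: treat a deliberate right-eye wink (right eye closed, left eye
// open) as a selection trigger. Blendshape weights come from
// OVRFaceExpressions; the 0.7/0.3 thresholds are guesses to tune.
public class BlinkSelector : MonoBehaviour
{
    [SerializeField] private OVRFaceExpressions face;
    [SerializeField] private float closedThreshold = 0.7f;
    [SerializeField] private float openThreshold = 0.3f;

    private bool wasWinking;

    private void Update()
    {
        if (!face.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.EyesClosedR, out float right) ||
            !face.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.EyesClosedL, out float left))
            return; // face tracking not ready yet

        bool isWinking = right > closedThreshold && left < openThreshold;

        // Fire once per wink, on the rising edge.
        if (isWinking && !wasWinking)
            Debug.Log("Right-eye blink: select the gazed-at object here");

        wasWinking = isWinking;
    }
}
```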

Demonstration

  1. Select object with right-eye blinking
eyeblink.1.mp4
  2. Select object with hand pinching
handpinch.1.mp4

Week 20 (March 28, 2024)

  • Add head-range selection: objects within the range of the head direction slow down (see the sketch at the end of this section)
com.DefaultCompany.MovingObjController-20240328-180944.mp4
  • Connect to Firebase

Screenshot 2024-03-28 181354
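A minimal sketch of the head-range slowdown, assuming a cone around the head's forward direction; `MovingObject` and its speed fields are illustrative stand-ins for the project's own mover script:

```csharp
using UnityEngine;

// Sketch: slow down any moving object inside a cone around the head's
// forward direction, and restore its speed once it leaves the cone.
public class HeadRangeSlowdown : MonoBehaviour
{
    [SerializeField] private Transform head;         // e.g. CenterEyeAnchor
    [SerializeField] private float coneAngleDegrees = 15f;
    [SerializeField] private float slowFactor = 0.3f;

    private void Update()
    {
        foreach (MovingObject obj in FindObjectsOfType<MovingObject>())
        {
            Vector3 toObject = obj.transform.position - head.position;
            bool inCone = Vector3.Angle(head.forward, toObject) < coneAngleDegrees;

            obj.speedMultiplier = inCone ? slowFactor : 1f;
        }
    }
}

// Illustrative mover: drifts along its forward axis at a scaled speed.
public class MovingObject : MonoBehaviour
{
    public float baseSpeed = 5f;
    [HideInInspector] public float speedMultiplier = 1f;

    private void Update() =>
        transform.position +=
            transform.forward * baseSpeed * speedMultiplier * Time.deltaTime;
}
```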