
EyeTrackKit

  • An iOS framework that enables developers to use eye-tracking information with ARKit content.

Key Features

  • Acquire eye-tracking information
    • face position & rotation
    • eye positions
    • lookAtPosition in world coordinates
    • lookAtPoint on the device screen
    • blink detection
    • distance between the face and the device
  • Record video while acquiring data
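As an illustration of how this information might be consumed from SwiftUI, the sketch below displays a few of the values listed above. The `EyeTrackViewModel` type and its property names (`lookAtPoint`, `distance`, `isBlinking`) are assumptions for illustration only, not the framework's actual API:

```swift
import SwiftUI
import EyeTrackKit

struct EyeDataView: View {
    // Hypothetical observable wrapper around the framework's tracker;
    // the property names below are illustrative assumptions.
    @ObservedObject var eyeTrack: EyeTrackViewModel

    var body: some View {
        VStack(alignment: .leading) {
            Text("Look-at point: \(eyeTrack.lookAtPoint.x), \(eyeTrack.lookAtPoint.y)")
            Text("Distance to device: \(eyeTrack.distance) m")
            Text(eyeTrack.isBlinking ? "Blinking" : "Eyes open")
        }
    }
}
```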

Example Projects

To try the example projects, clone this repository and open the Examples folder.

Compatibility

EyeTrackKit is compatible with iOS devices that support ARKit face tracking.
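Eye tracking builds on ARKit face tracking, so support can be checked at runtime with ARKit's standard `ARFaceTrackingConfiguration.isSupported` API (the surrounding structure is a sketch):

```swift
import ARKit

// Face tracking requires a TrueDepth camera; this is ARKit's
// standard check before starting a face-tracking session.
if ARFaceTrackingConfiguration.isSupported {
    // Safe to start eye tracking on this device.
} else {
    // Fall back or inform the user the device is unsupported.
}
```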

EyeTrackKit requires:

  • SwiftUI
  • iOS 13 or later
  • Swift 5.3 or later

Installation

Swift Package Manager (requires Xcode 11.2 or later)

  1. In Xcode, select File > Swift Packages > Add Package Dependency.
  2. Follow the prompts using the URL for this repository.
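Alternatively, if you manage dependencies in a `Package.swift` manifest, the dependency entry would look like the sketch below. The repository URL and version here are assumptions — substitute the actual repository URL and the release you want:

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // URL and version are illustrative; use this repository's actual URL.
        .package(url: "https://github.com/ukitomato/EyeTrackKit.git", from: "1.0.0")
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["EyeTrackKit"])
    ]
)
```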

ARVideoKit settings (adapted from ARVideoKit's README.md)

  1. Make sure you add the camera, microphone, and photo library usage descriptions to the app's Info.plist:
<key>NSCameraUsageDescription</key>
<string>AR Camera</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>Export AR Media</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Export AR Media</string>
<key>NSMicrophoneUsageDescription</key>
<string>Audiovisual Recording</string>
  2. Import ARVideoKit in the application delegate AppDelegate.swift and in the UIViewController with an ARKit scene.

  3. In AppDelegate.swift, add the following so the framework can identify the supported device orientations. This is recommended if the application supports landscape orientations:

func application(_ application: UIApplication, supportedInterfaceOrientationsFor window: UIWindow?) -> UIInterfaceOrientationMask {
    return ViewAR.orientation
}

License

MIT

Author

Yuki Yamato [ukitomato]