- An iOS framework that enables developers to use eye-tracking information with ARKit content.
- Acquire eye-tracking info:
  - face position & rotation
  - eye positions
  - lookAtPosition in the world
  - lookAtPoint on the device screen
  - blink
  - distance
- Record video while acquiring data
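EyeTrackKit builds on ARKit face tracking. As a rough sketch of the underlying data it surfaces (plain ARKit only; EyeTrackKit's own API may name these differently):

```swift
import ARKit

// A minimal sketch of the raw ARKit face-tracking data that EyeTrackKit
// exposes. Plain ARKit types only; EyeTrackKit's actual API may differ.
class FaceTrackingDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }

        let facePose = face.transform                // face position & rotation (world space)
        let leftEye = face.leftEyeTransform          // eye poses, relative to the face
        let rightEye = face.rightEyeTransform
        let lookAt = face.lookAtPoint                // gaze target in face coordinates
        let blink = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0  // 0 (open) ... 1 (closed)

        // Distance between the camera and the face.
        let cameraPosition = renderer.pointOfView?.simdWorldPosition ?? .zero
        let facePosition = SIMD3(facePose.columns.3.x, facePose.columns.3.y, facePose.columns.3.z)
        let distance = simd_length(facePosition - cameraPosition)

        _ = (leftEye, rightEye, lookAt, blink, distance)
    }
}
```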
To try the example project, clone this repository and open the `Examples` folder.
`EyeTrackKit` is compatible with iOS devices that support ARKit.
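Because eye tracking builds on ARKit face tracking, availability can also be checked at runtime with plain ARKit; the helper name below is illustrative:

```swift
import ARKit

// Eye tracking relies on ARKit face tracking, which requires supported
// hardware (e.g. a TrueDepth camera). Check availability before starting.
func isEyeTrackingAvailable() -> Bool {
    ARFaceTrackingConfiguration.isSupported
}
```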
`EyeTrackKit` requires:
- SwiftUI
- iOS 13
- Swift 5.3 or higher
- Language: Swift
- Xcode: 12.0.1 (12A7300)
- Libraries: ARVideoKit
- In Xcode, select File > Swift Packages > Add Package Dependency.
- Follow the prompts using the URL for this repository.
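Alternatively, the dependency can be declared in a `Package.swift` manifest. The repository URL and version below are assumptions based on the author's GitHub handle; adjust them to the actual repository and release:

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v13)],
    dependencies: [
        // Assumed repository URL and version; adjust to the actual release.
        .package(url: "https://github.com/ukitomato/EyeTrackKit.git", from: "1.0.0")
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["EyeTrackKit"])
    ]
)
```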
ARVideoKit's settings (referenced from ARVideoKit's README.md):
- Make sure you add the usage descriptions of the `camera`, `microphone`, and `photo library` in the app's `Info.plist`.
```xml
<key>NSCameraUsageDescription</key>
<string>AR Camera</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>Export AR Media</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Export AR Media</string>
<key>NSMicrophoneUsageDescription</key>
<string>Audiovisual Recording</string>
```
- `import ARVideoKit` in the application delegate `AppDelegate.swift` and in a `UIViewController` with an `ARKit` scene.
- In the application delegate `AppDelegate.swift`, add this 👇 in order to allow the framework to access and identify the supported device orientations. Recommended if the application supports landscape orientations.
```swift
func application(_ application: UIApplication, supportedInterfaceOrientationsFor window: UIWindow?) -> UIInterfaceOrientationMask {
    return ViewAR.orientation
}
```
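Recording is then a matter of a few calls. A minimal sketch using ARVideoKit's `RecordAR` directly (EyeTrackKit may wrap this differently); the view-controller wiring is illustrative:

```swift
import ARKit
import ARVideoKit
import UIKit

// Illustrative wiring only: records the ARKit scene while the session runs.
class EyeTrackingViewController: UIViewController {
    let sceneView = ARSCNView()
    var recorder: RecordAR?

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(sceneView)
        sceneView.frame = view.bounds

        let configuration = ARFaceTrackingConfiguration()
        recorder = RecordAR(ARSceneKit: sceneView)   // ARVideoKit recorder
        recorder?.prepare(configuration)             // prepare with the session config
        sceneView.session.run(configuration)
    }

    func startRecording() {
        recorder?.record()            // begin capturing the AR scene
    }

    func finishRecording() {
        recorder?.stopAndExport()     // stop and export to the photo library
    }
}
```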
Yuki Yamato [ukitomato]