Core by Alter
Core by Alter is a cross-platform core technology and SDK powering the Alter SDK, consisting of a real-time 3D avatar system and facial motion capture. It's built from scratch for web3 interoperability and the open metaverse. Easily pipe avatars into your game, app, or website. It just works. Check out the included code samples to learn how to get started, or try the live web demo or TestFlight.
Please star us ⭐⭐⭐ on GitHub—it motivates us a lot!

Supported platforms:

- iOS 13+
- Android 8+
- WebGL 2
- macOS (Soon)
- Windows (Soon)
- Unity (Soon)
- Unreal (Soon)

Supported avatar formats:

- Head only
- A bust with clothing
- A bust with clothing and background (Soon)
- Accessories only (e.g. for AR filters) (Soon)
- Full body (Soon)

Features:

- 42 tracked facial expressions via blendshapes
- Eye tracking including eye gaze vector
- Tongue tracking
- Light & fast, just 3MB ML model size
- ≤ ±50° pitch, ≤ ±40° yaw and ≤ ±30° roll tracking coverage
- 3D reprojection to input photo/video
- Platform-suited API and packaging with internal optimizations
- Simultaneous back and front camera support
- Powered by mocap4face

Tracking input:

- Any webcam (see the browser snippet after this list)
- Photo
- Video
- Audio
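
For the web target, a webcam is just a standard browser MediaStream. The sketch below shows only the browser side of acquiring it; how the stream is handed to the tracker is SDK-specific and intentionally left out, so see the js-example project for the real wiring.

```ts
// Minimal sketch: acquire a front-facing webcam stream in the browser.
// Passing the stream to the tracker is not shown here; that part depends on
// the SDK API (see the js-example project).
const video = document.createElement("video");
video.autoplay = true;
video.playsInline = true;

navigator.mediaDevices
  .getUserMedia({ video: { facingMode: "user" }, audio: false })
  .then((stream) => {
    video.srcObject = stream;
    return video.play();
  })
  .catch((err) => {
    console.error("Could not open the webcam:", err);
  });
```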

Tracking output (an illustrative data shape is sketched after this list):

- ARKit-compatible blendshapes
- Head position and scale in 2D and 3D
- Head rotation in world coordinates
- Eye tracking including eye gaze vector
- 3D reprojection to the input photo/video
- Tongue tracking
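
Purely as an illustration of the list above, the output can be pictured as a structure like the following. These are not the SDK's real types; every field name here is an assumption made for readability.

```ts
// Illustrative only -- NOT the actual @0xalter/alter-core types.
interface TrackingResultSketch {
  blendshapes: Record<string, number>;                  // 42 ARKit-compatible keys, values 0..1
  headPosition2D: { x: number; y: number };             // in input-image coordinates
  headPosition3D: { x: number; y: number; z: number };  // in world coordinates
  headScale: number;
  headRotation: { pitch: number; yaw: number; roll: number }; // degrees, world coordinates
  eyeGaze: { x: number; y: number; z: number };         // gaze direction vector
  tongue: number;                                       // tongue tracking value, 0..1
}
```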

Performance:

- 50 FPS on Pixel 4
- 60 FPS on iPhone SE (1st gen)
- 90 FPS on iPhone X or newer

Register in Alter Studio to get a unique key to access avatar data from our servers. See our example code for where to put the key; look for "YOUR-API-KEY-HERE".
iOS: To run the example, open the attached Xcode project and run it on your iPhone or iPad.
Do not forget to get your API key at studio.alter.xyz and paste it into the code. Look for "YOUR-API-KEY-HERE".
To install via Swift Package Manager, add this repository as a dependency in your Package.swift or Xcode project.
Alternatively, download the AlterCore.xcframework from this repository and drag & drop it into your Xcode project.
Android: To run the example, open the android-example project in Android Studio and run it on your Android phone.
Do not forget to get your API key at studio.alter.xyz and paste it into the code. Look for "YOUR-API-KEY-HERE".
Add this repository to your Gradle repositories in build.gradle:
repositories {
    // ...
    maven {
        name = "Alter"
        url = uri("https://facemoji.jfrog.io/artifactory/default-maven-local/")
    }
    // ...
}
// ...
dependencies {
    implementation "alter:alter-core:0.14.5"
}
Web: To run one of the provided examples, go to the js-example project and use npm install and npm run {exampleName} (e.g. npm run renderAvatar or npm run deSerialization). See package.json for a list of all examples.
Do not forget to get your API key at studio.alter.xyz and paste it into the code. Look for "YOUR-API-KEY-HERE".
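
As a minimal sketch of where that key ends up in a web app, the snippet below stores it in a constant and passes it to a placeholder initializer. The initAlterCore name is hypothetical and stands in for whichever entry point the js-example code actually calls.

```ts
// Hypothetical initializer -- NOT the documented @0xalter/alter-core API;
// it only marks where the real entry point from the js-example project goes.
declare function initAlterCore(options: { apiKey: string }): Promise<unknown>;

// Replace with the key generated for you in Alter Studio (studio.alter.xyz).
const API_KEY = "YOUR-API-KEY-HERE";

async function boot(): Promise<void> {
  const core = await initAlterCore({ apiKey: API_KEY });
  console.log("Alter Core initialized", core);
}

boot().catch(console.error);
```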
Install the dependency via npm or yarn:
npm install @0xalter/alter-core@0.14.5
If you are using a bundler (such as Webpack), make sure to copy the assets from @0xalter/alter-core to your serving directory. See our Webpack config for an example of what needs to be copied.
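
A sketch of what that can look like with copy-webpack-plugin is below. The exact asset path inside @0xalter/alter-core is an assumption here; verify it against the package contents and our Webpack config.

```ts
// webpack.config.ts -- sketch of copying the SDK's runtime assets next to your bundle.
// The "assets" folder name inside the package is an assumption; check
// node_modules/@0xalter/alter-core for the real layout.
import CopyPlugin from "copy-webpack-plugin";
import type { Configuration } from "webpack";

const config: Configuration = {
  // ...your existing entry/output/module settings...
  plugins: [
    new CopyPlugin({
      patterns: [
        { from: "node_modules/@0xalter/alter-core/assets", to: "assets" },
      ],
    }),
  ],
};

export default config;
```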
This library is provided under the Alter SDK License Agreement. The sample code in this repository is provided under the Alter Samples License.
This library uses open-source software; see the list of our OSS dependencies and license notices.
Core by Alter fits any app or game experience that uses an avatar as a profile picture or for character animations. The only limit is your imagination:
- Audio-only chat apps
- Next-gen profile pics
- Live avatar experiences
- Snapchat-like lenses
- AR experiences
- VTubing apps
- Live streaming apps
- Face filters
- Personalized stickers
- AR games with facial triggers
- Role-playing games
This is alpha-release software; we are still ironing out bugs, adding new features, and changing the data:
- Item names within an Avatar Matrix can change
- The SDK is not yet 100% thread-safe; race conditions or memory leaks can occasionally occur
- Documentation is still sparse; make sure to join our Discord or file an issue if you encounter problems