A toolkit for creating AR visual effects from environment and human 3D data (retrieved from the LiDAR sensor on AR-enabled devices). This is a noncommercial tool for exploring the possibilities of visual effects in AR; be prepared for unexpectedly jittery or unstable results on mobile devices.
- Unity 2021.2 or later
- Universal Render Pipeline (URP)
- VFX Graph package
- AR Foundation + ARKit (ARCore is untested but should work in theory)
- An AR-enabled device with a LiDAR sensor (iPad Pro, iPhone 12 Pro, etc.)
- Install and configure AR Foundation correctly, following these instructions
- Build one of the sample scenes from https://github.com/Unity-Technologies/arfoundation-samples and make sure it runs successfully on your device before moving on to the next step.
- Pull the toolkit branch from this repo, then build the example scene and try it out (a minimal runtime sanity check is sketched below).
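Before building your own scene, it can help to confirm the device is actually delivering LiDAR-backed depth and stencil data. The snippet below is a minimal sketch, not part of the toolkit — the class and log output are illustrative — and only uses AR Foundation's `AROcclusionManager` textures:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sanity check: logs whether the occlusion textures this
// toolkit depends on are being produced on the current device.
public class LidarAvailabilityCheck : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager; // from your AR session setup

    void Update()
    {
        // These are null until the subsystem starts delivering frames
        // (or permanently, if the device does not support them).
        bool hasEnvDepth = occlusionManager.environmentDepthTexture != null;
        bool hasHumanStencil = occlusionManager.humanStencilTexture != null;
        Debug.Log($"environment depth: {hasEnvDepth}, human stencil: {hasHumanStencil}");
    }
}
```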
- LidarDataProcessor: processes environmental and human data (e.g. depth / stencil) and prepares it in a form the VFX binder can consume. How to use: add an empty GameObject to your scene, add LidarDataProcessor to it, then drag the respective components from your scene into the parameter list. (See the sketch after this list.)
- VFXLidarDataBinder: binds the AR scene data to the VFX Graph. How to use: add a Visual Effect and a VFX Property Binder component to your VFX GameObject, click the "+" button on the binder, and select Lidar Data. (In VFXPropertyMenu, select the property you will use in your VFX Graph.) An illustrative binder sketch also follows this list.
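The repo's sources are authoritative; as a rough illustration only, a LidarDataProcessor-style component might simply gather the AR textures each frame and expose them for a binder to pick up. Everything below (class shape, field and property names) is an assumption for illustration:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch only -- the real LidarDataProcessor lives in this repo.
// Gathers depth/stencil textures from AR Foundation each frame so that a
// VFX binder can forward them to a graph.
public class LidarDataProcessorSketch : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager; // drag from your AR session
    [SerializeField] ARMeshManager meshManager;           // source of the environment mesh

    public Texture EnvironmentDepth { get; private set; }
    public Texture HumanStencil { get; private set; }
    public Texture HumanDepth { get; private set; }

    void Update()
    {
        EnvironmentDepth = occlusionManager.environmentDepthTexture;
        HumanStencil = occlusionManager.humanStencilTexture;
        HumanDepth = occlusionManager.humanDepthTexture;
    }
}
```

Custom entries in the binder's "+" menu come from subclassing `VFXBinderBase` in the VFX Graph package. A single-texture binding sketch (menu path, property, and class names assumed, not the toolkit's actual code) looks roughly like:

```csharp
using UnityEngine;
using UnityEngine.VFX;
using UnityEngine.VFX.Utility;

// Illustrative VFXBinderBase subclass; the toolkit's real VFXLidarDataBinder
// binds more properties than this single texture.
[VFXBinder("Lidar Data (sketch)")]
public class LidarTextureBinderSketch : VFXBinderBase
{
    // Name of the exposed Texture2D property in your VFX Graph.
    [VFXPropertyBinding("UnityEngine.Texture2D")]
    public ExposedProperty property = "HumanStencil";

    public Texture source; // e.g. fed from the processor component above

    public override bool IsValid(VisualEffect component) =>
        source != null && component.HasTexture(property);

    public override void UpdateBinding(VisualEffect component) =>
        component.SetTexture(property, source);
}
```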
- Environment Mesh Position: reads vertices from the environment mesh and sets particle positions from them
- Human Froxel: places particles on the detected human (refer to the DemoExample VFX Graph for usage)
- Kill Nonhuman: removes particles that fall outside the human stencil (refer to the DemoExample VFX Graph for usage; the sketch below paraphrases this test)
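These blocks run inside the graph on the GPU, but the test at the heart of Kill Nonhuman is easy to state in plain code: project the particle into screen space, sample the human stencil there, and kill the particle when the sample is empty. A CPU-side paraphrase of that logic (names hypothetical, and assuming a readable stencil texture):

```csharp
using UnityEngine;

public static class StencilTest
{
    // Returns false when a particle's world position projects onto a
    // stencil texel with no human coverage -- i.e. the particle should die.
    public static bool IsOnHuman(Vector3 worldPos, Camera cam, Texture2D humanStencil)
    {
        Vector3 vp = cam.WorldToViewportPoint(worldPos);
        if (vp.z < 0f || vp.x < 0f || vp.x > 1f || vp.y < 0f || vp.y > 1f)
            return false; // behind the camera or off-screen

        // The stencil is ~1 on human pixels and 0 elsewhere.
        return humanStencil.GetPixelBilinear(vp.x, vp.y).r > 0.5f;
    }
}
```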
- Environment
  - ARVFX_ENV0_Trim_Trim.mp4
  - ARVFX_ENV1.mp4
- Human