Generative Wall Art with Apple Vision Pro
A compact example project that demonstrates core visionOS functionality, accompanied by a YouTube tutorial series in which we code this example from scratch.
If you want to get started with development for Apple Vision Pro, this is the perfect starting point.
Tutorials
- Part 0: Building a Vision Pro App with SwiftUI & RealityKit
- Part 1: Window Group & Immersive Space
- Part 2: RealityKit Entities & Anchors
- Part 3: SwiftUI Attachments, Combine, Reality Composer Pro Animations
- More videos coming soon!
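Part 1 covers the two scene types used in this project. As a rough sketch of that setup (the type names `ContentView` and `ImmersiveView` are placeholders, not necessarily the names used in the project):

```swift
import SwiftUI

@main
struct GenerativeWallArtApp: App {
    var body: some Scene {
        // A standard 2D window shown at launch.
        WindowGroup {
            ContentView()
        }

        // A full immersive space; opened on demand from SwiftUI
        // via the openImmersiveSpace environment action.
        ImmersiveSpace(id: "ImmersiveSpace") {
            ImmersiveView()
        }
    }
}
```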
visionOS APIs Used
- Scene Types (WindowGroup & ImmersiveSpace)
- RealityKit
- AnchorEntity (Plane Detection & Head Tracking)
- ModelEntity
- BillboardSystem
- ParticleEmitterComponent
- SwiftUI Attachments
- SimpleMaterial
- TextureResource
- SwiftUI
- RealityView
- Observable Macro
- Animations
- UIKit
- UIBezierPath
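Several of the APIs above come together in a `RealityView`. A minimal sketch, assuming a vertical-plane anchor and a programmatically built plane (the entity sizes and names are illustrative, not taken from the project):

```swift
import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Anchor content to a detected vertical plane (e.g. a wall).
            let wallAnchor = AnchorEntity(
                .plane(.vertical,
                       classification: .wall,
                       minimumBounds: [1, 1])
            )

            // A simple programmatic canvas: a 1 m x 1 m plane
            // with an unlit-looking SimpleMaterial.
            let canvas = ModelEntity(
                mesh: .generatePlane(width: 1, height: 1),
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )
            wallAnchor.addChild(canvas)
            content.add(wallAnchor)
        }
    }
}
```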
3D Content
This demo handles 3D content in two ways. First, the main character, including its animations and particle effects, is created in Reality Composer Pro. Second, the image canvas and its resources are created programmatically in Swift.
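Both approaches can be sketched as follows. This assumes the default Reality Composer Pro package name `RealityKitContent`; the entity name `"Scene"` and the helper `generateImage()` are hypothetical stand-ins for the project's actual assets and image-generation code:

```swift
import RealityKit
import RealityKitContent // assumed name of the Reality Composer Pro package

func setUpContent(_ content: RealityViewContent) async {
    // 1. Character authored in Reality Composer Pro
    //    ("Scene" is an assumed entity name in the .rkassets bundle).
    if let character = try? await Entity(named: "Scene",
                                         in: realityKitContentBundle) {
        content.add(character)
    }

    // 2. Programmatic canvas: a plane textured from a CGImage
    //    generated at runtime (generateImage() is hypothetical).
    if let cgImage = generateImage(),
       let texture = try? await TextureResource.generate(
           from: cgImage,
           options: .init(semantic: .color)
       ) {
        var material = SimpleMaterial()
        material.color = .init(texture: .init(texture))
        let canvas = ModelEntity(
            mesh: .generatePlane(width: 1, height: 1),
            materials: [material]
        )
        content.add(canvas)
    }
}
```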
Setup
To run this project you need Xcode 15 Beta 5 (or later) and the visionOS 1.0 SDK, which you can download here.
- Clone the repo.
- Open the project in Xcode 15.
- Select the root project GenerativeWallArt in the Project navigator.
- Go to Signing & Capabilities.
- Select your development team.
- Select the Apple Vision Pro simulator as the run destination.
- Build and run the project.
- In the simulator, select the museum scene and move the character to the empty wall.
- Tap the character to start the demo.