Camera

Real-time camera filters with Core Image, Metal, and functional Swift

Primary language: Swift · License: MIT


I am recreating Instagram-like filters using Core Image, Metal, and ideas from functional programming.

At the moment, the app displays a simple Metal view that renders real-time frames from your camera. Each frame goes through the Core Image pipeline and is processed by the Noir filter (CIPhotoEffectNoir); all of the work stays on the GPU.
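
For reference, a minimal sketch of that pipeline could look like the following (class and property names here are illustrative, not taken from the repo): camera frames arrive as CVPixelBuffers, are wrapped in a CIImage, filtered with CIPhotoEffectNoir, and rendered straight into the MTKView's drawable texture through a Metal-backed CIContext.

```swift
import AVFoundation
import CoreImage
import CoreImage.CIFilterBuiltins
import Metal
import MetalKit

// Illustrative sketch of the GPU pipeline described above.
final class FrameRenderer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let device = MTLCreateSystemDefaultDevice()!
    private lazy var commandQueue = device.makeCommandQueue()!
    private lazy var ciContext = CIContext(mtlDevice: device)
    private let noir = CIFilter.photoEffectNoir()

    // Configured elsewhere with the same device; framebufferOnly must be
    // false so Core Image can write into the drawable's texture.
    weak var mtkView: MTKView?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let view = mtkView,
              let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }

        // Wrap the camera frame and run it through the Core Image pipeline
        // (orientation handling omitted for brevity).
        noir.inputImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let filtered = noir.outputImage else { return }

        // Render directly into the view's drawable texture; no CPU copy.
        ciContext.render(filtered,
                         to: drawable.texture,
                         commandBuffer: commandBuffer,
                         bounds: filtered.extent,
                         colorSpace: CGColorSpaceCreateDeviceRGB())
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
```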

example gif

Installation

  1. Clone or download the main branch.
  2. Set your own Bundle ID in the project settings.
  3. Connect your iPhone and run!

Plans

  1. Render frames from an iPhone Camera in Metal View. [DONE]
  2. Add UI to take photos and choose filters. [IN PROGRESS]
  3. Write Core Image kernels for custom effects (see the sketch after this list).
  4. Add support for recording videos.
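
Item 3 is still on the roadmap. As a rough, hypothetical sketch of the Swift side of a custom kernel, assuming a color kernel named noirBoost written in a .ci.metal file and compiled (with the Core Image kernel build flags) into the app's default Metal library — neither the kernel nor this filter class exists in the repo yet:

```swift
import CoreImage

// Hypothetical CIFilter wrapper around a custom Metal color kernel.
final class NoirBoostFilter: CIFilter {
    var inputImage: CIImage?

    // Loads the "noirBoost" kernel from the app's default Metal library.
    private static let kernel: CIColorKernel? = {
        guard let url = Bundle.main.url(forResource: "default", withExtension: "metallib"),
              let data = try? Data(contentsOf: url) else { return nil }
        return try? CIColorKernel(functionName: "noirBoost", fromMetalLibraryData: data)
    }()

    override var outputImage: CIImage? {
        guard let input = inputImage, let kernel = Self.kernel else { return nil }
        return kernel.apply(extent: input.extent, arguments: [input])
    }
}
```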

Architecture

I am a big fan of The Composable Architecture (TCA), and I wanted to write a simple camera app using functional-programming principles for something that feels inherently "imperative" (the camera device). However, I did not import TCA; instead, I use my own simple structs (State, Action, Reducer/UseCase) because I did not want to depend on a framework.
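
For illustration only, hand-rolled pieces like these could look roughly as follows (the actual type names and cases in the repo may differ):

```swift
import Foundation

// Minimal sketch of TCA-style building blocks without the TCA dependency.
struct CameraState: Equatable {
    var isCapturing = false
    var selectedFilter: String? = nil
}

enum CameraAction: Equatable {
    case startCapture
    case stopCapture
    case selectFilter(String)
}

// The reducer is a pure function over state and action; side effects
// (talking to AVCaptureSession) stay at the edges, in line with
// "Functional Core, Imperative Shell".
struct CameraReducer {
    func reduce(state: inout CameraState, action: CameraAction) {
        switch action {
        case .startCapture:
            state.isCapturing = true
        case .stopCapture:
            state.isCapturing = false
        case .selectFilter(let name):
            state.selectedFilter = name
        }
    }
}
```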

Inspiration

  1. objc.io article on real-time image filtering with Core Image
  2. objc.io article on camera capture
  3. This guy's awesome tutorial
  4. TCA
  5. A classic: Functional Core, Imperative Shell