Hudl's `iOS-FFmpeg-processor` project aims to provide a simple way to obtain an `AVCaptureVideoPreviewLayer` and then, from the device's camera and microphone, create 8-second `.ts` segments.
It uses Kickflip's strategy of writing video-only rolling 50 MB `.mp4`s with `AVFoundation` and passing that bytestream into FFmpeg for processing, while simultaneously taking the bytestream from the microphone, adding ADTS headers, and passing it into FFmpeg as well, where the two streams are muxed together.
1. Drag and drop `HudlFFmpegProcessor.xcodeproj` into your current project's workspace.
2. Add `libHudlFFmpegProcessor.a`, `libc++.dylib`, `libiconv.dylib`, and `libz.dylib` to your project's Build Phases under Link Binary With Libraries.
3. In Build Settings, add `$(PROJECT_DIR)/Submodules/HudlFFmpeg/HudlFFmpegProcessor` (recursive) to User Header Search Paths, and ensure Always Search User Paths is set to Yes.
## Sample App
The sample app should work out of the box, although you may need to run `rm -rf ~/Library/Developer/Xcode/DerivedData/` first.