PrototypeKit

A Swift package that makes prototyping machine learning experiences for Apple platforms more accessible to early developers.

(Ironically, a prototype itself...) 😅

Status: Work In Progress

Goals 🥅

  • Make it easier to prototype basic Machine Learning apps with SwiftUI
  • Provide an easy interface for commonly built views to assist with prototyping and idea validation
  • Effectively a wrapper around the more complex APIs, providing a simpler interface (perhaps not all the same functionality, but enough to get you started and inspired!)

Examples

Here are a few basic examples you can use today.

Camera Tasks

Start Here

  1. Ensure you have created your Xcode project.
  2. Ensure you have added the PrototypeKit package to your project (instructions above -- coming soon).
  3. Select your project file within the project navigator.
  4. Ensure that your target is selected.
  5. Select the Info tab.
  6. Right-click within the "Custom iOS Target Properties" table, and select "Add Row".
  7. Use Privacy - Camera Usage Description for the key. Type the reason your app will use the camera as the value.
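
For reference, the "Privacy - Camera Usage Description" row corresponds to the raw NSCameraUsageDescription key. If you open Info.plist as source code (right-click the file, then Open As > Source Code), it looks like this (the description string is just an example; write your own reason):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to run live machine learning prototypes.</string>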

Live Camera View

Utilise PKCameraView

PKCameraView()
Full Example
import SwiftUI
import PrototypeKit

struct ContentView: View {
    var body: some View {
        VStack {
            PKCameraView()
        }
        .padding()
    }
}

Live Image Classification

  1. Required Step: Drag your Create ML / Core ML model into Xcode.
  2. Change FruitClassifier below to the name of your model.
  3. You can use latestPrediction as you would any other state variable (e.g. pass it to other views, such as a Text or Slider).

Utilise ImageClassifierView

ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                    latestPrediction: $latestPrediction)
Full Example
import SwiftUI
import PrototypeKit

struct ImageClassifierViewSample: View {
    
    @State var latestPrediction: String = ""
    
    var body: some View {
        VStack {
            ImageClassifierView(modelURL: FruitClassifier.urlOfModelInThisBundle,
                                latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
    }
}
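
Because latestPrediction is a plain String, ordinary Swift control flow is enough to react to it. As a sketch, here is a hypothetical helper (not part of PrototypeKit) that maps a label from the FruitClassifier example to something more visual; "Apple" and "Banana" are assumed labels, yours come from your model's training data:

```swift
// Hypothetical helper: map a classifier label to an emoji for display.
// The labels below are examples; replace them with your model's labels.
func emoji(for prediction: String) -> String {
    switch prediction {
    case "Apple":  return "🍎"
    case "Banana": return "🍌"
    default:       return "❓"
    }
}
```

You could then display `Text(emoji(for: latestPrediction))` instead of the raw label.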

Live Text Recognition

Utilise LiveTextRecognizerView

LiveTextRecognizerView(detectedText: $detectedText)
Full Example
import SwiftUI
import PrototypeKit

struct TextRecognizerView: View {
    
    @State var detectedText: [String] = []
    
    var body: some View {
        VStack {
            LiveTextRecognizerView(detectedText: $detectedText)
            
            ScrollView {
                ForEach(Array(detectedText.enumerated()), id: \.offset) { line, text in
                    Text(text)
                }
            }
        }
    }
}
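
The detectedText binding delivers one string per recognized line, so plain Swift (independent of PrototypeKit) is enough to post-process it before display. A minimal sketch, assuming you want to join the lines into a transcript and drop very short fragments, which are often noise:

```swift
import Foundation

/// Joins the recognized lines into a single transcript.
func joinedTranscript(_ lines: [String]) -> String {
    lines.joined(separator: "\n")
}

/// Keeps only lines long enough to be meaningful after trimming whitespace.
func meaningfulLines(_ lines: [String], minLength: Int = 3) -> [String] {
    lines.filter { $0.trimmingCharacters(in: .whitespaces).count >= minLength }
}
```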

Live Hand Pose Classification

  1. Required Step: Drag your Create ML / Core ML model into Xcode.
  2. Change HandPoseClassifier below to the name of your model.
  3. You can use latestPrediction as you would any other state variable (e.g. pass it to other views, such as a Text or Slider).

Utilise HandPoseClassifierView

HandPoseClassifierView(modelURL: HandPoseClassifier.urlOfModelInThisBundle,
                       latestPrediction: $latestPrediction)
Full Example
import SwiftUI
import PrototypeKit

struct HandPoseClassifierViewSample: View {
    
    @State var latestPrediction: String = ""
    
    var body: some View {
        VStack {
            HandPoseClassifierView(modelURL: HandPoseClassifier.urlOfModelInThisBundle,
                                   latestPrediction: $latestPrediction)
            Text(latestPrediction)
        }
    }
}

Live Sound Classification (System Sound Classifier)

This uses the built-in system sound classifier, and does not currently support custom sound classifier models.

  1. You can use recognizedSound as you would any other state variable (e.g. pass it to other views, such as a Text or Slider).

Utilise recognizeSounds modifier

.recognizeSounds(recognizedSound: $recognizedSound)
Full Example
import SwiftUI
import PrototypeKit

struct SoundAnalyzerSampleView: View {
    @State var recognizedSound: String?
    
    var body: some View {
        VStack {
            Text(recognizedSound ?? "No Sound")
        }
        .padding()
        .navigationTitle("Sound Recogniser Sample")
        .recognizeSounds(recognizedSound: $recognizedSound)
    }
}

FAQs

Is this production ready?

No.