# Mobile Apps for Sensing & Learning

By Jake Carlson and Eric Smith

Labs for CSE7323: Mobile Apps for Sensing & Learning

## Final Project - CoreML, ARKit, and SwiftOCR

Our final project turns whiteboard code into the real thing! Using SwiftOCR, we parse and interpret Python code as it's written on a whiteboard. That code is then sent to a server, where it is evaluated inside a Docker container, and the result is shown to the user in real time.
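As a rough sketch of the last hop in that pipeline, here's how the recognized source text might be posted to an evaluation server with `URLSession`. The endpoint URL, JSON shape, and function name below are placeholders for illustration, not the project's actual API.

```swift
import Foundation

/// Minimal sketch: ship recognized code to a server that runs it in a container.
/// The URL and request body here are assumptions, not the real service.
func evaluate(code: String, completion: @escaping (String?) -> Void) {
    guard let url = URL(string: "https://example.com/evaluate") else { return }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["source": code])

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // Assume the server responds with the program's output as plain text.
        let output = data.flatMap { String(data: $0, encoding: .utf8) }
        DispatchQueue.main.async { completion(output) }
    }.resume()
}
```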

*Project screenshot*

## Lab 4 - OpenCV and Camera

This lab is a little weird - we made use of the iPhone's cameras to do some fun things. The first half of the app detects a face, determines which of the person's eyes are open, and can even tell if they're smiling! The other half uses the rear camera to measure heart rate: just put your finger over the camera, and your heart rate is graphed live on the screen.
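One common way to get blink and smile flags like these is Core Image's `CIDetector`; the sketch below is a minimal illustration of that approach under those assumptions, not code lifted from the lab itself.

```swift
import CoreImage
import UIKit

/// Minimal sketch: report eye and smile state for the first face found in an image.
func describeFace(in image: UIImage) -> String? {
    guard let ciImage = CIImage(image: image),
          let detector = CIDetector(ofType: CIDetectorTypeFace,
                                    context: nil,
                                    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    else { return nil }

    // Ask the detector to also evaluate blink and smile features.
    let features = detector.features(in: ciImage,
                                     options: [CIDetectorSmile: true, CIDetectorEyeBlink: true])
    guard let face = features.first as? CIFaceFeature else { return nil }

    let eyes: String
    if face.leftEyeClosed && face.rightEyeClosed {
        eyes = "both eyes closed"
    } else if face.leftEyeClosed {
        eyes = "left eye closed"
    } else if face.rightEyeClosed {
        eyes = "right eye closed"
    } else {
        eyes = "both eyes open"
    }
    return "\(eyes), \(face.hasSmile ? "smiling" : "not smiling")"
}
```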

*Lab 4 screenshot*

## Lab 3 - Core Motion and SpriteKit

This app tracks your steps in the background and rewards you with a game! If you pass your step goal for the day, you get to play Asteroid Dodge: a game where you guide a spaceship through an asteroid field.
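Here's a minimal sketch of the kind of step tracking this relies on, using Core Motion's `CMPedometer`. The goal value, class name, and unlock callback are placeholders, not the lab's actual implementation.

```swift
import Foundation
import CoreMotion

/// Minimal sketch: count today's steps and fire a callback once a goal is reached.
final class StepTracker {
    private let pedometer = CMPedometer()
    var dailyGoal = 10_000  // placeholder goal

    func startTracking(onGoalReached: @escaping () -> Void) {
        guard CMPedometer.isStepCountingAvailable() else { return }
        let startOfDay = Calendar.current.startOfDay(for: Date())

        // Deliver live step counts since midnight; unlock the game at the goal.
        pedometer.startUpdates(from: startOfDay) { [weak self] data, error in
            guard let self = self,
                  let steps = data?.numberOfSteps.intValue,
                  error == nil else { return }
            if steps >= self.dailyGoal {
                DispatchQueue.main.async { onGoalReached() }
            }
        }
    }
}
```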

*Lab 3 screenshot*

## Lab 2 - Audio Filtering, the FFT, and Doppler Shifts

In this lab, we use a fast Fourier transform to find the highest-magnitude frequencies picked up by the iPhone's microphone. We also use Doppler shifts to determine whether someone is moving their hand toward or away from the phone.
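For reference, here's a rough sketch of finding the dominant frequency in an audio buffer with Accelerate's classic vDSP FFT routines. The function name and parameters are illustrative (it assumes a power-of-two buffer length), not the lab's actual code.

```swift
import Foundation
import Accelerate

/// Minimal sketch: return the frequency (Hz) of the strongest bin in `samples`.
/// Assumes `samples.count` is a power of two.
func dominantFrequency(of samples: [Float], sampleRate: Float) -> Float {
    let n = samples.count
    let log2n = vDSP_Length(log2(Float(n)))
    guard let fftSetup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return 0 }
    defer { vDSP_destroy_fftsetup(fftSetup) }

    var real = [Float](repeating: 0, count: n / 2)
    var imag = [Float](repeating: 0, count: n / 2)
    var magnitudes = [Float](repeating: 0, count: n / 2)

    real.withUnsafeMutableBufferPointer { realPtr in
        imag.withUnsafeMutableBufferPointer { imagPtr in
            var split = DSPSplitComplex(realp: realPtr.baseAddress!,
                                        imagp: imagPtr.baseAddress!)

            // Pack the real samples into split-complex form and run an in-place FFT.
            samples.withUnsafeBufferPointer { samplesPtr in
                samplesPtr.baseAddress!.withMemoryRebound(to: DSPComplex.self, capacity: n / 2) {
                    vDSP_ctoz($0, 2, &split, 1, vDSP_Length(n / 2))
                }
            }
            vDSP_fft_zrip(fftSetup, &split, 1, log2n, FFTDirection(FFT_FORWARD))

            // Squared magnitude of each frequency bin.
            vDSP_zvmags(&split, 1, &magnitudes, 1, vDSP_Length(n / 2))
        }
    }

    // The strongest bin maps to frequency = bin * sampleRate / n.
    var maxValue: Float = 0
    var maxIndex: vDSP_Length = 0
    vDSP_maxvi(magnitudes, 1, &maxValue, &maxIndex, vDSP_Length(n / 2))
    return Float(maxIndex) * sampleRate / Float(n)
}
```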

*Lab 2 screenshot*

## Lab 1 - BuildMe

This app helps users track their workout progress.

*Lab 1 screenshot*