Workshop "p5js Body as Interface"
Processing Community Day Porto 2024
The workshop explored experimental human-computer interactions using the body as an interface. Participants were introduced to the machine learning library ml5.js within p5js. The workshop covered basic examples of using the hands, body, and face as an interface, and then explored how the augmented body can be used as a tool for audiovisual creation. (February 2024)
The folder "Body_As_Interface_p5jsCode" contains all the code examples explored in the workshop (Handpose, Blazepose, and Facemesh models). They can also be found here: Collections of sketches on the p5js editor used in the Workshop
The examples whose names start with "00" are the original ml5 examples used as starting points. The complete ml5 examples collection can be found here
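To give a taste of what the examples in the folder look like, below is a minimal hand-as-interface sketch. It assumes the ml5.js 1.x API (ml5.handPose() with detectStart()); the exact calls in the workshop files may differ depending on the ml5 version they were written against.

```js
// Minimal p5js + ml5.js sketch: use the index fingertip as a "brush".
// Assumes ml5.js 1.x; p5.js and ml5.js are loaded via script tags in index.html.

let video;
let handPose;
let hands = [];

function preload() {
  // Load the hand-tracking model.
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  // Start continuous detection on the webcam feed.
  handPose.detectStart(video, gotHands);
  background(0);
}

function gotHands(results) {
  // Callback: keep the latest detections.
  hands = results;
}

function draw() {
  // Paint with the index fingertip of each detected hand.
  for (let hand of hands) {
    let tip = hand.keypoints[8]; // keypoint 8 = index finger tip
    noStroke();
    fill(255, 150);
    circle(tip.x, tip.y, 20);
  }
}
```

In the same ml5 version, the Blazepose and Facemesh examples follow the same pattern: swap in ml5.bodyPose() or ml5.faceMesh() and read the corresponding keypoints, which can then drive visuals or sound parameters.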
The PDF "WS_PCD_BodyAsInterface_examplesCode.pdf" walks step by step through the code examples. The PDF "WS_PCD_BodyAsInterface_ShortHistoricalBackground.pdf" provides a short historical background on using the body as an interface.
For more, check these references:
.Sound-Vision-Movement blog;
."S+V+M: Relationships between the Sound, Visual and Movement domains in Interactive Systems" PhD thesis;
.Golan Levin Expanded Body lecture;
.MIT Future Sketches Group - Zach Lieberman.
BONUS: For p5js beginners, check [Intro to p5js examples](https://editor.p5js.org/visiophone/collections/5qzfN2EY0)