While I was a master’s student at NYU’s ITP, I contributed a tiny bit to Dr. Ken Perlin’s Chalktalk.
Chalktalk is a tool that Dr. Perlin uses in his computer graphics course instead of a chalkboard. It is a web-based environment where gestures create interactive code “sketches” that can interface with each other to illustrate complex concepts. The project is currently private, but will hopefully be released to the public at some point.
I attended weekly meetings where we held high-level discussions on potential applications for Chalktalk, and lower-level discussions on how best to structure the codebase.
My primary contribution was a feature allowing for the upload and playback of audio files. Though Chalktalk is a web-based tool, I could not simply rely on the Web Audio API's built-in processing nodes, because they obscure the underlying signal processing. We needed to create our own signal processing chain in which every audio buffer is computed as the output of one or more functions.
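To give a sense of the idea (this is a minimal sketch of the approach, not Chalktalk's actual code, and all names here are hypothetical): each stage in the chain is just a function, a source maps time to a sample, a processor maps a sample to a sample, and a buffer is rendered by evaluating the composed chain once per sample.

```javascript
// Hypothetical sketch of a function-based signal chain.
const SAMPLE_RATE = 44100;

// Source stage: a sine oscillator at a given frequency (time -> sample).
const sine = (freq) => (t) => Math.sin(2 * Math.PI * freq * t);

// Processing stage: scale the incoming sample by a constant gain.
const gain = (amount) => (sample) => sample * amount;

// Compose a source with any number of processors into one function of time.
const chain = (source, ...processors) => (t) =>
  processors.reduce((sample, fn) => fn(sample), source(t));

// Render one audio buffer by evaluating the chain at each sample time.
function renderBuffer(signal, numSamples) {
  const buffer = new Float32Array(numSamples);
  for (let i = 0; i < numSamples; i++) {
    buffer[i] = signal(i / SAMPLE_RATE);
  }
  return buffer;
}

// A 440 Hz tone at half volume, rendered into a 1024-sample buffer.
const signal = chain(sine(440), gain(0.5));
const buffer = renderBuffer(signal, 1024);
```

Because every buffer is the output of plain functions, each stage of the signal path stays visible and inspectable, which is exactly what the opaque built-in nodes would have hidden.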