My first VST: a binaural spatialization engine built on research conducted at the LIMSI and Dyson School of Design Engineering (Imperial College London) labs. Developed with the JUCE framework.
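For the curious, here is a minimal sketch of the core idea behind binaural spatialization, not the plugin's actual code: a mono source is convolved with a pair of head-related impulse responses (HRIRs) picked for the source direction. The `Hrir` type and the toy data are hypothetical; a real engine (e.g. one built on JUCE) would interpolate between measured HRTFs and run the convolution in the frequency domain.

```cpp
#include <cstddef>
#include <vector>

struct Hrir { std::vector<float> left, right; }; // one measured direction

// Naive time-domain convolution of a mono block with one ear's HRIR.
static std::vector<float> convolve(const std::vector<float>& x,
                                   const std::vector<float>& h)
{
    std::vector<float> y(x.size() + h.size() - 1, 0.0f);
    for (std::size_t n = 0; n < x.size(); ++n)
        for (std::size_t k = 0; k < h.size(); ++k)
            y[n + k] += x[n] * h[k];
    return y;
}

// Render a mono input to a binaural stereo pair for a given direction.
static void renderBinaural(const std::vector<float>& mono, const Hrir& hrir,
                           std::vector<float>& outL, std::vector<float>& outR)
{
    outL = convolve(mono, hrir.left);
    outR = convolve(mono, hrir.right);
}

int main()
{
    Hrir h { { 1.0f, 0.3f }, { 0.6f, 0.2f } }; // toy 2-tap HRIR pair
    std::vector<float> src { 1.0f, 0.0f, 0.5f };
    std::vector<float> L, R;
    renderBinaural(src, h, L, R); // L and R now hold the spatialized pair
}
```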
Point-cloud recording with a Kinect v2 (via libfreenect2). Real-time rendering in the Blender Game Engine using a GLSL shader by Dalai Felinto (github). Part of the Echo Project, with development directed by Brian F.G. Katz at LIMSI-CNRS. Original music courtesy of Jeremie Poirier-Quinot (YouTube channel).
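A hedged sketch of the capture side of this pipeline, using the libfreenect2 API: grab one registered RGB-D frame from the Kinect v2 and turn it into XYZRGB points. Error handling and the Blender/GLSL rendering side are omitted, and `PointXYZRGB` is just an illustrative struct.

```cpp
#include <libfreenect2/libfreenect2.hpp>
#include <libfreenect2/frame_listener_impl.h>
#include <libfreenect2/registration.h>
#include <vector>

struct PointXYZRGB { float x, y, z, rgb; };

int main()
{
    libfreenect2::Freenect2 freenect2;
    libfreenect2::Freenect2Device* dev = freenect2.openDefaultDevice();
    if (dev == nullptr)
        return 1; // no Kinect v2 found

    libfreenect2::SyncMultiFrameListener listener(
        libfreenect2::Frame::Color | libfreenect2::Frame::Depth);
    dev->setColorFrameListener(&listener);
    dev->setIrAndDepthFrameListener(&listener);
    dev->start();

    // Maps the 1920x1080 color image onto the 512x424 depth image.
    libfreenect2::Registration registration(dev->getIrCameraParams(),
                                            dev->getColorCameraParams());
    libfreenect2::Frame undistorted(512, 424, 4), registered(512, 424, 4);

    libfreenect2::FrameMap frames;
    listener.waitForNewFrame(frames);
    libfreenect2::Frame* rgb   = frames[libfreenect2::Frame::Color];
    libfreenect2::Frame* depth = frames[libfreenect2::Frame::Depth];
    registration.apply(rgb, depth, &undistorted, &registered);

    // One colored 3D point per valid depth pixel.
    std::vector<PointXYZRGB> cloud;
    cloud.reserve(512 * 424);
    for (int r = 0; r < 424; ++r)
        for (int c = 0; c < 512; ++c)
        {
            PointXYZRGB p;
            registration.getPointXYZRGB(&undistorted, &registered,
                                        r, c, p.x, p.y, p.z, p.rgb);
            if (p.z == p.z) // skip NaN (invalid) depth readings
                cloud.push_back(p);
        }

    listener.release(frames);
    dev->stop();
    dev->close();
    return 0;
}
```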
Still based on point-cloud recording, but now paired with a calibrated room acoustic model. Think of it as a VR theatre, where one can attend a given show in different theatre rooms, with both the visual and acoustic rendering updated in real time. (project website).
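A minimal sketch, assuming JUCE's `dsp::Convolution`, of how switching theatre rooms could swap the room impulse response at runtime; the file names are hypothetical, and the actual project's IRs come from the calibrated acoustic model rather than loose WAV files.

```cpp
#include <juce_dsp/juce_dsp.h>

class RoomReverb
{
public:
    void prepare(double sampleRate, int blockSize, int numChannels)
    {
        convolution.prepare({ sampleRate, (juce::uint32) blockSize,
                              (juce::uint32) numChannels });
    }

    // Called when the user walks into a different virtual theatre room,
    // e.g. setRoom(juce::File("theatre_A_ir.wav")).
    void setRoom(const juce::File& impulseResponse)
    {
        convolution.loadImpulseResponse(impulseResponse,
                                        juce::dsp::Convolution::Stereo::yes,
                                        juce::dsp::Convolution::Trim::yes,
                                        0); // 0 = keep the full IR length
    }

    // Convolve the incoming audio block with the current room's IR.
    void process(juce::dsp::AudioBlock<float>& block)
    {
        juce::dsp::ProcessContextReplacing<float> ctx(block);
        convolution.process(ctx);
    }

private:
    juce::dsp::Convolution convolution;
};
```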
This mini-game was born from an evening of idleness at home with my brother: I was finalizing a Blender project, he was working on a music piece. After a coffee break we decided to dedicate the remaining hours of the day to what became “The Blob” (OK, some would say it does not deserve a name at that stage). More music @ Satin Coco’s SoundCloud.