Music in Motion was a student-pitched project exploring the relationship between physical motion and sound creation in virtual reality. Over fifteen weeks, we built around twenty prototypes across interaction, graphics, and sound, aiming for an experience in which the average person with no musical background feels they have intuitive musical control of our environment.
Created a render-texture-based water simulation that handles many objects of varying size and velocity entering and exiting the surface simultaneously.
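Render-texture water simulations of this kind typically ping-pong two height-field buffers and propagate ripples with a discrete wave-equation update; each splash is written into the current buffer with a strength scaled by the object's size and speed. The sketch below is a generic CPU illustration of that update rule, not the project's actual shader code:

```python
def step(curr, prev, damping=0.99):
    """One wave-equation step on a height field (the classic ping-pong ripple
    update). Interior cells average their four neighbors; borders stay zero."""
    h, w = len(curr), len(curr[0])
    nxt = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbors = (curr[y - 1][x] + curr[y + 1][x]
                         + curr[y][x - 1] + curr[y][x + 1])
            # Wave equation: next = neighbor average * 2 - previous, damped.
            nxt[y][x] = (neighbors / 2.0 - prev[y][x]) * damping
    return nxt, curr  # the new (curr, prev) pair

# An object hitting the surface writes a splash into the current buffer:
N = 16
prev = [[0.0] * N for _ in range(N)]
curr = [[0.0] * N for _ in range(N)]
curr[8][8] = 1.0  # impact at the center; strength would scale with velocity
for _ in range(4):
    curr, prev = step(curr, prev)  # ripple spreads outward each step
```

On the GPU this same update runs in a fragment shader over two render textures, which is what makes many simultaneous entry/exit points essentially free.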
Implemented and upgraded several tools aimed at increasing prototyping speed, including the sound-interaction "linkage" system originally developed by Yujin Ariza, which allowed quick pairing and tweaking of user interactions with our procedural sound generation in SuperCollider.
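SuperCollider's server listens for OSC messages, so a linkage system like this generally boils down to mapping an interaction parameter onto an OSC address and value. As a self-contained illustration (the `/linkage/set` address and parameter names are hypothetical, not the actual protocol of the system described above), here is a minimal OSC 1.0 encoder using only the standard library:

```python
import struct

def _osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message with int, float, or string arguments."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, bool):
            raise TypeError("OSC 1.0 has no bool tag in this sketch")
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, str):
            tags += "s"
            payload += _osc_pad(a.encode("ascii"))
        else:
            raise TypeError(f"unsupported OSC argument type: {type(a)}")
    return (_osc_pad(address.encode("ascii"))
            + _osc_pad(tags.encode("ascii"))
            + payload)

# A "linkage" might forward an interaction value to a named synth control:
msg = osc_message("/linkage/set", "cutoff", 0.75)
# sock.sendto(msg, ("127.0.0.1", 57120))  # 57120 is sclang's default port
```

Keeping the interaction-to-sound mapping in data like this (address plus parameter name) is what makes rapid pairing and tweaking possible without touching the SuperCollider patch itself.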
NOTE: The game was originally designed for our custom 12-speaker setup using SuperCollider, which makes it somewhat fragile to port. If you have trouble running it but would still like to give it a try, shoot me an email!