{{sect1}}
I collaborated with Meijie Hu on the design and development of an intuitive, spatial interface for music composition (a kind of embodied digital audio workstation) in VR with hand-tracking input.
{{sect2}}
Traditionally, electronic music producers add notes to their tracks and adjust effects through 2D interfaces within digital audio workstations (DAWs) or through a MIDI controller (usually consisting of piano keys and some knobs). We wanted to explore new kinds of DAW input interfaces for synthesizing electronic sounds in more organic and discoverable ways, reconnecting to, and imaginatively extending beyond, the styles of physical interaction found in analog instruments.
{{sect3}}
With the Meta Quest 2's high-quality hand- and finger-tracking input features, we were able to design a way of navigating menus and interacting with 3D objects that feels almost as natural as interfacing with a guitar or drum.
We worked with open-source Unity patches that send MIDI note values and effect parameter values (such as percentage of reverb) to the DAW Ableton Live, which in turn synthesizes sounds and applies effects, bringing the VR instrument interactions to life.
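Under the hood, those messages are just standard three-byte MIDI. As a rough sketch of the idea (not the project's actual bridge code, whose transport came from the open-source patches), here is how a Unity script might build note-on and control-change messages; `IMidiOut` stands in as a hypothetical transport to a virtual MIDI port that Ableton Live listens on:

```csharp
// Minimal sketch of the Unity -> DAW MIDI bridge idea.
// The IMidiOut transport is hypothetical; the MIDI byte layout
// (status, data1, data2) is the standard part.
using UnityEngine;

public interface IMidiOut
{
    // Hypothetical: sends one 3-byte MIDI message to a virtual
    // port that the DAW listens on.
    void Send(byte status, byte data1, byte data2);
}

public class MidiBridge : MonoBehaviour
{
    public IMidiOut midiOut;    // injected transport (assumed)
    const byte Channel = 0;     // MIDI channel 1

    // Note-on: status 0x90 | channel, then note number and velocity.
    public void NoteOn(int note, int velocity)
    {
        midiOut.Send((byte)(0x90 | Channel), (byte)note, (byte)velocity);
    }

    public void NoteOff(int note)
    {
        midiOut.Send((byte)(0x80 | Channel), (byte)note, 0);
    }

    // Control change: status 0xB0 | channel. In the DAW, the CC
    // number is mapped to an effect parameter such as reverb amount.
    public void SendEffectAmount(int ccNumber, float amount01)
    {
        byte value = (byte)Mathf.RoundToInt(Mathf.Clamp01(amount01) * 127f);
        midiOut.Send((byte)(0xB0 | Channel), (byte)ccNumber, value);
    }
}
```

Routing through MIDI keeps the VR side synthesis-agnostic: Ableton handles all sound generation, so any instrument shape only needs to emit note and CC values.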
{{sect4}}
Participants can fluidly open and select from an inventory of 3D instruments that expands from their wrist. Once placed, an instrument's surface volumes invite the participant to explore tapping, pulling, and other gestural interactions that generate notes and modulate audio effects.
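To give a flavor of how such a surface might work (a hypothetical sketch, not our production code; `TappableSurface`, `springHandle`, and the velocity mapping are all illustrative), a trigger collider can turn fingertip contact into note-on messages while a spring-joint handle's displacement drives an effect amount, much like the cube synth demo below:

```csharp
// Hypothetical sketch of a tappable instrument surface in Unity.
// A trigger collider on the instrument detects the tracked fingertip;
// tap speed maps to MIDI velocity, and a spring-joint handle's
// pull distance maps to a reverb amount.
using UnityEngine;

[RequireComponent(typeof(Collider))]
public class TappableSurface : MonoBehaviour
{
    public MidiBridge midi;         // bridge from the sketch above
    public int note = 60;           // middle C
    public Transform springHandle;  // small cube hanging on a spring joint
    public float maxPull = 0.2f;    // metres of pull = maximum reverb
    public int reverbCc = 91;       // CC 91 is conventionally reverb send

    Vector3 handleRest;

    void Start()
    {
        if (springHandle != null) handleRest = springHandle.localPosition;
    }

    // Fired when the tracked hand's fingertip collider enters the surface.
    void OnTriggerEnter(Collider other)
    {
        var hand = other.attachedRigidbody;
        if (hand == null) return;
        // Faster fingers play louder notes, clamped to the MIDI range.
        int velocity = Mathf.Clamp(
            Mathf.RoundToInt(hand.velocity.magnitude * 127f), 1, 127);
        midi.NoteOn(note, velocity);
    }

    void OnTriggerExit(Collider other)
    {
        midi.NoteOff(note);
    }

    void Update()
    {
        if (springHandle == null) return;
        // How far the spring-joint cube is pulled from rest -> reverb amount.
        float pull = Vector3.Distance(springHandle.localPosition, handleRest);
        midi.SendEffectAmount(reverbCc, pull / maxPull);
    }
}
```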
{{sect5}}
A proof-of-concept demo of opening the wrist instrument menu and selecting instruments with hand- and finger-tracking:
A proof-of-concept demo of playing a basic cube synth instrument by tapping its side like a drum, while modulating the reverb with a smaller cube "spring joint" hanging from the cube's right side:
Experimenting with more organic instrument surfaces:
{{sect6}}
{{sect7}}
{{sect8}}