Musishape

Medium: 

VR Experience

Role: 

VR UX Co-ideator/Designer/Engineer & Sound Designer

Tools: 

Unity, Ableton Live, Meta Quest 2

For: 

Master's Hedonomic VR Design Course

Year: 

2021

Collaborator(s): 

Meijie Hu

Process

{{sect1}}

Overview

I collaborated with Meijie Hu on the design and development of an intuitive, spatial interface for music composition (a kind of embodied digital audio workstation) in VR with hand-tracking input.

{{sect2}}

Inspiration

Traditionally, electronic music producers add notes to and adjust effects in their tracks through 2D interfaces within digital audio workstations (DAWs) or through a MIDI controller (usually consisting of piano keys and some knobs). We wanted to explore new kinds of DAW input interfaces for synthesizing electronic sounds in more organic and discoverable ways, reconnecting to and imaginatively extending beyond styles of physical interaction with analog instruments.

2D Interface of the Reverb Audio Effect's Parameters in the Ableton Live DAW

{{sect3}}

Approach

With the Meta Quest 2's high-quality hand- and finger-tracking input, we were able to design a way of navigating menus and interacting with 3D objects that feels almost as natural as playing a guitar or drum.

We built on open-source Unity scripts and companion Ableton patches: the scripts send MIDI note values and effect parameter values (such as the percentage of reverb) to the DAW Ableton Live, which in turn synthesizes sounds and applies effects, bringing the VR instrument interactions to life.

Ableton patch that receives MIDI notes sent from a Unity script

Ableton patch that receives effect parameter "Values" sent from a Unity script
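To give a feel for the Unity side of this pipeline, here is a minimal C# sketch. It is not the open-source code we used: the class name (DawBridge), host, port, and plain-text message format are placeholders for whatever the receiving Ableton/Max patch actually expects.

// Minimal sketch (not the project's actual scripts): sending a note number and
// an effect parameter value from a Unity MonoBehaviour to a listener on the
// DAW machine over UDP. Host, port, and message format are assumptions.
using System.Net.Sockets;
using System.Text;
using UnityEngine;

public class DawBridge : MonoBehaviour
{
    UdpClient udp;
    const string Host = "127.0.0.1"; // machine running Ableton Live
    const int Port = 9000;           // placeholder port for the receiver patch

    void Awake()
    {
        udp = new UdpClient();
    }

    // Called by instrument objects when a surface is tapped.
    public void SendNote(int midiNote, int velocity)
    {
        Send($"note {midiNote} {velocity}");
    }

    // Called continuously while a parameter (e.g. reverb amount) is being pulled.
    public void SendParameter(string name, float percent)
    {
        Send($"param {name} {Mathf.Clamp(percent, 0f, 100f):F1}");
    }

    void Send(string message)
    {
        byte[] bytes = Encoding.ASCII.GetBytes(message);
        udp.Send(bytes, bytes.Length, Host, Port);
    }

    void OnDestroy()
    {
        udp.Close();
    }
}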

{{sect4}}

Outcome

Participants can fluidly open and select from an inventory of 3D instruments that expands from their wrist. Once an instrument is placed, its surface volumes invite tapping, pulling, and other gestural interactions that generate sounds and sound effects.
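As a rough illustration of the tap interaction, the sketch below (assumed names, not our exact implementation) uses a trigger collider on an instrument surface to detect a tracked fingertip and fire a note through the DawBridge sketched above. The "Fingertip" tag, note number, and debounce interval are placeholders.

// Hedged sketch of a tappable instrument surface. Assumes the surface collider
// is a trigger and the tracked fingertip objects carry a "Fingertip" tag and a
// Rigidbody so that OnTriggerEnter fires.
using UnityEngine;

[RequireComponent(typeof(Collider))]
public class TappableSurface : MonoBehaviour
{
    public DawBridge bridge;         // sender sketched in the Approach section
    public int midiNote = 60;        // middle C as a placeholder
    public float minInterval = 0.1f; // debounce so one tap fires one note

    float lastHit;

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Fingertip")) return;
        if (Time.time - lastHit < minInterval) return;
        lastHit = Time.time;
        bridge.SendNote(midiNote, 100);
    }
}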

{{sect5}}

Demo

A proof-of-concept demo of opening the wrist instrument menu and selecting instruments with hand- and finger-tracking:

Wrist Instrument UI Demo

A proof-of-concept demo of playing a basic cube synth instrument by tapping its side like a drum, while modulating the reverb with a smaller cube "spring joint" hanging from the cube's right side:
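A hedged sketch of how such a spring-joint modulator could work: the small cube's displacement from its rest position is mapped to a 0-100% reverb value and streamed to the DAW. The parameter name, rest anchor, and maxStretch constant are assumptions for illustration.

// Illustrative mapping from spring-joint cube displacement to a reverb amount.
// In practice the value would likely be throttled or smoothed before sending.
using UnityEngine;

public class SpringReverbControl : MonoBehaviour
{
    public DawBridge bridge;
    public Transform restAnchor;    // where the small cube sits when untouched
    public float maxStretch = 0.3f; // metres of pull that map to 100% reverb

    void Update()
    {
        float stretch = Vector3.Distance(transform.position, restAnchor.position);
        float percent = Mathf.Clamp01(stretch / maxStretch) * 100f;
        bridge.SendParameter("reverb", percent);
    }
}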

Experimenting with more organic instrument surfaces:

Finger-tracking for Innovative Instrument Interactions

{{sect6}}

User Flow

user flow diagram

{{sect7}}

Tech Stack

tech stack diagram

{{sect8}}

Slides from Class Presentation
