AirPiano: Enhancing Music Playing Experience in Virtual Reality with Mid-air Haptic Feedback
We present AirPiano, an enhanced music playing system that provides touchable experiences in HMD-based virtual reality with mid-air haptic feedback. AirPiano allows users to enjoy enriched virtual piano-playing experiences with touchable keys in the air. We implement two haptic rendering schemes that mimic the resisting force of piano keys using ultrasonic vibrations: Constant Feedback is designed to provide short and intense feedback, whereas Adaptive Feedback is designed to follow the changes in feedback of a real keypress. A user study was conducted using three simple musical pieces. Results show that adding mid-air haptic rendering significantly improves the user experience on the virtual piano, and that Adaptive Feedback further increases clarity, realism, enjoyment, and user satisfaction compared to the condition with no haptic feedback. Our design of AirPiano can shed light on the introduction of touchable musical playing systems in virtual reality with mid-air haptic feedback.
We used an HMD (Oculus Rift CV1; 1080 x 1200 per eye at 90 Hz) and a mid-air haptic display. The visual appearance of AirPiano and its surrounding environment were built with the Unity3D game engine. AirPiano was modeled in 3D at real scale as a simplified piano with ten white keys and seven black keys, each with a stroke depth of 13.3 mm. The number of keys was chosen to fit the optimal workspace of the mid-air haptic display (about 0.4 m x 0.4 m x 0.4 m). In the virtual scene, AirPiano was placed on a desk inside a furnished virtual room so that the user could naturally play the piano through the binocular displays in the HMD. The external infrared tracking camera of the HMD allows users to view the entire scene from any angle. Rigged 3D human hand meshes were also placed in the scene to mirror the user's movements while visible through the HMD. The position and pose of the user's hands were tracked by a Leap Motion miniature infrared tracker placed on the near-side edge of the haptic display. The tracker's field of view extends approximately one meter in any direction within its 150-degree viewing angle. Its static position accuracy is known to be about 0.5 mm, while its dynamic tracking accuracy is worse and degrades with the shape and pose of the user's hands and the amount of occlusion they create.
The mid-air haptic display consists of a 16 x 16 planar array of transducers that emit 40 kHz ultrasound upward with a beam width of 80 degrees. The haptic display was placed on the physical table at a height matched to the position of AirPiano, so as to deliver tactile feedback for keypress actions in virtual reality. The phase of the ultrasound signal was set for each transducer to create one or more focal points. The 40 kHz ultrasound waves translate into only a faint constant pressure (5 mN at maximum), while amplitude modulation of the ultrasonic signal can deliver stronger stimuli. Our initial observations showed that perceived intensity is maximized when the modulation frequency is around 200 Hz. However, since a higher modulation frequency generates audible noise that can degrade the user experience, we set the modulation frequency to 140 Hz as a fair tradeoff between intensity and noise level.
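The amplitude-modulation scheme described above can be sketched as follows. This is a minimal illustration, not the device's actual driver code: the sample rate, function name, and raised-cosine envelope are assumptions, while the 40 kHz carrier and 140 Hz modulation frequency come from the text.

```python
import numpy as np

FS = 192_000          # simulation sample rate in Hz (assumed)
CARRIER_HZ = 40_000   # ultrasonic carrier frequency
MOD_HZ = 140          # modulation frequency chosen as intensity/noise tradeoff

def am_ultrasound(duration_s, intensity=1.0):
    """Amplitude-modulate the 40 kHz carrier at 140 Hz.

    `intensity` in [0, 1] scales the modulation depth; the raised-cosine
    envelope keeps the modulation non-negative.
    """
    t = np.arange(int(duration_s * FS)) / FS
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    envelope = 0.5 * intensity * (1 - np.cos(2 * np.pi * MOD_HZ * t))
    return envelope * carrier
```

Unmodulated (intensity 0), the carrier contributes no vibrotactile stimulus in this model, matching the observation that the constant 40 kHz pressure alone is faint.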
Designing Touchable Keys
Since performing on a piano is a multimodal musical activity, AirPiano is designed to orchestrate the visual, tactile, and auditory modalities during keypress actions. Collisions between the fingers of the hand model and AirPiano's virtual keys are detected by the Unity game engine at 120 Hz. Whenever a collision is detected on a key, the key moves down just enough to remove the overlap between the finger and the key; collisions on the other faces of a key are ignored. Simultaneous contacts between multiple fingers and keys are also handled in real time.
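The per-key update rule above can be sketched as a small state machine. The actual collision resolution runs inside the Unity engine; the class, its frame-based API, and the overlap representation here are illustrative assumptions, while the 13.3 mm stroke depth and the top-face-only rule come from the text.

```python
STROKE_DEPTH_MM = 13.3  # key travel, as modeled in AirPiano

class VirtualKey:
    """Minimal per-frame key state, assuming the collision system reports
    the finger/key overlap depth and which face was hit."""

    def __init__(self):
        self.depression_mm = 0.0  # 0 = rest, STROKE_DEPTH_MM = bottom

    def update(self, overlap_mm, from_top):
        if not from_top:
            return self.depression_mm  # contacts on other faces are ignored
        # Depress just enough to remove the finger/key overlap,
        # clamped to the key's stroke depth.
        self.depression_mm = min(max(overlap_mm, 0.0), STROKE_DEPTH_MM)
        return self.depression_mm
```

Running one instance per key lets simultaneous contacts on multiple keys be handled independently each frame.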
We designed two haptic rendering methods for keypress feedback: Constant Feedback and Adaptive Feedback. The figure above shows the feedback intensities of the two methods during a keypress. Constant Feedback is designed to provide clear and confident feedback for short, intense keypress actions: the haptic feedback is rendered at maximum intensity during the entire time the key is pressed. Adaptive Feedback is designed to follow the changes in feedback of a real keypress, simulating the behavior of piano keys more realistically for an enriched experience. In the keypress mechanism of a real piano, the stiff surface of a key is felt when the fingertip first touches it. During the keypress, the weight of the key creates an upward kinesthetic resistive force. When the key reaches its limit, higher stiffness and vibration are felt together with the piano sound.
In Adaptive Feedback, a short decaying vibration is rendered immediately after a finger makes contact with a key. The initial intensity is selected from a range of 50–100% of the maximum intensity, in proportion to the speed at which the fingertip hits the key, and then decays linearly over 200 ms down to a floor of 50%. In this way, the user feels varying levels of vibration depending on how fast their fingers strike AirPiano's keys. During the key travel, the vibration intensity is fixed at 50%. When the key hits the bottom, the intensity is raised to 100% to communicate higher stiffness. As the key moves back up to its rest position, the intensity returns to the 50% level. The intensity level and the pressure at a focal point were linearly related, with a small offset. Since the emitting power of the haptic display was fully utilized with one or two contact points, pressing more than two keys simultaneously degraded the individual haptic stimuli.
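The two rendering schemes can be summarized as intensity functions over the keypress. This is a sketch of the rules stated above; the phase names and the normalized-speed parameter are illustrative assumptions, while the 50–100% range, 200 ms decay, and per-phase levels come from the text.

```python
DECAY_MS = 200   # duration of the initial decaying vibration
FLOOR = 0.5      # 50% intensity during key travel and release

def initial_intensity(strike_speed_norm):
    """Map a normalized fingertip speed in [0, 1] to 50-100% intensity."""
    return 0.5 + 0.5 * min(max(strike_speed_norm, 0.0), 1.0)

def adaptive_intensity(phase, t_ms=0.0, strike_speed_norm=1.0):
    """Intensity (0-1) for Adaptive Feedback.

    phase: 'contact' (decaying burst), 'travel', 'bottom', or 'release'.
    """
    if phase == 'contact':
        start = initial_intensity(strike_speed_norm)
        frac = min(t_ms / DECAY_MS, 1.0)
        return start + (FLOOR - start) * frac  # linear decay to the 50% floor
    if phase == 'bottom':
        return 1.0  # full intensity communicates higher stiffness
    return FLOOR    # 'travel' and 'release' stay at 50%

def constant_intensity(pressed):
    """Constant Feedback: maximum intensity while the key is pressed."""
    return 1.0 if pressed else 0.0
```

A slow strike (speed near 0) starts the contact burst at 50% and stays flat, while a fast strike starts at 100% and decays to 50% over 200 ms.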
Auditory feedback in AirPiano was designed to match the visual and tactile cues of virtual keypress actions, following the same mechanism as a real-world piano. Although the visual and tactile responses change continuously during a keypress, AirPiano produces a sound only when a key reaches its limit and hits the bottom above a certain speed. We used the grand piano sounds from SimplePiano (by Doug Cox). The fade-out duration of a sound was set proportional to the length of time the key is held at the bottom. There was no noticeable asynchrony among the visual, haptic, and auditory feedback in AirPiano.
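The sound-triggering and fade-out rules can be sketched as follows. The speed threshold and the proportionality constant are illustrative assumptions; the trigger condition (bottom hit above a certain speed) and the proportional fade-out come from the text.

```python
def should_play(hit_bottom, strike_speed, threshold=0.3):
    """Trigger a note only when the key hits the bottom above a certain
    speed; the threshold value here is an assumed placeholder."""
    return hit_bottom and strike_speed > threshold

def fadeout_duration_ms(hold_ms, k=2.0):
    """Fade-out length proportional to how long the key is held at the
    bottom; the constant k is an assumption, not a reported value."""
    return k * hold_ms
```

This mirrors a real piano's behavior: a slow, incomplete press produces no sound, and releasing a briefly held key cuts the note off sooner than releasing a sustained one.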