About me

I am an Assistant Professor in the Department of Computer Science and the Director of the Multimodal Interaction Lab (MI Lab, https://mi-lab.io) at the University of Texas at Dallas. My research strives to create novel haptic interactions that tackle technical and human-factors challenges, amplifying human satisfaction through enriched user experiences. My goal is to establish an interdisciplinary research collaboration program that solves real-world challenges in haptic interaction and addresses issues of human perception and multimodality in immersive environments.

I received my Ph.D. in Electrical and Computer Engineering (Advisor: Dr. Hong Z. Tan) and an M.S. in Computer Science, both from Purdue University. I also received an M.S. and a B.S. in Electrical and Computer Engineering from Hanyang University, Seoul, Korea. Prior to my current position, I was a Staff Researcher at Alibaba Group in Sunnyvale, California, and a Senior Researcher at the Electronics and Telecommunications Research Institute (ETRI). I also worked as a research intern at Microsoft Research Asia (Beijing, China), Samsung Advanced Institute of Technology (Yongin, Korea), and NHK Science and Technology Research Laboratories (Tokyo, Japan) (CV).

Research Area

Let It Snow: Designing Snowfall Experience in VR

We present Snow, a cross-modal interface that integrates cold and tactile stimuli in mid-air to create snowflakes and raindrops for VR experiences. Snow uses six Peltier packs and an ultrasound haptic display to create unique cold-tactile sensations, letting users experience catching snowflakes and feeling raindrops on their bare hands. Our approach exploits humans' ability to identify tactile and cold stimuli without masking each other when both are projected onto the same location on the skin, creating illusions of snowflakes and raindrops. We tightly couple the visual and haptic renderings to present melting snow and rain droplets for realistic visuo-tactile experiences. To render multiple snowflakes and raindrops, we propose an aggregated haptic scheme that simulates heavy snowfall and rainfall environments with many visual particles. The results show that the aggregated haptic rendering scheme delivers a more realistic experience than the alternative schemes, and we confirm that providing cold-tactile cues enhances the user experience in both conditions compared to other modality conditions. [Video]
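The core idea of an aggregated scheme can be sketched in a few lines: rather than rendering one focal point per visual particle (infeasible for heavy snowfall), the contributions of all particles near the hand are pooled into a single drive intensity. This is only an illustrative sketch; the function name, linear falloff model, and parameters are assumptions, not the published rendering algorithm.

```python
# Illustrative sketch of an "aggregated" haptic scheme (assumed model):
# pool per-particle contributions near the hand into one intensity value.

def aggregated_intensity(particles, hand_pos, radius=0.05, max_intensity=1.0):
    """Pool contributions of particles within `radius` (m) of the hand.

    particles: iterable of (x, y, z) particle positions.
    hand_pos:  (x, y, z) hand position.
    Returns a single drive intensity clipped to [0, max_intensity].
    """
    total = 0.0
    for p in particles:
        d = sum((a - b) ** 2 for a, b in zip(p, hand_pos)) ** 0.5
        if d < radius:
            total += 1.0 - d / radius  # linear falloff with distance
    return min(total, max_intensity)
```

Under this sketch, a dense flurry of nearby particles saturates the ultrasound focal point at full intensity, while isolated distant flakes contribute little, which matches the intuition of rendering heavy snowfall with many visual particles.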

Fabric Thermal Display using Ultrasonic Waves

This paper presents a fabric-based thermal display that combines polyester with thermally conductive materials, driven by an ultrasound haptic display. We first empirically test the thermal generation process in five fabric materials by applying 40 kHz ultrasonic waves to each. We also examine their thermal characteristics under different frequencies and amplitudes of ultrasonic cues, and show that polyester demonstrates the best thermal performance. We then combine it with thermally conductive materials, including copper and aluminum, and compare these combinations with the fabric-only condition. Finally, we integrate polyester with aluminum into a glove to explore use cases in VR, and share our findings, insights, limitations, and future work. [Paper | Video]

Upper Body Thermal Referral and Tactile Masking for Localized Feedback

This paper investigates thermal referral and tactile masking illusions for achieving localized thermal feedback on the upper body. We use a 2D array of sixteen vibrotactile actuators (4×4) with four thermal actuators to explore the thermal distribution on the user's back. A combination of thermal and tactile sensations is delivered to establish the distributions of thermal referral illusions with different numbers of vibrotactile cues. The results confirm that localized thermal feedback can be achieved through cross-modal thermo-tactile interaction on the user's back. [Paper | Video]

MetaTwin: Synchronizing Physical and Virtual Spaces for Seamless World

MetaTwin is a collaborative Metaverse platform that supports one-to-one spatiotemporal synchrony between physical and virtual spaces. Users can interact with other users and surrounding IoT devices without being tied to a physical space. Resource sharing allows users to share media, including presentation slides and music. We deploy MetaTwin in two different network environments (i.e., within the US and internationally between Korea and the US) and summarize users' feedback about the experience. [Paper | Video 1 | Video 2]

Mid-Air Thermo-Tactile Interaction in VR

We design a proof-of-concept thermo-tactile feedback system with an open-top chamber, heat modules, and an ultrasound display. Our approach provides heated airflow along the path to the focused pressure point created by the ultrasound display, generating thermal and vibrotactile cues in mid-air simultaneously. [Paper | Video]


We build an interactive 3D data visualization tool that combines hand gestures with mid-air ultrasound haptic feedback to provide tangible interaction with 3D data visualizations in VR. We consider two types of 3D visualization datasets and provide different data-encoding methods for their haptic representations. [Paper 1 | Paper 2 | Video 1 | Video 2]

In-Air Text Input Technique

We empirically explore fundamental requirements for achieving VR in-air typing by observing the unconstrained, eyes-free in-air typing of touch typists. We examine properties of finger kinematics, correlated movements of fingers, interrelations between consecutive keystrokes, and the 3D distribution of keystroke movements. We further test finger kinematic features, including 3D position, velocity, and acceleration, as well as temporal features, including previous fingers and keys. Based on this analysis, we assess the performance of various classifiers, including Naive Bayes, Random Forest, Support Vector Machines, and Deep Neural Networks, in terms of keystroke classification accuracy. [Paper 1 | Paper 2 | Video]
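The pipeline above, kinematic features per keystroke fed to a classifier, can be sketched as follows. This is a minimal illustration, not the published implementation: the feature choices are simplified from those listed above, and a nearest-centroid classifier stands in for the heavier models (Random Forest, SVM, DNN) studied in the papers.

```python
import numpy as np

def keystroke_features(traj, dt=1 / 120):
    """Build a feature vector from one keystroke's fingertip trajectory.

    traj: (T, 3) array of 3D fingertip positions sampled at 1/dt Hz.
    Returns the position at the stroke's lowest point (the key strike)
    plus mean speed and mean acceleration magnitude, echoing the
    position/velocity/acceleration features described above.
    """
    vel = np.diff(traj, axis=0) / dt           # (T-1, 3) finite-difference velocity
    acc = np.diff(vel, axis=0) / dt            # (T-2, 3) finite-difference acceleration
    strike = traj[np.argmin(traj[:, 1])]       # position at lowest height (assumed y-down strike)
    return np.concatenate([
        strike,
        [np.linalg.norm(vel, axis=1).mean()],
        [np.linalg.norm(acc, axis=1).mean()],
    ])

class NearestCentroid:
    """Toy stand-in for the paper's classifiers: predicts the key whose
    mean feature vector is closest to the query."""

    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = np.array(
            [X[np.array(y) == c].mean(axis=0) for c in self.labels]
        )
        return self

    def predict(self, X):
        # Distance from each query to each class centroid, pick the nearest.
        d = np.linalg.norm(X[:, None, :] - self.centroids[None], axis=2)
        return [self.labels[i] for i in d.argmin(axis=1)]
```

In practice one would swap `NearestCentroid` for a Random Forest, SVM, or deep network trained on the full feature set, but the fit/predict shape of the problem is the same.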


Refinity is interactive holographic signage for a new retail shopping experience. In this project, we show a concept of a futuristic shopping experience with a tangible 3D mid-air interface that allows customers to directly select and explore realistic virtual products using an autostereoscopic 3D display combined with mid-air haptics and finger tracking. We also present an example in-store shopping scenario for natural interaction with 3D content. By merging digital and physical interactions, this shopping experience engages users and produces a memorable in-store experience. [Paper | Video]


RealWalk is a pair of haptic shoes for HMD-based VR, designed to create realistic sensations of ground-surface deformation and texture through MR fluid actuators. RealWalk offers a novel interaction scheme based on the physical interaction between the shoes and the ground while walking in VR. Each shoe consists of two MR fluid actuators, an insole pressure sensor, and a foot position tracker. When a user steps on the ground, the two MR fluid actuators are depressed and, by changing the fluid's viscosity, create a variety of ground-material deformations such as snow, mud, and dry sand. We build an interactive VR application and compare RealWalk with vibrotactile-based haptic shoes to investigate its effectiveness. [Paper 1 | Paper 2 | Paper 3 | Video]


Touch3D is an interactive mobile platform that provides realistic viewing and touching experiences through glasses-free 3D visualization with electrovibration. Touch3D is designed to exploit both visual and tactile illusions to maximize the multimodal experience in touchscreen interaction. We seamlessly integrate two technologies, an automultiscopic 3D display and an electrovibration display, weaving both hardware and software into one fluid interface. Our museum application built on Touch3D demonstrates how 3D perception can be improved in both the visual and tactile modalities for enhanced touchscreen interaction. [Paper | Video]


AirPiano is a music-playing system that provides touchable experiences in HMD-based virtual reality with mid-air haptic feedback. AirPiano allows users to enjoy an enriched virtual piano-playing experience with touchable keys in the air. We implement two haptic rendering schemes that mimic the resisting force of piano keys using ultrasonic vibrations: Constant Feedback provides short and intense feedback, whereas Adaptive Feedback follows the changes in feedback from a real keypress. [Paper | Video]

Interactive 3D Painting with Haptic Brush in Immersive Room

We propose an interactive artwork system with automultiscopic 3D and a haptic paint brush in an immersive room. Our system consists of an 81-view automultiscopic display, a handheld haptic paint brush, and a large-scale color-palette station in a CAVE-like cubic room filled with the visual scene of the artwork. The 81-view rendering and multiplexing technology is applied by setting up the virtual cameras in an off-axis layout. The haptic paint brush is designed and implemented using a 2D array of piezoelectric actuators. It provides tactile feedback about the spatial distance between the virtual brush and a distal 3D object shown on the automultiscopic display, enabling precise control of the brush while interacting with the automultiscopic 3D content. [Paper]