I am an Assistant Professor in the Department of Computer Science and the Director of the Multimodal Interaction Lab (MI Lab) at the University of Texas at Dallas. My research strives to create novel haptic interactions that tackle technical and human-factors challenges, amplifying human satisfaction through enriched user experiences. My goal is to establish an interdisciplinary research collaboration program that solves real-world challenges in haptic interaction and addresses issues of human perception and multimodality in immersive environments. I received my Ph.D. in Electrical and Computer Engineering (Advisor: Dr. Hong Z. Tan) and an M.S. in Computer Science, both from Purdue University. I also received an M.S. and a B.S. in Electrical and Computer Engineering from Hanyang University, Seoul, Korea. Prior to my current position, I was a Staff Researcher at Alibaba Group in Sunnyvale, California, and a Senior Researcher at the Electronics and Telecommunications Research Institute (ETRI). I also worked at Microsoft Research Asia (Beijing, China), Samsung Advanced Institute of Technology (Yongin, Korea), and NHK Science and Technology Research Laboratories (Tokyo, Japan) as a research intern (CV).
MetaTwin is a collaborative Metaverse platform that supports one-to-one spatiotemporal synchrony between physical and virtual spaces. Users can interact with other users and with surrounding IoT devices without being tied to a physical space. Resource sharing allows users to share media, including presentation slides and music. We deploy MetaTwin in two network environments (within the US, and internationally between Korea and the US) and summarize users’ feedback about the experience. [Paper | Video 1 | Video 2]
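As a rough illustration of the synchrony idea, the sketch below shows the kind of pose-update message a client could broadcast so that remote peers and mirrored IoT devices stay aligned with a shared room frame. This is a minimal sketch, not MetaTwin's actual protocol; every field name is hypothetical.

```python
# Minimal sketch (not the published MetaTwin implementation) of a pose
# update a client might broadcast for one-to-one spatiotemporal sync.
# All field names are hypothetical.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SyncUpdate:
    user_id: str
    position: tuple   # (x, y, z) in the shared room coordinate frame
    rotation: tuple   # quaternion (x, y, z, w)
    timestamp: float  # sender clock; a relay server reconciles latency

def encode_update(user_id, position, rotation):
    """Serialize one pose update; a relay server would rebroadcast it
    to remote peers and mirrored IoT devices."""
    msg = SyncUpdate(user_id, position, rotation, time.time())
    return json.dumps(asdict(msg)).encode("utf-8")

if __name__ == "__main__":
    print(encode_update("alice", (1.2, 0.0, -0.4), (0, 0, 0, 1)))
```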
We design a proof-of-concept thermo-tactile feedback system with an open-top chamber, heat modules, and an ultrasound display. Our approach provides heated airflow along the path to the focused pressure point created by the ultrasound display, generating thermal and vibrotactile cues in mid-air simultaneously. [Paper | Video]
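One way to think about the "simultaneous" part is as a timing-alignment problem: heat travels much more slowly than the ultrasound focal point can be rendered. The sketch below assumes (hypothetically) a constant mean airflow speed and computes how much earlier to trigger the airflow; the speed constant is illustrative, not a measured system parameter.

```python
# A minimal timing sketch, assuming a roughly constant airflow speed from
# the heat module's nozzle toward the ultrasound focal point. The airflow
# is triggered early by the travel time so thermal and vibrotactile cues
# land together. Numbers are illustrative only.
import math

AIRFLOW_SPEED_M_S = 2.0  # assumed mean airflow speed (illustrative)

def airflow_lead_time(nozzle_pos, focal_pos):
    """Seconds by which to lead the airflow so heat reaches the focal
    point when the ultrasound pressure point is rendered."""
    dist = math.dist(nozzle_pos, focal_pos)
    return dist / AIRFLOW_SPEED_M_S

if __name__ == "__main__":
    # nozzle at the chamber edge, focal point ~25 cm away above the array
    print(f"lead time: {airflow_lead_time((0.0, 0.0, 0.0), (0.1, 0.2, 0.1)):.3f} s")
```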
We build an interactive 3D data visualization tool that combines hand gestures and mid-air ultrasound haptics to provide tangible interaction with 3D data in VR. We consider two types of 3D visualization datasets and provide different data encoding methods for their haptic representations. [Paper 1 | Paper 2 | Video 1 | Video 2]
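To make "haptic data encoding" concrete, the sketch below shows one possible mapping for a scalar-field dataset: the voxel value under the fingertip is mapped linearly to ultrasound output intensity. This is a minimal sketch under that assumption, not the tool's actual encoding; names and ranges are hypothetical.

```python
# A minimal sketch of one possible haptic data encoding, assuming a scalar
# field sampled on a 3D grid: the voxel value nearest the fingertip is
# mapped linearly to a [0, 1] ultrasound intensity command.
import numpy as np

def haptic_intensity(volume, vmin, vmax, fingertip_idx):
    """Map the voxel value under the fingertip to a normalized
    intensity command for the ultrasound array's focal point."""
    value = volume[fingertip_idx]
    return float(np.clip((value - vmin) / (vmax - vmin), 0.0, 1.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    volume = rng.random((32, 32, 32))  # stand-in scalar dataset
    print(haptic_intensity(volume, 0.0, 1.0, (10, 20, 5)))
```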
We empirically explore fundamental requirements for achieving VR in-air typing by observing the unconstrained, eyes-free in-air typing of touch typists. We examine properties of finger kinematics, correlated movements of the fingers, interrelations between consecutive keystrokes, and the 3D distribution of keystroke movements. We further test finger kinematic features, including 3D position, velocity, and acceleration, and temporal features, including previous fingers and keys. Based on this analysis, we assess the performance of various classifiers, including Naive Bayes, Random Forest, Support Vector Machines, and Deep Neural Networks, in terms of keystroke classification accuracy. [Paper 1 | Paper 2 | Video]
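The comparison setup can be sketched in a few lines: feature vectors built from per-finger kinematics go into off-the-shelf classifiers, which are compared by held-out accuracy. The sketch below uses synthetic stand-in data and a hypothetical feature layout, not the study's recorded typist data or its actual pipeline.

```python
# A minimal sketch of the classifier comparison, not the study's pipeline:
# kinematic feature vectors (e.g., per-finger position, velocity, and
# acceleration) are fed to several classifiers and compared by accuracy.
# The data below is synthetic; real features come from tracked typists.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 90))     # e.g., 10 fingers x (pos+vel+acc) x 3 axes
y = rng.integers(0, 26, size=2000)  # 26 letter keys as class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
for clf in (GaussianNB(), RandomForestClassifier(n_estimators=100), SVC()):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, accuracy_score(y_te, clf.predict(X_te)))
```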
Refinity is interactive holographic signage for a new retail shopping experience. In this project, we present a concept of a futuristic shopping experience with a tangible 3D mid-air interface that allows customers to directly select and explore realistic virtual products, using an autostereoscopic 3D display combined with mid-air haptics and a finger tracker. We also present an example in-store shopping scenario for natural interactions with 3D content. This shopping experience engages users by merging digital and physical interactions into a memorable in-store experience. [Paper | Video]
RealWalk is a pair of haptic shoes for HMD-based VR, designed to create realistic sensations of ground-surface deformation and texture through MR fluid actuators. RealWalk offers a novel interaction scheme based on the physical interaction between the shoes and the ground while walking in VR. Each shoe consists of two MR fluid actuators, an insole pressure sensor, and a foot position tracker. When a user steps on the ground, the two MR fluid actuators are depressed, and changing the fluid's viscosity creates a variety of ground-material deformations such as snow, mud, and dry sand. We build an interactive VR application and compare RealWalk with vibrotactile haptic shoes to investigate its effectiveness. [Paper 1 | Paper 2 | Paper 3 | Video]
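The sketch below illustrates one way such a material rendering could be parameterized: each ground material gets a simple resistance profile, and normalized insole pressure is mapped to an MR actuator coil current. The profiles and the pressure-to-current mapping are illustrative assumptions, not RealWalk's actual control law.

```python
# A minimal control sketch, assuming (hypothetically) that each MR fluid
# actuator's resistance is commanded via coil current, and that each ground
# material is approximated by a base resistance plus a pressure-dependent
# gain. All values are illustrative only.

MATERIAL_PROFILES = {
    # material: (base_current_A, gain_A_per_normalized_pressure)
    "snow":     (0.10, 0.50),  # soft, stiffens as it compacts
    "mud":      (0.30, 0.20),  # steady drag
    "dry_sand": (0.05, 0.35),  # gives way easily
}

def coil_current(material, insole_pressure):
    """Map normalized insole pressure [0, 1] to an MR actuator coil
    current that mimics the chosen material's resistance."""
    base, gain = MATERIAL_PROFILES[material]
    return base + gain * max(0.0, min(1.0, insole_pressure))

if __name__ == "__main__":
    for p in (0.0, 0.5, 1.0):
        print(p, {m: round(coil_current(m, p), 3) for m in MATERIAL_PROFILES})
```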
Touch3D is an interactive mobile platform that provides realistic viewing and touching experiences through glasses-free 3D visualization with electrovibration. Touch3D is designed to take advantage of both visual and tactile illusions to maximize the multimodal experience in touchscreen interaction. We seamlessly integrate two technologies, an automultiscopic 3D display and an electrovibration display, weaving both hardware and software into one fluid interface. Our museum application built on Touch3D demonstrates important implications for improving 3D perception in both the visual and tactile modalities for enhanced touchscreen interaction. [Paper | Video]
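For a sense of how electrovibration textures are typically rendered, the sketch below modulates a drive signal by finger position so a virtual grating stays spatially fixed under the sliding finger (its felt temporal frequency then scales with finger velocity). This is a generic sketch under that assumption, not Touch3D's actual rendering code; the grating period and amplitude are hypothetical.

```python
# A minimal sketch of electrovibration texture rendering, assuming the
# friction the finger feels tracks the amplitude of a high-voltage drive
# signal. The drive is modulated by finger position so the virtual grating
# feels spatially fixed; values are illustrative only.
import math

def drive_sample(finger_x_m, spatial_period_m=0.002, amplitude=1.0):
    """Amplitude-modulated drive value for the current finger position,
    rendering a virtual grating with the given spatial period."""
    phase = 2 * math.pi * finger_x_m / spatial_period_m
    return amplitude * 0.5 * (1 + math.sin(phase))  # unipolar modulation

if __name__ == "__main__":
    for x_mm in range(0, 6):
        print(x_mm, round(drive_sample(x_mm / 1000.0), 3))
```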
AirPiano is a music-playing system that provides touchable experiences in HMD-based virtual reality with mid-air haptic feedback. AirPiano allows users to enjoy enriched virtual piano-playing experiences with touchable keys in the air. We implement two haptic rendering schemes that mimic the resisting force of piano keys using ultrasonic vibrations: Constant Feedback provides short and intense feedback, whereas Adaptive Feedback follows the changes in feedback of a real keypress. [Paper | Video]
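The contrast between the two schemes can be sketched as two intensity-versus-displacement functions, as below. The adaptive profile here is an illustrative stand-in for a keypress-like force curve, not the paper's measured piano key profile, and the displacement thresholds are hypothetical.

```python
# A minimal sketch contrasting the two rendering schemes, assuming ultrasound
# output is commanded as a normalized intensity over a normalized key
# displacement in [0, 1]. Thresholds and the adaptive curve are illustrative.

def constant_feedback(displacement, onset=0.2):
    """Short, intense burst: full intensity once the key passes the
    onset depth, nothing otherwise."""
    return 1.0 if displacement >= onset else 0.0

def adaptive_feedback(displacement):
    """Intensity that follows a keypress-like profile: rising resistance,
    a drop past a let-off point, then bottom-out."""
    if displacement < 0.6:
        return displacement / 0.6  # ramp up as the key resists
    if displacement < 0.8:
        return 0.3                 # let-off: resistance falls away
    return 1.0                     # bottom-out

if __name__ == "__main__":
    for d in (0.0, 0.3, 0.6, 0.7, 0.9):
        print(d, constant_feedback(d), round(adaptive_feedback(d), 2))
```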
We propose an interactive artwork system with automultiscopic 3D and a haptic paint brush in an immersive room. Our system consists of an 81-view automultiscopic display, a handheld haptic paint brush, and a large-scale color palette station in a CAVE-like cubic room filled with the visual scene of the artwork. The 81-view rendering and multiplexing is achieved by setting up virtual cameras in an off-axis layout. The haptic paint brush is designed and implemented using a 2D array of piezoelectric actuators. It provides tactile feedback encoding the spatial distance between the virtual brush and a distal 3D object displayed on the automultiscopic display, enabling precise control of the brush while interacting with automultiscopic 3D content. [Paper]
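A minimal form of the distance-to-tactile mapping is sketched below: drive amplitude rises as the virtual brush approaches the distal object and peaks at contact. The range constant is hypothetical, and the real system renders spatial patterns across the 2D piezo array, which this sketch omits.

```python
# A minimal sketch of the distance cue, assuming the piezo array's drive
# amplitude grows as the virtual brush nears the distal 3D object.
# The working range is illustrative, not a system parameter.

def brush_cue_amplitude(distance_m, max_range_m=0.5):
    """Normalized drive amplitude: 0 beyond max_range_m, 1 at contact."""
    d = max(0.0, min(distance_m, max_range_m))
    return 1.0 - d / max_range_m

if __name__ == "__main__":
    for d in (0.6, 0.5, 0.25, 0.05, 0.0):
        print(d, round(brush_cue_amplitude(d), 2))
```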