Interacting Automultiscopic 3D with Haptic Paint Brush in Immersive Room
We propose an interactive artwork system combining automultiscopic 3D and a haptic paintbrush in an immersive room. Our system consists of an 81-view automultiscopic display, a handheld haptic paintbrush, and a large-scale color palette station in a CAVE-like cubic room filled with the visual scene of the artwork. The 81-view rendering and multiplexing are performed by arranging virtual cameras in an off-axis layout. The haptic paintbrush is designed and implemented with a 2D array of piezoelectric actuators. It provides tactile feedback of the spatial distance between the virtual brush and a distal 3D object shown on the automultiscopic display, enabling precise control of the brush while interacting with automultiscopic 3D. We demonstrate a proof-of-concept system that integrates a classic artwork into an innovative interactive experience using novel multimedia and interaction technologies, and we evaluate the performance and usability of the handheld haptic brush in enhancing the user experience.
Artwork Painting Room System
Our system is installed inside a room whose walls are covered with the artwork using three projectors, one per wall (see Figure 2; the projectors are not visible in this figure). The room is 234 cm wide and 400 cm long, with a height of 223 cm. Inside the room, a large-scale color palette station is placed for color selection, and the haptic brush is installed next to the palette station. An 81-view automultiscopic display is embedded in the front wall, which faces the palette station. The user picks up the brush at the palette station, selects a color, and paints in the air while perceiving realistic 3D through the automultiscopic display in the immersive room. After the painting task is completed, the artwork image projected onto the walls animates itself to enhance the user's visual experience.
The artwork painting room platform consists of the Main Server, Multi-view Rendering Server, Automultiscopic Display, Color Palette Station, Haptic Feedback System, and Haptic Paint Brush. In this subsection, we discuss how these components are connected and communicate with one another, the role of each server, and the entire pipeline in detail.
Haptic Interaction with Handheld Brush
We designed and implemented a handheld haptic paintbrush for interacting with automultiscopic 3D objects. Our approach adopts the spatial gesture of air painting: the user holds and moves the handheld brush in 3D space while receiving haptic feedback. Since an automultiscopic display requires a certain viewing distance between the display and the user's eyes to fully enjoy glasses-free 3D content (2.5 m in our system), a paintbrush with haptic feedback is a natural choice for our system. In this section, we discuss how the haptic feedback system and the haptic paintbrush are designed and implemented, and how haptic feedback is provided when the user paints on a 3D object in the air.
A. HAPTIC PAINT BRUSH
The haptic paint brush consists of three parts: head, button, and handle (see Figure 7(a)). The brush head has a convex shape to resemble a paintbrush bristle. We used a semitransparent plastic material for the brush head so that it transmits the light from the color LED inside the brush. The round button is located just above the handle so that it can be reached and pressed by the thumb while the user's hand holds the handle. The handle is a pentagonal cylinder covered with a rubber-like material. Underneath the cover, the handle carries 5 × 5 piezoelectric (piezo for short) actuators that deliver haptic feedback to the user's palm. Details of the haptic feedback are discussed in the next subsection. The total height of the brush is 280 mm and its inner circumference is 160 mm.
Inside the brush, a control circuit is placed in the middle. The circuit contains a microcontroller unit (STM32F407; ARM Cortex-M4 core with FPU, frequency up to 168 MHz, 210 DMIPS/1.25 DMIPS/MHz, DSP instructions; STMicroelectronics, Switzerland), a color LED (LS5050RGB, emitting color: RGB, China), an Ethernet network adapter, and a push-button. The 5 × 5 piezo actuators are mounted on the outer surface of the handle (then covered with the rubber-like material) and are not part of the circuit board in the brush; instead, they are connected directly to the haptic feedback system. The microcontroller unit (MCU) controls all the components on the circuit board. The color LED indicates which color the brush is currently using. In its basic operation, the brush continuously sends packets containing a timestamp and the button state to the main server, and it receives packets from the server carrying the selected color so that it can change the color of the brush tip.
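The brush-to-server exchange above can be sketched as a simple status packet. The paper only states that each packet carries a timestamp and the button state; the byte layout below (little-endian, 8-byte timestamp plus 1-byte button flag) is an illustrative assumption, not the actual wire format.

```python
import struct
import time

# Hypothetical layout for the brush status packet: an 8-byte float
# timestamp followed by a 1-byte button flag (little-endian).
STATUS_FMT = "<dB"

def pack_status(button_pressed, timestamp=None):
    """Encode one brush status packet for the main server."""
    ts = time.time() if timestamp is None else timestamp
    return struct.pack(STATUS_FMT, ts, 1 if button_pressed else 0)

def unpack_status(packet):
    """Decode a brush status packet back into (timestamp, button_pressed)."""
    ts, pressed = struct.unpack(STATUS_FMT, packet)
    return ts, bool(pressed)
```

The color packet sent back by the server could be encoded the same way, e.g. as three RGB bytes, so the MCU can update the LED in the brush tip.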
B. HAPTIC FEEDBACK SYSTEM
As shown in Figure 7(b), the haptic feedback system consists of an MCU (STM32F407), a two-channel DAC module (MCP4902, dual 8-bit voltage-output DAC, Microchip Technology Inc., USA), piezo amplifiers (PDu100B, PiezoDrive, Australia), reed relays (DIP05-2A72-21D, Standex-Meder Electronics, USA), and piezo discs (7BB-20-6; resonant frequency: 6.3 kHz, capacitance: 10 nF, plate size: 20 mm, element size: 14 mm, plate material: brass; Murata, Japan). As mentioned in the earlier section, the 2D array of 5 × 5 piezo actuators is mounted around the handle of the brush and connected to this haptic feedback system; the rest of the components are implemented within the haptic feedback system.
The MCU stores the haptic waveform pattern information. Based on the command packets received from the main server, the MCU instructs the DAC module over SPI to generate waveform signals according to the pattern information. The DAC module converts these signals to analog and delivers them to the piezo amplifiers. Depending on the operation mode (either row or column mode), the amplified signals are routed to the piezo actuators via the relays. When the brush moves through 3D space as the user paints in the air, haptic feedback is generated on the actuators mounted around the handle of the brush. In this way, the user feels the feedback across the palm of the holding hand, conveying information about the painted area mapped to the brush location in the 3D world coordinate system. The brush can render different sensations by changing the amplitude, frequency, envelope, and duration of the haptic feedback signal mapped to each row (or column) of piezo actuators.
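The four waveform parameters above (amplitude, frequency, envelope, duration) can be sketched as a small signal generator. The sample rate and the linear decay envelope below are illustrative assumptions; the actual pattern tables stored on the MCU are not specified in the paper.

```python
import math

def haptic_waveform(amplitude, frequency_hz, duration_s,
                    sample_rate=8000, envelope="ramp"):
    """Generate one row's drive signal as a list of samples in [-amplitude, amplitude].

    amplitude, frequency_hz, envelope, and duration_s stand in for the four
    per-row (or per-column) parameters the system varies; "ramp" applies a
    linear decay so the vibration fades out over the pulse.
    """
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        env = 1.0 - t / duration_s if envelope == "ramp" else 1.0
        samples.append(amplitude * env * math.sin(2 * math.pi * frequency_hz * t))
    return samples
```

In the real system each generated buffer would be streamed to the DAC over SPI and switched to the active row or column of actuators through the reed relays.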
C. HAPTIC INTERACTION WITH AUTOMULTISCOPIC 3D
Depth perception and motion parallax are visualized and presented better on an automultiscopic display than on an ordinary display. However, it is difficult to interact precisely with a 3D object using a handheld device, because the object is displayed away from the user and the visual cue of 3D depth is therefore strictly limited, given that a certain viewing distance is required to visualize automultiscopic content. For this reason, we provide the 3D depth information of an object through the brush with haptic feedback.
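One minimal way to render this depth cue is to map the distance between the tracked brush tip and the nearest point of the virtual object to a feedback amplitude. The linear mapping and the 0.5 m cutoff below are illustrative assumptions; the paper states only that spatial distance information is delivered as tactile feedback.

```python
import math

def depth_feedback_amplitude(brush_pos, target_pos, max_distance=0.5):
    """Map the 3D distance between the brush tip and the target surface point
    to a normalized haptic amplitude in [0, 1]: silent beyond max_distance
    (meters, an assumed cutoff), strongest at contact.
    """
    d = math.dist(brush_pos, target_pos)
    if d >= max_distance:
        return 0.0
    return 1.0 - d / max_distance
```

The resulting amplitude could then drive the waveform generator for the row or column of actuators closest to the painting direction, so the grip vibrates more strongly as the brush approaches the displayed object.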