Use of Haptics for the Enhanced Museum Website: USC Interactive Art Museum

Our mission for the Enhanced Museum project is to explore new technologies for the exhibition of three-dimensional art objects (Goldberg, Bekey, Akatsuka, and Bressanelli, 1997; McLaughlin, 1998; McLaughlin, Goldberg, Ellison, and Lucas, 1999; McLaughlin and Osborne, 1997; Schertz, Jaskowiak, and McLaughlin, 1997). Although it is not yet commonplace, a few museums are exploring methods for 3D digitization of priceless artifacts and objects from their sculpture and decorative arts collections, making the images available via CD-ROM or in-house kiosks. For example, the Canadian Museum of Civilization has collaborated with Ontario-based Hymarc to use the latter's ColorScan 3D laser camera to create three-dimensional models of more than fifty objects from the museum's collection (Canarie, Inc., 1998; Shulman, 1998). A similar partnership has been formed between the Smithsonian Institution and Synthonic Technologies, a Los Angeles-area company. At Florida State University, the Department of Classics is working with a team to digitize Etruscan artifacts using the RealScan 3D imaging system from Real 3D (Orlando, Florida), and art historians from Temple University are collaborating with researchers from the Watson Research Laboratory's visual and geometric computing group to create a model of Michelangelo's Pietà with the Virtuoso shape camera from Visual Interface (Shulman, 1998).
In collaboration with our colleagues at USC's accredited art museum, the Fisher Gallery, our IMSC team is developing an application for the Media Immersion Environment that will not only permit museum visitors to examine and manipulate digitized three-dimensional art objects visually, but will also allow visitors to interact remotely, in real time, with museum staff members to engage in joint tactile exploration of the works of art. Our team believes that the hands-off policies that museums must impose limit appreciation of three-dimensional objects, where full comprehension and understanding rely on the sense of touch as well as vision. Haptic interfaces will allow fuller appreciation of three-dimensional objects without jeopardizing conservation standards. Our goal is to assist museums, research institutes, and other conservators of priceless objects in providing the public with a vehicle for object exploration, in a modality that could not otherwise be permitted. Our initial application will be to a wing of the virtual museum focusing on examples of the decorative arts: the Fisher Gallery's collection of teapots. The collection comprises 150 teapots from all over the world, given to USC in memory of the late Patricia Daugherty Narramore by her husband Roth Narramore.
The Narramores, USC alumni, collected the pots on their many domestic and international journeys. Some items are by local artists, others by artists and makers from other countries, including China, Indonesia, Canada, Japan, Brazil, England, Portugal, Morocco, and Sweden. Materials used to make the pots range from porcelain and clay to wicker and metal. The teapots are ideal candidates for haptic exploration, not only for their varied shapes but also for their unusual textures and surface decoration.

Figure 1. Teapots from the Fisher Gallery's Narramore Collection

Haptics for the Museum

Haptics refers to the modality of touch and the associated sensory feedback. Haptics researchers are interested in developing, testing, and refining tactile and force feedback devices that allow users to manipulate and feel virtual objects with respect to such features as shape, temperature, weight, and surface texture (Basdogan, Ho, Slater, and Srinivasan, 1998; Bekey, 1996; Burdea, 1996; Brown & Colgate, 1994; Buttolo, Oboe, Hannaford, & McNeely, 1996; Dinsmore, Langrana, Burdea, and Ladeji, 1997; Giess, Evers, & Meinzer, 1998; Ikei, Wakamatsu, & Fukuda, 1997; Liu, Iberall, & Bekey, 1989; Howe, 1994; Howe and Cutkosky, 1993; Mar, Randolph, Finch, van Verth, & Taylor, 1996; Massie, 1996; Millman, 1995; Mor, 1998; Nakamura & Inoue, 1998; Rao, Medioni, Liu, & Bekey, 1988; Srinivasan & Basdogan, 1997; Yamamoto, Ishiguro, & Uchikawa, 1993).

Haptic acquisition and display devices

Researchers have been interested in the potential of force feedback devices such as pen- or stylus-based masters, like SensAble's PHANToM (Massie, 1996; Salisbury, Brock, Massie, Swarup, & Zilles, 1995; Salisbury & Massie, 1994), as alternative or supplemental input devices to the mouse, keyboard, or joystick. The PHANToM is a small, desk-grounded robot that permits simulation of single-fingertip contact with virtual objects through a thimble or stylus. It tracks the x, y, and z Cartesian coordinates and the pitch, roll, and yaw of the virtual probe as it moves about a three-dimensional workspace, and its actuators communicate forces back to the user's fingertips when it detects collisions with virtual objects, simulating the sense of touch.
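The collision-and-force cycle just described can be illustrated with a minimal sketch. This is not IMSC or SensAble code; the function name and the spherical test object are our own illustrative assumptions. It shows the common penalty-based scheme for a single-point probe: when the probe tip penetrates a virtual surface, a spring-like restoring force proportional to the penetration depth is sent back along the surface normal.

```python
import numpy as np

def contact_force(probe_pos, center, radius, stiffness=800.0):
    """Hypothetical penalty-based force for a point probe touching a sphere.

    Returns the 3D reaction force (in newtons) that a device such as the
    PHANToM would render at the fingertip; zero if there is no contact.
    """
    offset = np.asarray(probe_pos, float) - np.asarray(center, float)
    dist = np.linalg.norm(offset)
    penetration = radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)                    # probe is outside: no force
    normal = offset / dist                    # outward surface normal
    return stiffness * penetration * normal   # Hooke's law along the normal

# Probe tip 2 mm inside a 5 cm virtual sphere: the force pushes it back
# out along +x with magnitude stiffness * penetration (800 * 0.002 N).
f = contact_force([0.048, 0.0, 0.0], [0.0, 0.0, 0.0], 0.05)
print(f)
```

In a real device driver this computation runs inside a high-rate servo loop (typically around 1 kHz) so that the rendered surface feels stiff rather than spongy.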
The CyberGrasp from Virtual Technologies is an exoskeletal device that fits over a 22-DOF CyberGlove, providing force feedback and vibrotactile contact feedback; it is used in conjunction with a position tracker to measure the position and orientation of the forearm in three-dimensional space. Similar to the CyberGrasp is the Rutgers Master II (Burdea, 1996; Gomez, 1998; Langrana, Burdea, Ladeji, and Dinsmore, 1997), which has an actuator platform mounted on the palm that gives force feedback to four fingers. Position tracking is done by the Polhemus Fastrak. Alternative approaches to haptic sensing and discrimination have employed the vibrotactile display, which applies multiple small force vectors to the fingertip. For example, Ikei, Wakamatsu, and Fukuda (1997) used photographs of objects and a contact pin array to transmit tactile sensations of the surface of objects.
Each pin in the array vibrates commensurate with the local intensity (brightness) of the surface area. Image intensity is roughly correlated with the height of texture protrusions. A data glove originating at Sandia (Sandia, 1995) uses rod-like plungers to tap the fingertips lightly to simulate tactile sensations, and a magnetic tracker and strain gauges to follow the movements of the user's hand and fingers. Howe (1996) notes that vibrations are particularly helpful in certain kinds of sensing tasks, such as assessing surface roughness or detecting system events (for example, contact and slip in manipulation control). Researchers at the Fraunhofer Institute for Computer Graphics in Darmstadt have developed a glove-like haptic device they call the ThermoPad, a haptic temperature display based on Peltier elements and simple heat transfer models; they are able to simulate not only the environmental temperature but also the sensation of heat or cold one experiences when grasping or colliding with a virtual object. At the University of Tsukuba, Japan, Iwata, Yano, and Hashimoto (1997) are using the HapticMaster, a 6-DOF device with a ball grip that can be replaced by various real tools for surgical simulations and other specialized applications.
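The intensity-to-vibration mapping used by Ikei, Wakamatsu, and Fukuda can be sketched in a few lines. The function name, grid size, and synthetic test image below are our own assumptions, not details of their system: each pin is assigned the average brightness of the image patch beneath it, normalized so that brighter regions (taller texture protrusions) vibrate with larger amplitude.

```python
import numpy as np

def pin_amplitudes(image, grid=(5, 10), max_amp=1.0):
    """Map an 8-bit grayscale image onto a pin array (illustrative sketch).

    Each pin gets the mean brightness of its image cell, scaled to
    [0, max_amp], standing in for local texture height.
    """
    img = np.asarray(image, float)
    rows, cols = grid
    h, w = img.shape
    amps = np.zeros(grid)
    for r in range(rows):
        for c in range(cols):
            cell = img[r * h // rows:(r + 1) * h // rows,
                       c * w // cols:(c + 1) * w // cols]
            amps[r, c] = cell.mean()
    return max_amp * amps / 255.0   # normalize 8-bit intensity to amplitude

# Synthetic 50x100 "texture": a bright vertical ridge down the middle.
texture = np.zeros((50, 100))
texture[:, 40:60] = 255
amps = pin_amplitudes(texture)
print(amps[0])   # pins over the ridge get full amplitude; the rest stay still
```

The point of the sketch is the correlation the text describes: brightness is a cheap proxy for surface relief, so no 3D geometry is needed to drive the display.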
A novel type of haptic display is the Haptic Screen (Iwata, Yano, and Hashimoto, 1997), a device with a rubberized elastic surface with actuators, each with force sensors, underneath. The surface of the Haptic Screen can be deformed with the naked hand. An electromagnetic interface couples the ISU Force Reflecting Exoskeleton, developed at Iowa State University, to the operator's two fingers, eliminating the burdensome heaviness usually associated with exoskeletal devices. Finally, there is considerable interest in 2D haptic devices. For example, Pai and Reissell at the University of British Columbia have used the Pantograph 2D haptic interface, a two-DOF force-feedback planar device with a handle the user moves like a mouse, to feel the edges of shapes in images (Pai & Reissell, 1997).
At IMSC we are currently working with both the PHANToM and the CyberGrasp, using the Polhemus Fastrak for tracking the position of the CyberGrasp user's hand. The tracking problem has been widely studied in the context of mobile robots at USC (Roumeliotis, Sukhatme, and Bekey, 1999a, 1999b). In the museum application the visitor and the museum staff member will be able to manipulate haptic data jointly, regardless of display type. Thus one of our primary concerns is to ensure proper registration of the disparate devices with the 3D environment and with each other. Of potential use in this regard is work by Iwata, Yano, and Hashimoto (1997) on LHX (Library for Haptics), a modular software library that can support a variety of different haptic displays. LHX allows a variety of mechanical configurations, supports easy construction of haptic user interfaces, allows networked applications in virtual spaces, and includes a visual display interface. We are particularly eager to begin work with the CyberGrasp; to date we have been unable to identify any published work or conference papers reporting research using the device, which we attribute in part to its expense and relative infancy as a haptic display device.
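One standard way to attack the registration problem just mentioned, offered here only as an illustrative sketch rather than our implemented method, is to have both devices measure the same set of calibration points and then solve for the rigid transform (rotation R, translation t) that best maps one device's frame onto the other. The closed-form least-squares solution is the Kabsch (orthogonal Procrustes) method; all names below are our own.

```python
import numpy as np

def register(src, dst):
    """Least-squares rigid transform (R, t) with dst ~= R @ src + t.

    src, dst: corresponding Nx3 calibration points measured by two trackers.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(0), dst.mean(0)            # centroids of each point set
    H = (src - cs).T @ (dst - cd)                # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det R = +1, no reflection).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t

# Tracker B's frame is rotated 90 degrees about z and shifted relative to A.
pts_a = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
pts_b = pts_a @ Rz.T + np.array([0.5, 0.2, 0.0])
R, t = register(pts_a, pts_b)
print(np.allclose(R, Rz), np.allclose(t, [0.5, 0.2, 0.0]))  # True True
```

With noisy real measurements the same call returns the best-fit transform rather than an exact one, which is what makes it useful for aligning a Fastrak-tracked glove with a stylus device in a shared virtual scene.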
Figure 2. Haptic acquisition and display devices

Representative applications in haptic acquisition and display

A primary application area for haptics has been surgical simulation and medical training. Langrana, Burdea, Ladeji, and Dinsmore (1997) used the Rutgers Master II haptic device in a training simulation for palpation of subsurface liver tumors. They modeled tumors as comparatively harder spheres within larger and softer spheres. Realistic reaction forces were returned to the user as the virtual hand encountered the tumors, and the graphical display showed corresponding tissue deformation produced by the palpation. Finite element analysis was used to compute reaction forces corresponding to deformation from experimentally obtained force/deflection curves. Andrew Mor of the Robotics Institute at Carnegie Mellon (Mor, 1998) has used the PHANToM in conjunction with a 2-DOF planar device so that the combined device generates a moment measured about the tip of a surgical tool in an arthroscopic surgery simulation, providing more realistic training for the kinds of unintentional contacts with ligaments and fibrous membranes that an inexperienced resident might encounter. At MIT, De and Srinivasan (1998) have developed models and algorithms for reducing the computational load required to generate visual rendering of organ motion and deformation and to communicate back to the user the forces resulting from tool-tissue contact. They model soft tissue as thin-walled membranes filled with fluid.
Force-displacement response is comparable to that obtained in in vivo experiments. Giess, Evers, and Meinzer (1998) integrated haptic volume rendering with the PHANToM into the pre-surgical process of classifying liver parenchyma, vessel trees, and tumors. Surgeons at the Pennsylvania State University School of Medicine, in collaboration with Cambridge-based Boston Dynamics, used two PHANToMs in a training simulation in which residents passed simulated needles through blood vessels, allowing them to collect baseline data on the surgical skill of new trainees. Iwata, Yano, and Hashimoto (1998) report the development of a surgical simulator with a free-form tissue model that behaves like real tissue (e.g., it can be cut). Gruener (1998), in one of the few research reports that expresses reservations about the potential of haptics in medical applications, found that subjects in a telementoring session did not profit from the addition of force feedback to remote ultrasound diagnosis. There have been a few projects in which haptic displays are used as alternative input devices for painting, sculpting, and computer-assisted design.
At CERTEC, the Center of Rehabilitation Engineering in Lund, S …