
Haptography: Capturing and recreating the rich feel of real surfaces


Current Researchers: Heather Culbertson, Juan Jose Lopez Delgado

Previous Researchers: Joe Romano, Craig McDonald, Pablo Castillo, Raven Hooper, Nick Pesta

Overview

Haptography, like photography in the visual domain, enables an individual to quickly record the haptic feel of a real object and reproduce it later for others to interact with in a variety of contexts. Particular positive ramifications of establishing the approach of haptography are to let doctors and dentists create haptic records of medical afflictions such as a decayed tooth surface to assist in diagnosis and patient health tracking; to improve the realism and consequent training efficacy of haptic surgical simulators and other computer-based education tools; to allow a wide range of people, such as museum goers and online shoppers, to touch realistic virtual copies of valuable items; to facilitate a haptographic approach to low-bandwidth and time-delayed teleoperation, as found in space exploration; and to enable new insights on human and robot touch capabilities.

The primary hypothesis of this research is that the feel of tool-mediated contact with real and virtual objects is directly governed by the high-frequency accelerations that occur during the interaction, as opposed to the low-frequency impedance of the contact. Building on our knowledge of the human haptic sensory system, our approach uses measurement-based mathematical modeling to derive perceptually relevant haptic surface models and dynamically robust haptic display paradigms, which have been tested via both experimental validation and human-subject studies.

Current System

The data are recorded using the custom recording device shown below. This device includes sensors that measure the three axes of force, the position and orientation of the tool, and high-frequency acceleration while it is dragged across a textured surface.

The user holds the recording device and explores the object in a natural manner without constraints on the normal force and scanning speed. The three axes of acceleration are combined into a single axis using the DFT321 method described in [1]. This conversion is motivated by the fact that humans cannot discriminate the direction of high-frequency vibrations [2].
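As a rough illustration, the Python sketch below implements one spectral combination in the spirit of DFT321: the magnitude at each frequency preserves the total energy of the three axes, and the phase of the summed spectra is kept so that transients stay aligned in time. The function name dft321 and any details beyond what is stated in [1] are assumptions rather than the reference implementation.

import numpy as np

def dft321(ax, ay, az):
    """Combine three acceleration axes into one signal while preserving
    per-frequency spectral energy (a sketch of the DFT321 idea in [1])."""
    Ax, Ay, Az = np.fft.rfft(ax), np.fft.rfft(ay), np.fft.rfft(az)
    # Magnitude: total energy at each frequency across the three axes.
    mag = np.sqrt(np.abs(Ax)**2 + np.abs(Ay)**2 + np.abs(Az)**2)
    # Phase: taken from the summed spectra to keep transients aligned in time.
    phase = np.angle(Ax + Ay + Az)
    return np.fft.irfft(mag * np.exp(1j * phase), n=len(ax))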

Since the normal force and scanning speed are allowed to vary during data capture, the behavior of the recorded acceleration signal is not consistent across the entire recording, and its power and frequency content evolve over time. Therefore, we cannot represent the signal with a single model. Instead, we represent each recorded texture acceleration signal as a piecewise auto-regressive (AR) process by breaking the acceleration signal down into approximately stationary segments and creating an AR model for each segment. An example of the segmentation process is shown below. Force and speed data were not used in segmenting the acceleration signal, but are shown to illustrate the dependency of the resulting accelerations on the interaction conditions.
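The following Python sketch illustrates the piecewise-AR idea. For brevity it splits the recording into fixed-length windows rather than the adaptive segmentation into approximately stationary pieces described above, and it estimates each segment's AR coefficients with the standard Yule-Walker equations; the function names, window length, and model order are illustrative assumptions.

import numpy as np
from scipy.linalg import solve_toeplitz

def fit_ar(segment, order=10):
    """Fit AR coefficients and driving-noise variance to one acceleration
    segment via the Yule-Walker equations (one of several standard estimators)."""
    x = segment - np.mean(segment)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
    coeffs = solve_toeplitz(r[:order], r[1:order + 1])   # AR coefficients
    variance = r[0] - np.dot(coeffs, r[1:order + 1])     # residual (noise) variance
    return coeffs, variance

def segment_and_model(accel, force, speed, seg_len=1000, order=10):
    """Split the recording into short windows (a stand-in for the adaptive,
    approximately-stationary segmentation described above) and label each
    AR model with the median force and speed over that window."""
    models = []
    for start in range(0, len(accel) - seg_len + 1, seg_len):
        sl = slice(start, start + seg_len)
        coeffs, variance = fit_ar(accel[sl], order)
        models.append((np.median(force[sl]), np.median(speed[sl]), coeffs, variance))
    return models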

Each segment's AR model is labeled with the median force and median speed used during that portion of the recording. These force and speed labels are used to store the models in a Delaunay triangulation.
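Assuming the list of labeled models from the previous sketch, building the triangulation over the recorded (force, speed) labels is straightforward with SciPy; each vertex of the triangulation then corresponds to one stored segment model.

import numpy as np
from scipy.spatial import Delaunay

# models: list of (median_force, median_speed, ar_coefficients, noise_variance)
points = np.array([[f, s] for f, s, _, _ in models])
tri = Delaunay(points)  # triangulation over the recorded force-speed space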

During rendering, we derive a new texture model at each time step using measurements of the user's force and speed. We then determine which triangle contains the user's current force-speed point and interpolate between the three models at the vertices of that triangle using barycentric coordinates. We push white Gaussian noise through the interpolated model and use the output to drive a voice-coil actuator attached to the stylus, which in turn transmits the vibrations to the user's hand.
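The sketch below, continuing the assumptions above, outlines one possible rendering step: locate the containing triangle, compute barycentric weights from the simplex's affine transform, blend the three vertex models, and filter white Gaussian noise through the resulting AR filter. Directly averaging AR coefficients is a simplification for illustration only; a robust implementation would interpolate in a representation that preserves filter stability, such as line spectral frequencies.

import numpy as np
from scipy.signal import lfilter

def interpolate_model(tri, models, force, speed):
    """Blend the three AR models at the vertices of the triangle containing
    the current (force, speed) point using barycentric weights."""
    p = np.array([force, speed])
    simplex = int(tri.find_simplex(p))
    if simplex == -1:
        raise ValueError("query point lies outside the recorded force-speed range")
    # Barycentric coordinates from the simplex's stored affine transform.
    T = tri.transform[simplex]
    b = T[:2].dot(p - T[2])
    weights = np.append(b, 1.0 - b.sum())
    vertices = tri.simplices[simplex]
    coeffs = sum(w * models[v][2] for w, v in zip(weights, vertices))
    variance = sum(w * models[v][3] for w, v in zip(weights, vertices))
    return coeffs, variance

def render_block(tri, models, force, speed, n_samples=100):
    """Drive white Gaussian noise through the interpolated AR filter; the
    output vibration waveform would be sent to the voice-coil actuator."""
    coeffs, variance = interpolate_model(tri, models, force, speed)
    noise = np.random.randn(n_samples) * np.sqrt(variance)
    # AR synthesis filter: y[n] = noise[n] + sum_k coeffs[k] * y[n-k]
    return lfilter([1.0], np.concatenate(([1.0], -coeffs)), noise)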



References

[1] N. Landin, J. M. Romano, W. McMahan, and K. J. Kuchenbecker. Dimensional reduction of high-frequency accelerations for haptic rendering. In Proceedings of the 2010 International Conference on Haptics, pages 79-86, 2010.

[2] J. Bell, S. Bolanowski, and M. Holmes. The structure and function of pacinian corpuscles: A review. Progress in Neurobiology, 42(1):79-128, 1994.


Publications

H. Culbertson, J. J. Lopez Delgado, and K. J. Kuchenbecker. One hundred data-driven haptic texture models and open-source methods for rendering on 3D objects. In Proc. IEEE Haptics Symposium, February 2014.

H. Culbertson, J. Unwin, B. E. Goodman, and K. J. Kuchenbecker. Generating haptic texture models from unconstrained tool-surface interactions. In Proc. IEEE World Haptics Conference, pages 295-300, April 2013.

H. Culbertson, J. M. Romano, P. Castillo, M. Mintz, and K. J. Kuchenbecker. Refined methods for creating realistic haptic virtual textures from tool-mediated contact acceleration data. In Proc. IEEE Haptics Symposium, pages 385-391, March 2012.

K. J. Kuchenbecker, J. Romano, and W. McMahan. Haptography: Capturing and Recreating the Rich Feel of Real Surfaces (Invited Paper). In Proceedings of the International Symposium on Robotics Research, August 2009.

W. McMahan and K. J. Kuchenbecker. Haptic Display of Realistic Tool Contact Via Dynamically Compensated Control of a Dedicated Actuator. In Proceedings of the IEEE Intelligent Robots and Systems Conference, pages 3171-3177, October 2009.

W. McMahan and K. J. Kuchenbecker. Displaying Realistic Contact Accelerations Via a Dedicated Vibration Actuator. Hands-on demonstration presented at IEEE World Haptics Conference, March 2009.

J. M. Romano and K. J. Kuchenbecker. Creating Realistic Virtual Textures from Contact Acceleration Data. In IEEE Transactions on Haptics 5(2):109-119, 2012.


This material is based upon work supported by the National Science Foundation under Grant No. 0845670. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.