Haptography: Capturing and recreating the rich feel of real surfaces
Current Researchers: Heather Culbertson, Juanjo Lopez Delgado
Previous Researchers: Joe Romano, Pablo Castillo, Raven Hooper, Craig McDonald, Nick Pesta
Haptography, like photography in the visual domain, enables an individual to quickly record the haptic feel of a real object and reproduce it later for others to interact with in a variety of contexts. Establishing haptography would let doctors and dentists create haptic records of medical afflictions, such as a decayed tooth surface, to assist in diagnosis and patient health tracking; improve the realism, and thus the training efficacy, of haptic surgical simulators and other computer-based educational tools; allow a wide range of people, from museum visitors to online shoppers, to touch realistic virtual copies of valuable items; support a haptographic approach to low-bandwidth and time-delayed teleoperation, as found in space exploration; and yield new insights into human and robot touch capabilities.
The primary hypothesis of this research is that the feel of tool-mediated contact with real and virtual objects is directly governed by the high-frequency accelerations that occur during the interaction, as opposed to the low-frequency impedance of the contact. Building on our knowledge of the human haptic sensory system, our approach uses measurement-based mathematical modeling to derive perceptually relevant haptic surface models and dynamically robust haptic display paradigms, which we have validated through physical experiments and human-subject studies.
Data are recorded using a Wacom tablet and a Wacom stylus equipped with a three-axis accelerometer. The tablet measures the stylus's scanning speed and normal force during data collection. The textured surface is placed on top of the tablet, and the stylus is dragged across the surface to record force, speed, and three axes of acceleration. The three acceleration axes are combined into a single signal using the DFT321 method described by Landin et al.; this conversion is motivated by the fact that humans cannot discriminate the direction of high-frequency vibrations (Bell et al.).
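To sketch this combination step: DFT321 merges the three axes in the frequency domain, preserving the total spectral energy in each frequency bin while borrowing the phase of the axis sum so the result stays a real, roughly time-aligned signal. The function below is our illustrative Python implementation of that idea, not the authors' code.

```python
import numpy as np

def dft321(ax, ay, az):
    """Collapse three acceleration axes into one perceptually
    equivalent signal (illustrative sketch of DFT321)."""
    Ax, Ay, Az = np.fft.rfft(ax), np.fft.rfft(ay), np.fft.rfft(az)
    # Per-bin magnitude preserves the total energy of the three axes.
    mag = np.sqrt(np.abs(Ax) ** 2 + np.abs(Ay) ** 2 + np.abs(Az) ** 2)
    # Phase is taken from the axis sum so the output remains real
    # and roughly time-aligned with the original measurements.
    phase = np.angle(Ax + Ay + Az)
    return np.fft.irfft(mag * np.exp(1j * phase), n=len(ax))
```

Because humans are insensitive to the direction of such high-frequency vibration, this single signal can stand in for the full three-axis measurement during modeling and playback.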
We represent each recorded texture vibration with an auto-regressive (AR) or auto-regressive moving-average (ARMA) model. We optimize this set of models using metrics that depend on spectral match, prediction error, and model order.
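As an illustration of the AR case, the model coefficients and driving-noise variance can be estimated from a recorded vibration by linear least squares over lagged samples. This is a minimal sketch under our own naming (`fit_ar` is a hypothetical helper); the estimator and model-selection metrics actually used in this work are more involved.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of an AR(p) model
    x[n] ~ a[0]*x[n-1] + ... + a[p-1]*x[n-p] + e[n]."""
    # Each row of X holds the p previous samples that predict x[n].
    X = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    # The residual variance sets the level of the noise that will
    # excite the model during playback.
    noise_var = np.var(y - X @ a)
    return a, noise_var
```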
During rendering, the user's force and speed are measured. We then derive a new texture model at each time step using bilinear interpolation of the models adjacent to the user's current force and speed. We push white Gaussian noise through the models and use the output to drive a voice-coil actuator attached to the stylus, which in turn transmits the vibrations to the user's hand.
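The rendering loop above can be sketched as follows: bilinearly weight the four stored models that bracket the user's current force and speed, then filter white Gaussian noise through the resulting AR filter one sample at a time. This is a simplification under assumed names (`interp_model`, `render_step`); interpolating raw AR coefficients as shown here is only for illustration, since naive coefficient blending does not guarantee a stable filter.

```python
import numpy as np

def interp_model(corners, wf, wv):
    """Bilinear blend of four AR coefficient vectors; corners[i][j]
    holds the model at (low/high force i, low/high speed j), and
    wf, wv in [0, 1] locate the user's force and speed between them."""
    return ((1 - wf) * (1 - wv) * corners[0][0] + (1 - wf) * wv * corners[0][1]
            + wf * (1 - wv) * corners[1][0] + wf * wv * corners[1][1])

def render_step(coeffs, history, sigma, rng):
    """One output sample: white Gaussian noise through the AR filter.
    history[0] is the most recent past output."""
    sample = coeffs @ history + sigma * rng.standard_normal()
    history[:] = np.roll(history, 1)  # age the delay line
    history[0] = sample               # newest sample in front
    return sample
```

At each time step the freshly interpolated coefficients replace `coeffs`, so the synthesized texture tracks the user's changing force and speed; the stream of samples is what drives the voice-coil actuator.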
N. Landin, J. M. Romano, W. McMahan, and K. J. Kuchenbecker. Dimensional reduction of high-frequency accelerations for haptic rendering. In Proceedings of the 2010 International Conference on Haptics, pages 79–86, 2010.
J. Bell, S. Bolanowski, and M. Holmes. The structure and function of Pacinian corpuscles: A review. Progress in Neurobiology, 42(1):79–128, 1994.
H. Culbertson, J. M. Romano, P. Castillo, M. Mintz, and K. J. Kuchenbecker. Refined methods for creating realistic haptic virtual textures from tool-mediated contact acceleration data. In Proceedings of the IEEE Haptics Symposium, pages 385–391, March 2012.
K. J. Kuchenbecker, J. M. Romano, and W. McMahan. Haptography: Capturing and recreating the rich feel of real surfaces (invited paper). In Proceedings of the International Symposium on Robotics Research, August 2009.
W. McMahan and K. J. Kuchenbecker. Haptic display of realistic tool contact via dynamically compensated control of a dedicated actuator. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 3171–3177, October 2009.
W. McMahan and K. J. Kuchenbecker. Displaying realistic contact accelerations via a dedicated vibration actuator. Hands-on demonstration presented at the IEEE World Haptics Conference, March 2009.
J. M. Romano and K. J. Kuchenbecker. Creating realistic virtual textures from contact acceleration data. IEEE Transactions on Haptics, 5(2):109–119, 2012.
This material is based upon work supported by the National Science Foundation under Grant No. 0845670. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.