Both vision and touch play important roles in human perception of real surfaces. Judging material properties based on only one modality may not give reliable results. For example, many of us have had the experience that something we ordered online did not match our expectations when it arrived in the mail. On the other hand, we are often confused about the objects we touch when trying to navigate an unfamiliar place in the dark. These undesirable situations occur because we are forced to make perceptual decisions without information from all of our relevant senses.
Although many studies have characterized the information gathered by the individual senses, it is still unclear how each modality's information is processed and integrated. In this research, we aim to answer these questions for vision and touch. Do humans perceive surfaces similarly or differently through these two senses? And which visual and haptic properties correlate most strongly with human perception?
This project is currently focused on the 100 isotropic and homogeneous surfaces that make up the Penn Haptic Texture Toolkit. We are conducting psychophysical and physical experiments with a subset of these materials to understand the roles that visual and haptic cues each play in the perception of real surfaces.
Keywords: surface, visual perception, haptic perception, fingertip, human
People: Dr. Yasemin Vardar (Max Planck Institute for Intelligent Systems), Dr. Christian Wallraven (Korea University), Dr. Katherine J. Kuchenbecker (Max Planck Institute for Intelligent Systems)