In this paper we present an experiment investigating subjects’ ability to match pairs of synthetic auditory and haptic stimuli that simulate the sensation of walking on different surfaces. In three non-interactive conditions the audio–haptic stimuli were presented passively through a desktop system, while in three interactive conditions participants generated the audio–haptic feedback themselves while walking. Results show that material typology (i.e., solid or aggregate) is processed highly consistently in both the auditory and haptic modalities: subjects reported a higher level of semantic congruence for audio–haptic pairs of materials belonging to the same typology. Furthermore, matching ability was better in the passive conditions than in the interactive ones, although this may be due to limitations of the technology used for the interactive haptic simulations.
Keywords: Semantic congruence