Tool-sensed object information effectively supports vision for multisensory grasping.
Ivan Camponogara, Alessandro Farnè, Robert Volcic. Published in: Journal of Experimental Psychology: General (2024)
Tools enable humans to extend their sensing abilities beyond the natural limits of their hands, allowing them to sense objects as if they were touching them directly. The similarities between direct hand interactions with objects (hand-based sensing) and sensory processing extended beyond the hand (tool-mediated sensing) suggest that comparable processes integrate tool- and hand-sensed information with vision, raising the question of whether tools support vision in bimanual object manipulation. Here, we investigated participants' performance while grasping objects held either with a tool or with the contralateral hand and compared these conditions with visually guided grasping (Experiment 1). By measuring reaction time, peak velocity, and peak grip aperture, we found that actions were initiated earlier and performed with a smaller peak grip aperture when the object was seen and held with the tool or the contralateral hand than when it was only seen. Thus, tool-mediated sensing effectively supports vision in multisensory grasping and, even more intriguingly, resembles hand-based sensing. We ruled out the possibility that these results were due to the force exerted on the tool's handle (Experiment 2). Additionally, as with hand-based sensing, we found evidence that the tool supports vision mainly by providing positional information about the object (Experiment 3). Thus, integrating the tool-sensed position of the object with vision is sufficient to produce a multisensory advantage in grasping. Our findings indicate that multisensory integration mechanisms significantly improve grasping actions and fine-tune contralateral hand movements even when object information is sensed only indirectly through a tool.