Discrimination of object information by bat echolocation deciphered from acoustic simulations.
Yu Teshima, Mayuko Mogi, Hare Nishida, Takao Tsuchiya, Kohta I. Kobayasi, Shizuko Hiryu. Published in: Royal Society Open Science (2024)
High-precision visual sensing has been achieved by combining cameras with deep learning. However, an unresolved challenge is capturing information that remains inaccessible to optical sensors, such as occlusion spots hidden behind objects. Compared with light, sound waves have longer wavelengths and can therefore collect information on occlusion spots. In this study, we investigated whether bats can perform advanced acoustic sensing, using echolocation to acquire a target's occlusion information. We conducted a two-alternative forced-choice test on Pipistrellus abramus with five different targets, including targets that appear highly similar from the front but differ in backend geometry, i.e. occlusion spots or textures. Subsequently, the echo impulse responses produced by these targets, which were difficult to obtain through real measurements, were computed using three-dimensional acoustic simulations, enabling a detailed analysis of the acoustic cues that the bats obtained through echolocation. Our findings demonstrated that bats can effectively discern differences in target occlusion-spot structure and texture through echolocation. Furthermore, discrimination performance was related to differences in the logarithmic spectral distortion of the occlusion-related components in the simulated echo impulse responses. This suggests that the bats obtained occlusion information through echolocation, highlighting the advantage of broadband ultrasound for sensing.
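The abstract's discrimination metric, logarithmic spectral distortion (LSD), is a standard measure of spectral dissimilarity between two signals. As a minimal sketch of how such a comparison between two echo impulse responses could be computed (assuming the common RMS-of-log-magnitude-difference definition; the function and parameter names here are illustrative, not taken from the paper):

```python
import numpy as np

def log_spectral_distortion(x, y, n_fft=1024, eps=1e-12):
    """Log-spectral distortion (dB) between two impulse responses.

    Computed as the RMS, over frequency bins, of the difference of the
    two log-magnitude spectra. `eps` guards against log of zero.
    (Illustrative sketch; not the authors' exact analysis pipeline.)
    """
    X = np.abs(np.fft.rfft(x, n_fft)) + eps
    Y = np.abs(np.fft.rfft(y, n_fft)) + eps
    diff_db = 20.0 * np.log10(X / Y)     # per-bin difference in dB
    return float(np.sqrt(np.mean(diff_db ** 2)))
```

Identical impulse responses yield an LSD of 0 dB; a uniformly scaled copy (y = 2x) yields a constant offset of 20·log10(2) ≈ 6.02 dB, so larger LSD values indicate more distinguishable echo spectra.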