Vision and hearing are generally regarded as two very different senses. Unless, of course, you can echolocate. Now, scientists have revealed for the first time that human echolocators — blind individuals who navigate their surroundings by producing mouth clicks and listening to the returning echoes — actually process these sounds in the regions of the brain dedicated to interpreting visual stimuli.
Here is a link to the study:
Neural Correlates of Natural Human Echolocation in Early and Late Blind Echolocation Experts
A small number of blind people are adept at echolocating silent objects simply by producing mouth clicks and listening to the returning echoes. Yet the neural architecture underlying this type of aid-free human echolocation has not been investigated. To tackle this question, we recruited echolocation experts, one early- and one late-blind, and measured functional brain activity in each of them while they listened to their own echolocation sounds.
When we compared brain activity for sounds that contained both clicks and the returning echoes with brain activity for control sounds that were acoustically matched but lacked the echoes, we found activity in calcarine cortex in both individuals. Importantly, for the same comparison, we did not observe a difference in activity in auditory cortex. In the early-blind participant, but not the late-blind one, we also found that the calcarine activity was greater for echoes reflected from surfaces located in contralateral space. Finally, in both individuals, we found activation in middle temporal and nearby cortical regions when they listened to echoes reflected from moving targets.
These findings suggest that processing of click-echoes recruits brain regions typically devoted to vision rather than audition in both early and late blind echolocation experts.