
Human echolocation

Human echolocation is the ability of humans to detect objects in their environment by sensing echoes from those objects, by actively creating sounds: for example, by tapping their canes, lightly stomping their feet, snapping their fingers, or making clicking noises with their mouths. People trained to orient by echolocation can interpret the sound waves reflected by nearby objects, accurately identifying their location and size.

The term 'echolocation' was coined by zoologist Donald Griffin in 1944; however, reports of blind humans locating silent objects date back to 1749. Human echolocation has been known and formally studied since at least the 1950s. In earlier times it was sometimes described as 'facial vision' or 'obstacle sense', as it was believed that the proximity of nearby objects caused pressure changes on the skin. Only in the 1940s did a series of experiments performed in the Cornell Psychological Laboratory show that sound and hearing, rather than pressure changes on the skin, were the mechanisms driving this ability. The field of human and animal echolocation was surveyed in book form as early as 1959; see also White et al. (1970).

Many blind individuals passively use natural environmental echoes to sense details about their environment; others actively produce mouth clicks and gauge information about their environment from the echoes of those clicks. Both passive and active echolocation help blind individuals learn about their environments. Because sighted individuals learn about their environments using vision, they often do not readily perceive echoes from nearby objects; this is due to an echo-suppression phenomenon brought on by the precedence effect. With training, however, sighted individuals with normal hearing can learn to avoid obstacles using only sound, showing that echolocation is a general human ability.

Vision and hearing are closely related in that both process reflected waves of energy. Vision processes light waves as they travel from their source, bounce off surfaces throughout the environment, and enter the eyes; similarly, the auditory system processes sound waves as they travel from their source, bounce off surfaces, and enter the ears. Both systems can extract a great deal of information about the environment by interpreting the complex patterns of reflected energy they receive. In the case of sound, these waves of reflected energy are called 'echoes'.

Echoes and other sounds can convey spatial information that is comparable in many respects to that conveyed by light. With echoes, a blind traveler can perceive very complex, detailed, and specific information from distances far beyond the reach of the longest cane or arm. Echoes make information available about the nature and arrangement of objects and environmental features such as overhangs, walls, doorways and recesses, poles, ascending curbs and steps, planter boxes, pedestrians, fire hydrants, parked or moving vehicles, trees and other foliage, and much more.
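The distance cue in an echo is simply its round-trip travel time: sound moves through air at roughly 343 m/s, so an echo arriving t seconds after a click corresponds to an object about 343·t/2 metres away. The following is a minimal Python sketch of that relationship, not anything described in the article: the function name, the synthetic click, and the cross-correlation approach to finding the echo's delay are all illustrative assumptions.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in dry air at about 20 °C

    def estimate_distance(click, recording, sample_rate, min_delay_s=0.005):
        """Estimate one-way distance to a reflector (illustrative sketch).

        Cross-correlates the known click against the recording, skips the
        region dominated by the direct (non-reflected) sound, and converts
        the echo's round-trip delay into a one-way distance.
        """
        corr = np.correlate(recording, click, mode="valid")
        skip = int(min_delay_s * sample_rate)      # ignore the direct click
        delay = skip + int(np.argmax(np.abs(corr[skip:])))
        return SPEED_OF_SOUND * (delay / sample_rate) / 2.0

    # Demo with a synthetic scene: a broadband click plus a faint echo
    # from an object 3 m away.
    fs = 44_100
    rng = np.random.default_rng(0)
    click = rng.standard_normal(256)
    echo_lag = int(2 * 3.0 / SPEED_OF_SOUND * fs)  # round-trip lag for 3 m
    recording = np.zeros(echo_lag + len(click))
    recording[:len(click)] += click                # direct sound
    recording[echo_lag:] += 0.2 * click            # faint echo
    print(round(estimate_distance(click, recording, fs), 2))  # ~3.0

The demo deliberately makes the echo much quieter than the direct click; skipping an initial window of the correlation is one simple way to keep the direct sound from masking the echo peak.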
Echoes can give detailed information about location (where objects are), dimension (how big they are and their general shape), and density (how solid they are). Location is generally broken down into distance from the observer and direction (left/right, front/back, high/low). Dimension refers to the object's height (tall or short) and breadth (wide or narrow). By understanding the interrelationships of these qualities, much can be perceived about the nature of an object or of multiple objects. For example, an object that is tall and narrow may be recognized quickly as a pole. An object that is tall and narrow near the bottom while broad near the top would be a tree. Something that is tall and very broad registers as a wall or building. Something that is broad and tall in the middle, while being shorter at either end, may be identified as a parked car. An object that is low and broad may be a planter, retaining wall, or curb. And something that starts out close and very low but recedes into the distance as it gets higher is a set of steps. Density refers to the solidity of the object (solid/sparse, hard/soft), and awareness of density adds richness and complexity to the available information. For instance, an object that is low and solid may be recognized as a table, while something low and sparse sounds like a bush; an object that is tall, broad, and very sparse is probably a fence. A toy sketch of these rule-like mappings appears at the end of this section.

Some blind people are skilled at echolocating silent objects simply by producing mouth clicks and listening to the returning echoes; Ben Underwood is one well-known example. Although few studies have been performed on the neural basis of human echolocation, those studies report activation of the primary visual cortex during echolocation in blind expert echolocators. The mechanism driving this remapping of brain regions is known as neuroplasticity.

In a 2014 study by Thaler and colleagues, the researchers first made recordings of the clicks and their very faint echoes using tiny microphones placed in the ears of blind echolocators as they stood outside and tried to identify objects such as a car, a flag pole, and a tree. The researchers then played the recorded sounds back to the echolocators while their brain activity was measured using functional magnetic resonance imaging. Remarkably, when the echolocation recordings were played back to the blind experts, not only did they perceive the objects based on the echoes, but they also showed activity in the areas of the brain that normally process visual information in sighted people, primarily the primary visual cortex (V1). This result is striking because those visual areas are, as their names suggest, normally active only during visual tasks. The brain areas that process auditory information were no more activated by sound recordings of outdoor scenes containing echoes than by recordings of the same scenes with the echoes removed. Importantly, when the same experiment was carried out with sighted people who did not echolocate, these individuals could not perceive the objects, and there was no echo-related activity anywhere in the brain. This suggests that the cortex of blind echolocators is plastic and reorganizes such that the primary visual cortex, rather than any auditory area, becomes involved in the computation of echolocation tasks.
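The height/breadth/density cues described earlier behave like a small rule set. The sketch below simply encodes the article's own worked examples; the category labels, the rule ordering, and the function name are illustrative assumptions, not part of any described system.

    def classify(height, breadth, density):
        """Toy rule set mirroring the article's examples.

        height:  'tall' | 'low'
        breadth: 'narrow' | 'broad'
        density: 'solid' | 'sparse'
        Labels and rule order are illustrative assumptions; real echo
        perception combines these cues far more flexibly.
        """
        if height == "tall" and breadth == "narrow":
            return "pole"
        if height == "tall" and breadth == "broad" and density == "sparse":
            return "fence"
        if height == "tall" and breadth == "broad":
            return "wall or building"
        if height == "low" and density == "sparse":
            return "bush"
        if height == "low" and breadth == "broad":
            return "planter, retaining wall, or curb"
        if height == "low" and density == "solid":
            return "table"
        return "unknown"

    print(classify("tall", "narrow", "solid"))  # pole
    print(classify("tall", "broad", "sparse"))  # fence
    print(classify("low", "broad", "sparse"))   # bush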

[ "Ecology", "Acoustics", "Zoology", "Neuroscience", "Echolocation jamming", "Promops", "Myotis adversus", "Megaderma lyra", "Pteronotus" ]
Parent Topic
Child Topic
    No Parent Topic