Human echolocation is an ability of humans to detect objects in their environment by sensing echoes from those objects. By actively creating sounds – for example, by tapping their canes, lightly stomping their foot or making clicking noises with their mouths – people trained to orientate with echolocation can interpret the sound waves reflected by nearby objects, accurately identifying their location and size. This ability is used by some blind people for acoustic wayfinding, or navigating within their environment using auditory rather than visual cues. It is similar in principle to active sonar and to the animal echolocation employed by some animals, including bats, dolphins and toothed whales.
Human echolocation has been known and formally studied since at least the 1950s. In earlier times, human echolocation was sometimes described as "facial vision". The field of human and animal echolocation was surveyed in book form as early as 1959 (Griffin, 1959); see also White et al. (1970).
Vision and hearing are closely related in that both process reflected waves of energy. Vision processes light waves as they travel from their source, bounce off surfaces throughout the environment and enter the eyes. Similarly, the auditory system processes sound waves as they travel from their source, bounce off surfaces and enter the ears. Both systems can extract a great deal of information about the environment by interpreting the complex patterns of reflected energy that they receive. In the case of sound, these waves of reflected energy are called "echoes".
Echoes and other sounds can convey spatial information that is comparable in many respects to that conveyed by light. With echoes, a blind traveler can perceive very complex, detailed, and specific information from distances far beyond the reach of the longest cane or arm. Echoes make information available about the nature and arrangement of objects and environmental features such as overhangs, walls, doorways and recesses, poles, ascending curbs and steps, planter boxes, pedestrians, fire hydrants, parked or moving vehicles, trees and other foliage, and much more. Echoes can give detailed information about location (where objects are), dimension (how big they are and their general shape), and density (how solid they are). Location is generally broken down into distance from the observer and direction (left/right, front/back, high/low). Dimension refers to the object's height (tall or short) and breadth (wide or narrow).
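The distance component described above follows directly from the physics of sound: an echo's delay corresponds to a round trip to the reflecting object and back. The following is a minimal illustrative sketch (not part of the article's sources), assuming a speed of sound of about 343 m/s in air at room temperature:

```python
# Toy illustration: distance is inferred from the delay between an emitted
# click and its returning echo. The sound travels to the object and back,
# so the one-way distance is half the round trip.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed value)

def echo_distance(delay_seconds: float) -> float:
    """Distance (m) to a reflecting surface, given the click-to-echo delay."""
    return SPEED_OF_SOUND * delay_seconds / 2.0

# A 10 ms delay corresponds to a surface about 1.7 m away.
print(round(echo_distance(0.010), 3))
```

This also shows why nearby obstacles are the hardest case: at arm's length the echo returns within a few milliseconds, leaving very little separation between the click and its reflection.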
By understanding the interrelationships of these qualities, much can be perceived about the nature of an object or multiple objects. For example, an object that is tall and narrow may be recognized quickly as a pole. An object that is tall and narrow near the bottom while broad near the top would be a tree. Something that is tall and very broad registers as a wall or building. Something that is broad and tall in the middle, while being shorter at either end may be identified as a parked car. An object that is low and broad may be a planter, retaining wall, or curb. And finally, something that starts out close and very low but recedes into the distance as it gets higher is a set of steps. Density refers to the solidity of the object (solid/sparse, hard/soft). Awareness of density adds richness and complexity to one's available information. For instance, an object that is low and solid may be recognized as a table, while something low and sparse sounds like a bush; but an object that is tall and broad and very sparse is probably a fence.
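The heuristics in the paragraph above amount to a rough decision table over location, dimension, and density. As a toy sketch only (the category names and rules are taken from the examples in the text, not from any formal model):

```python
def guess_object(height: str, breadth: str, density: str) -> str:
    """Toy rule table mirroring the article's examples.

    height:  'tall' | 'low'
    breadth: 'narrow' | 'broad'
    density: 'solid' | 'sparse'
    """
    if height == "tall" and breadth == "narrow":
        return "pole (or tree, if broad near the top)"
    if height == "tall" and breadth == "broad":
        # sparseness distinguishes a fence from a wall or building
        return "fence" if density == "sparse" else "wall or building"
    if height == "low" and breadth == "broad":
        return "bush" if density == "sparse" else "planter, retaining wall, or curb"
    if height == "low" and density == "solid":
        return "table"
    return "unknown"

print(guess_object("tall", "broad", "sparse"))  # fence
```

A real echolocator of course integrates these cues continuously and probabilistically; the point of the sketch is only that the qualities combine, rather than being read off one at a time.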
Neural substrates of echolocation in the blind
Some blind people are skilled at echolocating silent objects simply by producing mouth clicks and listening to the returning echoes. It has recently been shown that blind echolocation experts use what is normally the 'visual' part of their brain to process the echoes. The mechanism driving this remapping of brain regions is neuroplasticity.

The researchers first made recordings of the clicks and their very faint echoes using tiny microphones placed in the ears of blind echolocators as they stood outside and tried to identify different objects such as a car, a flag pole, and a tree. The researchers then played the recorded sounds back to the echolocators while measuring their brain activity with functional magnetic resonance imaging. When the echolocation recordings were played back, the blind experts not only perceived the objects based on the echoes, but also showed activity in those areas of the brain that normally process visual information in sighted people, primarily the primary visual cortex (V1). Notably, the brain areas that process auditory information were no more activated by sound recordings of outdoor scenes containing echoes than by the same recordings with the echoes removed. When the same experiment was carried out with sighted people who did not echolocate, these individuals could not perceive the objects, and there was no echo-related activity anywhere in the brain.
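The delay between a recorded click and its much fainter echo, as captured by such in-ear microphones, can be estimated by cross-correlation. The following is a hypothetical illustration with synthetic data, not the study's actual analysis code; the sample rate, click waveform, and echo amplitude are all assumed values:

```python
import numpy as np

fs = 44_100  # sample rate in Hz (assumed)
rng = np.random.default_rng(0)
click = rng.standard_normal(200)  # stand-in for a recorded mouth click

# Build a synthetic recording: silence plus a faint echo of the click
# arriving 441 samples (10 ms at 44.1 kHz) after the start.
true_delay = 441
recording = np.zeros(4000)
recording[true_delay:true_delay + click.size] += 0.1 * click

# Cross-correlate the recording against the click; the peak marks the
# offset where the echo best matches the outgoing sound.
corr = np.correlate(recording, click, mode="valid")
estimated_delay = int(np.argmax(corr))
print(estimated_delay, estimated_delay / fs)  # sample offset and delay in seconds
```

With the distance relation described earlier, a 10 ms delay like this one would place the reflecting object roughly 1.7 m away.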
Notable individuals who employ echolocation
Echolocation has been further developed by Daniel Kish, who works with the blind through the non-profit organization World Access for the Blind, leading blind teenagers hiking and mountain-biking through the wilderness and teaching them to navigate new locations safely with a technique he calls "FlashSonar". Kish had his eyes removed at the age of 13 months due to retinal cancer. He learned to make palatal clicks with his tongue while still a child, and now trains other blind people in the use of echolocation and in what he calls "Perceptual Mobility". Though at first resistant to using a cane for mobility, seeing it as a "handicapped" device and considering himself "not handicapped at all", Kish developed a technique combining his white cane with echolocation to further expand his mobility.
Kish reports that "The sense of imagery is very rich for an experienced user. One can get a sense of beauty or starkness or whatever – from sound as well as echo". He can distinguish a metal fence from a wooden one by the echoes reflected from the arrangement of the fence structures; in extremely quiet conditions, he can also hear the warmer and duller quality of the echoes from wood compared to metal.
Diagnosed with retinal cancer at the age of two, American Ben Underwood had his eyes removed at the age of three.
He discovered echolocation at the age of five, detecting the location of objects by making frequent clicking noises with his tongue. His case was featured on 20/20: Medical Mysteries. He used echolocation to accomplish such feats as running, playing basketball, riding a bicycle, rollerblading, playing football, and skateboarding. Underwood's childhood eye doctor claimed that he was one of the most proficient human echolocators.
Underwood died on January 19, 2009, at the age of 16, from the same cancer that took his vision.
The Polish film director Andrzej Jakimowski met Ben Underwood and was inspired by him. In 2012 he released his film "Imagine" (http://www.youtube.com/watch?v=PspROQgS9MI), about Ian, a spatial orientation instructor who arrives at a world-renowned Lisbon clinic for the visually impaired to work with blind patients. The doctor in charge of the clinic hires Ian on the condition that the patients will not be exposed to danger as they learn to move around by themselves using the echolocation methods he teaches.
Tom De Witte
Tom De Witte was born in 1979 in Belgium with bilateral congenital glaucoma. It had seemed that De Witte would become a successful flautist until he had to give up playing music in 2005. He has been completely blind since 2009 due to additional problems with his eyes. He was taught echolocation by Daniel Kish and was given the nickname "Batman from Belgium" by the press.
Dr. Lawrence Scadden
Dr. Scadden has written of his experiences with blindness. He was not born blind, but lost his sight due to illness. As a child, he learned to use echolocation well enough to ride a bicycle in traffic (his parents thought that he still had some sight remaining). He later participated in experiments in facial vision (White et al. 1970). Around 1998, he visited the Auditory Neuroethology Laboratory at the University of Maryland and was interviewed about his experience with facial vision. The researchers in the lab study bat echolocation and were aware of the Wiederorientierung ("reorientation") phenomenon described by Griffin (1959), in which bats, despite continuing to emit echolocation calls, use dead reckoning in familiar acoustic space. Dr. Scadden indicated that he found echolocation effortful and would not use it to navigate familiar areas unless he were alert for obstacles, thus providing insight into the bat behavior.
The Regional Alliance of Science, Engineering and Mathematics for Students with Disabilities (RASEM) and the Science Education for Students With Disabilities (SESD), a Special Interest Group of the National Science Teachers Association (NSTA) have created the Lawrence A. Scadden Outstanding Teacher Award of the Year for Students With Disabilities in his honor.
Lucas Murray, from Poole, Dorset, was born blind. He is believed to be one of the first British people to learn to visualise his surroundings using echolocation, and was taught by Daniel Kish.
The scientist Kevin Warwick experimented with feeding ultrasonic pulses into the brain (via electrical stimulation from a neural implant) as an additional sensory input. In tests he was able to accurately discern distance to objects and to detect small movements of those objects.
Juan Ruiz appeared in the first episode of Stan Lee's Superhumans, titled "Electro Man". He lives in Los Angeles, California, and has been blind since birth. In the episode, he was shown riding a bicycle, avoiding parked cars and other obstacles, and identifying nearby objects. He was also able to enter and exit a cave, where he determined its length and other features.
See also

- Animal echolocation
- Acoustic location
- Sensory substitution
- Thaandavam, a Tamil film involving human echolocation
References

- Richard L. Welsh, Bruce B. Blasch, Foundations of Orientation and Mobility, American Foundation for the Blind, 1997; which cites S. O. Myers and C. G. E. G. Jones, "Obstacle experiments: second report", Teacher for the Blind 46, 47–62, 1958.
- Raymond J Corsini, The Dictionary of Psychology, Psychology Press (UK), 1999, ISBN 1-58391-028-X.
- M. Supa, M. Cotzin, and K. M. Dallenbach. "Facial Vision" - The Perception of Obstacles by the Blind. The American Journal of Psychology, April 1944.
- Cotzin and Dallenbach. "Facial Vision": The Role of Pitch and Loudness in the Location of Obstacles by the Blind. The American Journal of Psychology, October 1950.
- Griffin, Donald R., Echoes of Bats and Men, Anchor Press, 1959 (Science Study Series, Seeing With Sound Waves)
- White, J. C., Saunders, F. A., Scadden, L., Bach-y-Rita, P., & Collins, C. C. (1970). Seeing with the skin. Perception & Psychophysics, 7, 23-27.
- Rosenblum LD, Gordon MS, Jarquin L. (2000). "Echolocating distance by moving and stationary listeners". Ecol. Psychol. 12 (3): 181–206. doi:10.1207/S15326969ECO1203_1.
- Kish D. (1982). Evaluation of an echo-mobility training program for young blind people: Master's Thesis, University of Southern California (Thesis).
- Thaler L, Arnott SR, Goodale MA. (2011). "Neural correlates of natural human echolocation in early and late blind echolocation experts". PLoS ONE 6 (5): e20162. Bibcode:2011PLoSO...6E0162T. doi:10.1371/journal.pone.0020162. PMID 21633496.
- Bat Man, Reader's Digest Australia, 2012, p. 192.
- Kremer, William (12 September 2012). "Human echolocation: Using tongue-clicks to navigate the world". BBC. Retrieved September 12, 2012.
- World Access for the Blind
- Humans With Amazing Senses — ABC News.
- Moorhead, Joanna (January 27, 2007). "Seeing with sound". The Guardian (London).
- "How A Blind Teen 'Sees' With Sound". CBS News. July 19, 2006.
- The Boy Who Sees with Sound — People magazine
- Ben Underwood Website
- Scadden, Lawrence
- K. Warwick, B. Hutt, M. Gasson and I. Goodhew, "An attempt to extend human sensory capabilities by means of implant technology", Proc. IEEE International Conference on Systems, Man and Cybernetics, Hawaii, pp. 1663–1668, October 2005.