Human echolocation

From Wikipedia, the free encyclopedia

Human echolocation is the ability of humans to detect objects in their environment by sensing echoes from those objects, created by actively making sounds: for example, tapping their canes, lightly stomping their feet, snapping their fingers, or making clicking noises with their mouths. People trained to orient by echolocation can interpret the sound waves reflected by nearby objects, accurately identifying their location and size.


The term "echolocation" was coined by zoologist Donald Griffin in 1944; however, reports of blind humans being able to locate silent objects date back to 1749.[1] Human echolocation has been known and formally studied since at least the 1950s.[2] In earlier times, human echolocation was sometimes described as "facial vision" or "obstacle sense," as it was believed that the proximity of nearby objects caused pressure changes on the skin.[3][4][5] Only in the 1940s did a series of experiments performed in the Cornell Psychological Laboratory show that sound and hearing, rather than pressure changes on the skin, were the mechanisms driving this ability.[1] The field of human and animal echolocation was surveyed in book form as early as 1959.[6] See also White, et al. (1970)[7]

Many blind individuals passively use natural environmental echoes to sense details about their environment; however, others actively produce mouth clicks and are able to gauge information about their environment using the echoes from those clicks.[8] Both passive and active echolocation help blind individuals sense their environments.

Those who can see their environments often do not readily perceive echoes from nearby objects, due to an echo suppression phenomenon brought on by the precedence effect. However, with training, sighted individuals with normal hearing can learn to avoid obstacles using only sound, showing that echolocation is a general human ability.[9]


Vision and hearing are akin in that each interprets detections of reflected waves of energy. Vision processes light waves that travel from their source, bounce off surfaces throughout the environment and enter the eyes. Similarly, the auditory system processes sound waves as they travel from their source, bounce off surfaces and enter the ears. Both neural systems can extract a great deal of information about the environment by interpreting the complex patterns of reflected energy that their sense organs receive. In the case of sound these waves of reflected energy are referred to as echoes.

Echoes and other sounds can convey spatial data that are comparable in many respects to those conveyed by light.[10] A blind traveler using echoes can perceive very complex, detailed, and specific features of the world from distances far beyond the reach of the longest cane or arm. Echoes make information available about the nature and arrangement of objects and environmental features such as overhangs, walls, doorways and recesses, poles, ascending curbs and steps, planter boxes, pedestrians, fire hydrants, parked or moving vehicles, trees and other foliage, and much more. Echoes can give detailed information about location (where objects are), dimension (how big they are and their general shape), and density (how solid they are). Location is generally broken down into distance from the observer and direction (left/right, front/back, high/low). Dimension refers to the object's height (tall or short) and breadth (wide or narrow).
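The distance cue described above follows from simple physics: the echo's delay is the round-trip travel time of the sound. The following sketch is illustrative only (it does not come from the article's sources) and assumes sound travels at roughly 343 m/s in air at 20 °C:

```python
# Illustrative sketch: estimating distance from an echo delay.
# The echo covers the distance to the object and back, hence the division by two.

SPEED_OF_SOUND_M_S = 343.0  # assumed speed of sound in air at 20 °C

def distance_from_echo(delay_s: float) -> float:
    """Return the one-way distance (metres) to a reflecting object,
    given the round-trip delay (seconds) between click and echo."""
    return SPEED_OF_SOUND_M_S * delay_s / 2.0

# An echo arriving 10 ms after a click implies an object about 1.7 m away.
print(distance_from_echo(0.010))  # prints roughly 1.715 (metres)
```

At these speeds, echoes from nearby obstacles return within tens of milliseconds, which is why the perception feels immediate rather than like a distinct reflected sound.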

By understanding the interrelationships of these qualities, much can be perceived about the nature of an object or multiple objects. For example, an object that is tall and narrow may be recognized quickly as a pole. An object that is tall and narrow near the bottom while broad near the top would be a tree. Something that is tall and very broad registers as a wall or building. Something that is broad and tall in the middle, while being shorter at either end may be identified as a parked car. An object that is low and broad may be a planter, retaining wall, or curb. And finally, something that starts out close and very low but recedes into the distance as it gets higher is a set of steps. Density refers to the solidity of the object (solid/sparse, hard/soft). Awareness of density adds richness and complexity to one's available information. For instance, an object that is low and solid may be recognized as a table, while something low and sparse sounds like a bush; but an object that is tall and broad and very sparse is probably a fence.[11]
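The heuristics above amount to a coarse decision table over three qualities. As a toy sketch only (the rules and labels are simplified from the paragraph, not measured or drawn from the cited sources):

```python
# Toy rule table mirroring the paragraph's height/breadth/density heuristics.
# The three boolean qualities and the object labels are illustrative simplifications.

def guess_object(tall: bool, broad: bool, sparse: bool) -> str:
    """Map three coarse perceived qualities to a plausible object, following
    the article's examples (pole, fence, wall, bush, low solid objects)."""
    if tall and not broad:
        return "pole"                      # tall and narrow
    if tall and broad and sparse:
        return "fence"                     # tall, broad, and very sparse
    if tall and broad:
        return "wall or building"          # tall, broad, and solid
    if not tall and sparse:
        return "bush"                      # low and sparse
    return "table, planter, or curb"       # low and solid

print(guess_object(tall=True, broad=False, sparse=False))  # pole
print(guess_object(tall=True, broad=True, sparse=True))    # fence
```

A real echolocator of course integrates these cues continuously and probabilistically; the table only illustrates how the qualities combine.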

Brain areas associated with echolocation

Echo-related activity in the brain of an early-blind, trained echolocator is shown on the left. There is no activity evident in the brain of a sighted person not so trained (shown on the right) listening to the same echoes.

Some blind people, such as Ben Underwood, are skilled at echolocating silent objects simply by producing mouth clicks and listening to the returning echoes. Although few studies have been performed on the neural basis of human echolocation, those studies report activation of primary visual cortex during echolocation in blind expert echolocators.[1][12][13] The driving mechanism of this brain region remapping phenomenon is known as neuroplasticity.

In a 2014 study by Thaler and colleagues,[14] the researchers first made recordings of the clicks and their very faint echoes using tiny microphones placed in the ears of the blind echolocators as they stood outside and tried to identify different objects such as a car, a flag pole, and a tree. The researchers then played the recorded sounds back to the echolocators while their brain activity was being measured using functional magnetic resonance imaging. Remarkably, when the echolocation recordings were played back to the blind experts, not only did they perceive the objects based on the echoes, but they also showed activity in those areas of their brain that normally process visual information in sighted people, primarily primary visual cortex or V1. This result is surprising, because visual areas, as their name suggests, were thought to be active only during visual tasks. The brain areas that process auditory information were no more activated by sound recordings of outdoor scenes containing echoes than they were by sound recordings of outdoor scenes with the echoes removed. Importantly, when the same experiment was carried out with sighted people who did not echolocate, these individuals could not perceive the objects and there was no echo-related activity anywhere in the brain. This suggests that the cortex of blind echolocators is plastic and reorganizes such that primary visual cortex, rather than any auditory area, becomes involved in the computation of echolocation tasks.

Despite this evidence, the extent to which activation in the visual cortex in blind echolocators contributes to echolocation abilities is unclear.[9] As previously mentioned, sighted individuals have the ability to echolocate; however, they do not show comparable activation in visual cortex. This would suggest that sighted individuals use areas beyond visual cortex for echolocation.

Notable cases of human echolocation

Daniel Kish

Echolocation has been further developed by Daniel Kish, who works with the blind through the non-profit organization World Access for the Blind.[15] He leads blind teenagers hiking and mountain-biking through the wilderness, and teaches them how to navigate new locations safely, with a technique that he calls "FlashSonar".[16] Kish had his eyes removed at the age of 13 months due to retinal cancer. He learned to make palatal clicks with his tongue when he was still a child—and now trains other blind people in the use of echolocation and in what he calls "Perceptual Mobility".[17] Though at first resistant to using a cane for mobility, seeing it as a "handicapped" device, and considering himself "not handicapped at all", Kish developed a technique using his white cane combined with echolocation to further expand his mobility.[17]

Kish reports that "The sense of imagery is very rich for an experienced user. One can get a sense of beauty or starkness or whatever—from sound as well as echo."[16] He is able to distinguish a metal fence from a wooden one by the information returned by the echoes on the arrangement of the fence structures; in extremely quiet conditions, he can also hear the warmer and duller quality of the echoes from wood compared to metal.[16]

Thomas Tajo

Thomas Tajo was born in the remote Himalayan village of Chayang Tajo in the state of Arunachal Pradesh in north-east India and became blind around the age of 7 or 8 due to optic nerve atrophy. Tajo taught himself to echolocate. Today he lives in Belgium and works with Visioneers (World Access for the Blind) to impart independent navigation skills to blind individuals across the world. Tajo is also an independent researcher: he studies the cultural and biological evolutionary history of the senses and presents his findings at scientific conferences around the world.

Ben Underwood

Ben Underwood

Ben Underwood was a blind American who was born on January 26, 1992, in Riverside, California. He was diagnosed with retinal cancer at the age of two, and had his eyes removed at the age of three.[18]

He taught himself echolocation at the age of five, becoming able to detect the location of objects by making frequent clicking noises with his tongue. This case was explained in 20/20: Medical Mysteries.[19] He used it to accomplish such feats as running, playing basketball, riding a bicycle, rollerblading, playing football, and skateboarding.[20][21] He attended school at Edward Harris Jr. Middle School. Underwood's childhood eye doctor claimed that Underwood was one of the most proficient human echolocators. He died on January 19, 2009, a week before his 17th birthday, from retinal cancer, the same cancer that took his eyesight.

Tom De Witte

Tom De Witte was born in 1979 in Belgium with bilateral congenital glaucoma. He seemed set to become a successful flautist until he had to give up playing music in 2005. De Witte has been completely blind since 2009 due to additional problems with his eyes. He was taught echolocation by Daniel Kish and was given the nickname "Batman from Belgium" by the press.[22]

Dr. Lawrence Scadden

Scadden has written of his experiences with blindness.[23] He was not born blind, but lost his sight due to illness. As a child, he learned to use echolocation well enough to ride a bicycle in traffic. (His parents thought that he still had some sight remaining.) He later participated in experiments on facial vision.[7] In about 1998, he visited the Auditory Neuroethology Laboratory at the University of Maryland and was interviewed about his experience with facial vision. The researchers in the lab study bat echolocation and were aware of the Wiederorientierung phenomenon described by Griffin (1959),[6] in which bats, despite continuing to emit echolocation calls, use path integration in familiar acoustic space. Scadden indicated that he found echolocation required extra effort, and would not use it to navigate in familiar areas unless he was alert for obstacles, thus providing insight into the bats' behavior.

The Regional Alliance of Science, Engineering and Mathematics for Students with Disabilities (RASEM) and the Science Education for Students With Disabilities (SESD), a Special Interest Group of the National Science Teachers Association (NSTA) have created the Lawrence A. Scadden Outstanding Teacher Award of the Year for Students With Disabilities in his honor.[citation needed]

Lucas Murray

Lucas Murray (born c. 2002), from Poole, Dorset, was born blind and is one of the first British people to learn human echolocation. From the echo caused by clicking his tongue on the roof of his mouth, Murray can identify how close objects are and what they are made of. He was taught the technique by Daniel Kish. Murray was born with complex medical needs including septo-optic dysplasia. He was blind from birth, but this was not confirmed until he was five months old; at this stage his parents, Sarah and Iain, believed his blindness would cause him problems.[24] They later watched a documentary about a young American, Ben Underwood,[25] who had taught himself echolocation to a very high level. In the documentary, Daniel Kish, founder of the World Access for the Blind charity,[26][27] spoke about both echolocation and the importance of a long cane. Many months after seeing the documentary on television,[24] Sarah discovered that Kish would be visiting a Scottish charity called Visibility,[28] so she contacted him and asked if he could visit Lucas.[29] Kish, a 41-year-old blind Californian, together with Brian Bushway of World Access for the Blind, taught Lucas the basics of echolocation and the use of a suitable long cane over the course of four days in 2007.[30] Sarah says that echolocation and the "No Limits Approach" have given Lucas a "fantastic future". His parents have set up a charity called Common Sense, which aims to provide support for parents and carers of visually impaired children; it also offers long white canes to children in the UK through The Common Sense Children's Cane Bank.

In 2019 he enjoyed a week's work experience with South Western Railway.[31]

Using echolocation

Lucas' mother Sarah said that, at seven years old, his independence was improving almost every day, and he could join other children in sports such as rock climbing and basketball.[30][32] Lucas identifies the distance of objects by timing how long the echo from his clicks takes to return, and from the sound that bounces back he can also tell their density and shape,[27][29] much as bats use sonar to picture their surroundings.[34] Lucas is the first British pupil to receive a comprehensive programme in advanced echolocation.[30][33] Kish is reported to have said that Lucas' "mobility is amazing" and that he is the "best for his age in the UK".[27] However, Kish wrote to Lucas' parents and other reporters, unpublished, that he actually said, "Lucas' mobility is among the best in the U.K. for his age in my experience." Lucas says that he really likes the system.[30] Lucas also uses a long white cane to find objects near his feet. He uses an AmbuTech telescopic cane because it is light in weight and has a ceramic tip; at the proper length, it comes up to his nose when upright.[24]

Kevin Warwick

The scientist Kevin Warwick experimented with feeding ultrasonic pulses into the brain (via electrical stimulation from a neural implant) as an additional sensory input. In tests he was able to discern distance to objects accurately and to detect small movements of those objects.[35]

Juan Ruiz

Blind from birth, Juan Ruiz lives in Los Angeles, California. He appeared in the first episode of Stan Lee's Superhumans, titled "Electro Man". The episode showed him capable of riding a bicycle, avoiding parked cars and other obstacles, and identifying nearby objects. He entered and exited a cave, where he determined its length and other features.[citation needed]

In popular media

The 2017 video game Perception places the player in the role of a blind woman who must use echolocation to navigate the environment.[36]

In the 2012 film Imagine, the main character teaches echolocation to students at a clinic for the visually impaired. This unconventional method spurs a controversy but helps students explore the world.[37]

References

  1. ^ a b c Kolarik, Andrew J.; Cirstea, Silvia; Pardhan, Shahina; Moore, Brian C. J. (2014-04-01). "A summary of research investigating echolocation abilities of blind and sighted humans". Hearing Research. 310: 60–68. doi:10.1016/j.heares.2014.01.010. PMID 24524865. S2CID 21785505.
  2. ^ Richard L. Welsh, Bruce B. Blasch, Foundations of Orientation and Mobility, American Foundation for the Blind, 1997; which cites S. O. Myers and C. G. E. G. Jones, "Obstacle experiments: second report", Teacher for the Blind 46, 47–62, 1958.
  3. ^ Raymond J Corsini, The Dictionary of Psychology, Psychology Press (UK), 1999, ISBN 1-58391-028-X.
  4. ^ M. Supa, M. Cotzin, and K. M. Dallenbach. "Facial Vision" - The Perception of Obstacles by the Blind. The American Journal of Psychology, April 1944.
  5. ^ Cotzin and Dallenbach. "Facial Vision": The Role of Pitch and Loudness in the Location of Obstacles by the Blind. The American Journal of Psychology, October 1950.
  6. ^ a b Griffin, Donald R., Echoes of Bats and Men, Anchor Press, 1959 (Science Study Series, Seeing With Sound Waves)
  7. ^ a b White, J. C., Saunders, F. A., Scadden, L., Bach-y-Rita, P., & Collins, C. C. (1970). Seeing with the skin. Perception & Psychophysics, 7, 23-27.
  8. ^ Thaler, Lore (2015-11-25). "Using Sound to Get Around - Association for Psychological Science". Aps Observer. 28 (10). Retrieved 2016-04-22.
  9. ^ a b Wallmeier, Ludwig; Geßele, Nikodemus; Wiegrebe, Lutz (2013-10-22). "Echolocation versus echo suppression in humans". Proceedings of the Royal Society of London B: Biological Sciences. 280 (1769): 20131428. doi:10.1098/rspb.2013.1428. ISSN 0962-8452. PMC 3768302. PMID 23986105.
  10. ^ Rosenblum LD, Gordon MS, Jarquin L (2000). "Echolocating distance by moving and stationary listeners". Ecol. Psychol. 12 (3): 181–206. doi:10.1207/S15326969ECO1203_1. S2CID 30936808.
  11. ^ Kish D. (1982). Evaluation of an echo-mobility training program for young blind people: Master's Thesis, University of Southern California (Thesis).
  12. ^ Thaler L, Arnott SR, Goodale MA (2011). "Neural correlates of natural human echolocation in early and late blind echolocation experts". PLOS ONE. 6 (5): e20162. Bibcode:2011PLoSO...620162T. doi:10.1371/journal.pone.0020162. PMC 3102086. PMID 21633496.
  13. ^ Bat Man, Reader's Digest, June 2012, retrieved March 14, 2014
  14. ^ Thaler, L., Milne, J. L., Arnott, S. R., Kish, D., & Goodale, M. A. (2014). Neural correlates of motion processing through echolocation, source hearing, and vision in blind echolocation experts and sighted echolocation novices. Journal of Neurophysiology, 111(1), 112-127.
  15. ^ "World Access Online".
  16. ^ a b c Kremer, William (12 September 2012). "Human echolocation: Using tongue-clicks to navigate the world". BBC. Retrieved September 12, 2012.
  17. ^ a b Kish, Daniel (1995), Evaluation of an Echo-Mobility Program for Young Blind People, Master's thesis, San Bernardino, CA: Department of Psychology, California State University, p. 277, archived from the original on February 2, 2002
  18. ^ "Humans With Amazing Senses" — ABC News.
  19. ^ Moorhead, Joanna (January 27, 2007). "Seeing with sound". The Guardian. London.
  20. ^ "How A Blind Teen 'Sees' With Sound". CBS News. July 19, 2006.
  21. ^ The Boy Who Sees with Sound — People Magazine
  22. ^ J Vandermosten. "Straf verhaal: Blinde ziet door goed te horen". Gazet van Antwerpen.
  23. ^ Scadden, Lawrence. Surpassing Expectations: Life Without Sight.
  24. ^ a b c "Lucas learns echo technique to 'see'". Daily Echo. 7 October 2009. Retrieved 8 October 2009.
  25. ^ "Ben Underwood | Blind Boy Who Could See".
  26. ^ "".
  27. ^ a b c "Blind boy uses his ears to 'see'". BBC News. 5 October 2009. Retrieved 8 October 2009.
  28. ^ "Visibility Scotland - Listening and responding to people affected by sight loss across Scotland". Visibility Scotland.
  29. ^ a b "'Batboy' Lucas sees with his ears". The Press Association. Retrieved 8 October 2009.
  30. ^ a b c d Irvine, Chris (5 October 2009). "Seven year old blind boy uses echoes to see". The Daily Telegraph. London. Retrieved 8 October 2009.
  31. ^ Cartlidge, Sarah (19 May 2019). ""The best work experience ever": Blind teenager enjoys "phenomenal" placement". Bournemouth Echo. Retrieved 28 May 2020.
  32. ^ "Action for Blind People merged with RNIB". RNIB - See differently. 23 March 2017.
  33. ^ "Blind 7-Year-Old Boy Sees With His Ears". ABC News.
  34. ^ "Blind seven-year-old boy learns to "see" using his ears". Daily Mirror. 7 October 2009. Retrieved 8 October 2009.
  35. ^ Warwick K, Hutt B, Gasson M, and Goodhew I. "An attempt to extend human sensory capabilities by means of implant technology". Proc. IEEE International Conference on Systems, Man and Cybernetics, Hawaii, October 2005. pp. 1663–1668.
  36. ^ Skrebels, Joe (May 25, 2017). "Perception Review". IGN. Retrieved May 25, 2017.
  37. ^ Grierson, Tim. "Imagine". ScreenDaily. Retrieved June 21, 2017.
