An earcon is a brief, distinctive sound that represents a specific event or conveys other information. Earcons are a common feature of computer operating systems and applications, ranging from a simple beep to indicate an error, to the customizable sound schemes of modern operating systems that indicate startup, shutdown, and other events.[1]

The name is a pun on the more familiar term icon in computer interfaces. Because icon is pronounced "eye-con" and refers to a visual symbol, D. A. Sumikawa coined "earcon" as its auditory equivalent in a 1985 article, 'Guidelines for the integration of audio cues into computer user interfaces'.[2]

The term is most commonly applied to sound cues in a computer interface, but examples of the concept occur in broadcast media such as radio and television:

  • The alert signal that indicates a message from the Emergency Broadcast System
  • The signature three-tone melody that identifies NBC in radio and television broadcasts

Earcons are generally synthesized tones or sound patterns. The similar term auditory icon refers to recorded everyday sounds that serve the same purpose.
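Since earcons are synthesized rather than recorded, even a short script can produce one. The following sketch (not drawn from any particular system; the frequencies and durations are illustrative assumptions) generates a simple two-tone rising earcon as a WAV file using only the Python standard library:

```python
import math
import struct
import wave

def make_earcon(path, freqs=(660, 880), tone_ms=80, rate=44100):
    """Write a short two-tone earcon (a rising pair of sine beeps) to a WAV file.

    The frequencies and 80 ms tone length are illustrative choices, not a
    standard; real earcons vary widely in pitch, rhythm, and timbre.
    """
    frames = bytearray()
    for freq in freqs:
        n = int(rate * tone_ms / 1000)
        for i in range(n):
            # Fade in/out over ~5 ms at each tone boundary to avoid clicks.
            env = min(1.0, i / (rate * 0.005), (n - i) / (rate * 0.005))
            sample = int(32767 * 0.4 * env * math.sin(2 * math.pi * freq * i / rate))
            frames += struct.pack("<h", sample)  # 16-bit signed little-endian PCM
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))

make_earcon("earcon.wav")
```

An auditory icon, by contrast, would simply play back a recorded everyday sound (a paper crumple for "delete", say) rather than synthesizing an abstract tone pattern like this.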

Use in assistive technologies

Assistive technologies for computing devices—such as screen readers, including ChromeOS's ChromeVox, Android's TalkBack, and Apple's VoiceOver—use earcons as a fast, convenient means of conveying contextual information to blind or visually impaired users about the interface they are navigating. In screen readers, earcons largely serve as auditory cues informing the user that they have selected a particular type of interface element, such as a button, hyperlink, or text input field.[3][4] They can also provide context about the current document or mode, such as whether a web page is loading.

Because they are brief and subtle, earcons enhance screen reader usage compared with much longer spoken cues: a short, distinctive beep when a button is selected can be much faster, and therefore more convenient, to hear than having speech synthesis say the word "button".[5]

Because earcons are non-speech sounds, users must learn to associate them with their meanings in order to benefit fully from them. To help with learning these associations, some screen readers also speak the meaning of each earcon, albeit toward the end of the element's full description. It is recommended that earcons be introduced early when learning to use a screen reader, so that habitual use makes the associations automatic and, eventually, subconscious.[4]
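The behavior described above can be sketched in a few lines. This is a minimal illustration, not any real screen reader's API: the element types, earcon file names, and function are hypothetical placeholders assumed for the example.

```python
# Hypothetical mapping from interface element types to earcon sound files.
EARCONS = {
    "button": "earcon_button.wav",
    "link": "earcon_link.wav",
    "text_input": "earcon_input.wav",
}

def announce(element_type, label, verbose=False):
    """Return the (earcon, speech) pair a screen reader might emit on focus.

    In verbose mode the element's role is also spoken, placed at the end of
    the description, which helps users learn each earcon's meaning.
    """
    earcon = EARCONS.get(element_type)
    speech = label
    if verbose and earcon is not None:
        speech = f"{label}, {element_type.replace('_', ' ')}"
    return earcon, speech
```

For a focused "Submit" button, `announce("button", "Submit", verbose=True)` would pair the button earcon with the speech "Submit, button"; once the user has internalized the earcon, verbose mode can be disabled and the earcon alone conveys the element's role.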

References

  1. ^ Thurrott, Paul (2009-03-08). "Paul Thurrott's SuperSite for Windows: Windows 7 Build 7048 Notes". Paul Thurrott's SuperSite for Windows. Archived from the original on 2009-04-13. Retrieved 2009-04-24.
  2. ^ Sumikawa, D.A. (1985). "Guidelines for the integration of audio cues into computer user interfaces". OSTI 5475406. {{cite journal}}: Cite journal requires |journal= (help)
  3. ^ Steele, Billy (10 February 2017). "Google makes its screen reader easier to use on Chromebooks". Engadget. Archived from the original on 2021-03-05. Retrieved 2023-01-28.
  4. ^ a b "iCons and Earcons: Critical but often overlooked tech skills". Perkins School for the Blind. Archived from the original on 2022-10-02. Retrieved 2023-01-28.
  5. ^ Dorigo, Martin Lukas; Harriehausen-Mühlbauer, Bettina; Stengel, Ingo; Dowland, Paul (2014). "Nonvisual Presentation, Navigation and Manipulation of Structured Documents on Mobile and Wearable Devices". Lecture Notes in Computer Science. Springer, Cham. 8547: 383–390. doi:10.1007/978-3-319-08596-8_59. ISBN 978-3-319-08595-1. Retrieved 2023-01-28 – via Springer Link.