Phonetics (from the Greek: φωνή, phōnē, 'sound, voice') is a branch of linguistics that comprises the study of the sounds of human speech, or, in the case of sign languages, the equivalent aspects of sign. It is concerned with the physical properties of speech sounds or signs (phones): their physiological production, acoustic properties, auditory perception, and neurophysiological status. Phonology, on the other hand, is concerned with the abstract, grammatical characterization of systems of sounds or signs.
The field of phonetics is a multilayered subject of linguistics that focuses on speech. In the case of oral languages there are three basic areas of study:
- Articulatory phonetics: the study of the production of speech sounds by the articulators and vocal tract of the speaker.
- Acoustic phonetics: the study of the physical transmission of speech sounds from the speaker to the listener.
- Auditory phonetics: the study of the reception and perception of speech sounds by the listener.
These areas are interconnected through the common medium of sound, described by properties such as frequency (perceived as pitch), amplitude (perceived as loudness), and harmonics.
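The acoustic quantities just mentioned can be made concrete with a short sketch. The following Python example computes the wavelength of a tone and the harmonic series of a fundamental; the 343 m/s speed of sound and the 120 Hz fundamental are illustrative values, not figures from the text:

```python
import math  # not strictly needed here, kept for extensions

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (illustrative)

def wavelength(frequency_hz: float) -> float:
    """Wavelength in metres of a tone with the given frequency."""
    return SPEED_OF_SOUND / frequency_hz

def harmonics(fundamental_hz: float, n: int) -> list:
    """First n harmonics (integer multiples) of a fundamental frequency."""
    return [fundamental_hz * k for k in range(1, n + 1)]

# A typical adult male voice has a fundamental near 120 Hz:
print(round(wavelength(120.0), 3))  # wavelength in metres
print(harmonics(120.0, 4))          # [120.0, 240.0, 360.0, 480.0]
```

Higher frequencies are heard as higher pitch, and the relative strength of the harmonics shapes the timbre that distinguishes one vowel from another.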
Phonetics was studied as early as the 3rd century BC in the Indian subcontinent, with Pāṇini's account of the place and manner of articulation of consonants in his treatise on Sanskrit. The major Indic alphabets today order their consonants according to Pāṇini's classification.
Modern phonetics begins with attempts—such as those of Joshua Steele (in Prosodia Rationalis, 1779) and Alexander Melville Bell (in Visible Speech, 1867)—to introduce systems of precise notation for speech sounds.
The study of phonetics grew quickly in the late 19th century, partly due to the invention of the phonograph, which allowed the speech signal to be recorded. Phoneticians were able to replay the speech signal several times and apply acoustic filters to it, and in doing so could more carefully deduce the acoustic nature of the speech signal.
Using an Edison phonograph, Ludimar Hermann investigated the spectral properties of vowels and consonants. It was in these papers that the term formant was first introduced. Hermann also played vowel recordings made with the Edison phonograph at different speeds in order to test Willis's and Wheatstone's theories of vowel production.
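Hermann's spectral investigations can be sketched in modern terms: a Fourier transform of a recorded signal exposes the spectral peaks (formants) he described. The toy Python example below analyses a synthetic two-peak "vowel"; the 500 Hz and 1500 Hz peaks, the sample rate, and all other parameters are illustrative assumptions, not measurements:

```python
import cmath
import math

SAMPLE_RATE = 8000
N = 800  # 0.1 s of signal, giving a 10 Hz frequency resolution

# Synthetic "vowel": two strong spectral peaks standing in for formants
# (500 Hz and 1500 Hz are rough F1/F2 values for an open central vowel).
signal = [math.sin(2 * math.pi * 500 * t / SAMPLE_RATE) +
          0.8 * math.sin(2 * math.pi * 1500 * t / SAMPLE_RATE)
          for t in range(N)]

def dft_magnitudes(x):
    """Naive discrete Fourier transform magnitude spectrum (first half)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

mags = dft_magnitudes(signal)
# Take the two largest spectral peaks and convert bin index to Hz:
peaks = sorted(range(len(mags)), key=lambda k: mags[k], reverse=True)[:2]
formants = sorted(k * SAMPLE_RATE / N for k in peaks)
print(formants)  # [500.0, 1500.0]
```

Real formant tracking works on recorded speech and typically uses windowed spectra or linear prediction, but the principle of locating spectral peaks is the same.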
Relation to phonology
In contrast to phonetics, phonology is the study of how sounds and gestures pattern in and across languages, relating such concerns with other levels and aspects of language. Phonetics deals with the articulatory and acoustic properties of speech sounds, how they are produced, and how they are perceived. As part of this investigation, phoneticians may concern themselves with the physical properties of meaningful sound contrasts or the social meaning encoded in the speech signal (socio-phonetics) (e.g. gender, sexuality, ethnicity, etc.). However, a substantial portion of research in phonetics is not concerned with the meaningful elements in the speech signal.
While it is widely agreed that phonology is grounded in phonetics, phonology is a distinct branch of linguistics, concerned with sounds and gestures as abstract units (e.g., distinctive features, phonemes, mora, syllables, etc.) and their conditioned variation (via, e.g., allophonic rules, constraints, or derivational rules). Phonology relates to phonetics via the set of distinctive features, which map the abstract representations of speech units to articulatory gestures, acoustic signals, and/or perceptual representations.
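The mapping from abstract units to distinctive features described above can be sketched as a simple data structure. The feature names and values below are a deliberately simplified, hypothetical specification for three English consonants, not a complete phonological analysis:

```python
# Hypothetical, simplified feature specifications (illustrative only).
FEATURES = {
    "p": {"voice": False, "labial": True, "nasal": False},
    "b": {"voice": True,  "labial": True, "nasal": False},
    "m": {"voice": True,  "labial": True, "nasal": True},
}

def contrasting_features(a: str, b: str) -> set:
    """Return the features whose values distinguish two phonemes."""
    return {f for f in FEATURES[a] if FEATURES[a][f] != FEATURES[b][f]}

print(contrasting_features("p", "b"))  # {'voice'}
print(contrasting_features("b", "m"))  # {'nasal'}
```

In this picture, each abstract feature (voicing, nasality) corresponds to an articulatory gesture and an acoustic cue, which is exactly the interface between phonology and phonetics that the feature set is meant to provide.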
Phonetics as a research discipline has three main branches:
- Articulatory phonetics is concerned with the articulation of speech: the position, shape, and movement of articulators or speech organs, such as the lips, tongue, and vocal folds.
- Acoustic phonetics is concerned with the acoustics of speech: the spectro-temporal properties of the sound waves produced by speech, such as their frequency, amplitude, and harmonic structure.
- Auditory phonetics is concerned with speech perception: the perception, categorization, and recognition of speech sounds, and the role of the auditory system and the brain in these processes.
Phonetic transcription is a system for transcribing sounds that occur in a language, whether oral or sign. The most widely known system of phonetic transcription, the International Phonetic Alphabet (IPA), provides a standardized set of symbols for oral phones. The standardized nature of the IPA enables its users to transcribe accurately and consistently the phones of different languages, dialects, and idiolects. The IPA is a useful tool not only for the study of phonetics, but also for language teaching, professional acting, and speech pathology.
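A broad IPA transcription can be sketched as a simple lookup from words to symbol strings. The toy lexicon below is purely illustrative; real transcription must handle allophonic variation, stress, and connected-speech processes rather than a fixed word list:

```python
# Toy lexicon mapping a few English words to broad IPA transcriptions.
# The entries are illustrative examples, not dictionary citations.
LEXICON = {
    "cat":  "kæt",
    "ship": "ʃɪp",
    "sing": "sɪŋ",
}

def transcribe(sentence: str) -> str:
    """Broad IPA transcription of each known word; '?' for unknown words."""
    return " ".join(LEXICON.get(w, "?") for w in sentence.lower().split())

print(transcribe("cat ship"))  # kæt ʃɪp
```

Note that the IPA symbols ʃ, ɪ, and ŋ each denote a single phone, even where English spelling uses two letters (sh, ng), which is precisely what makes the alphabet consistent across languages and orthographies.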
Applications of phonetics include:
- forensic phonetics: the use of phonetics (the science of speech) for forensic (legal) purposes.
- speech recognition: the analysis and transcription of recorded speech by a computer system.
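One classic building block of early template-based speech recognisers is dynamic time warping (DTW), which aligns two utterances spoken at different rates before comparing them. A minimal sketch with one-dimensional feature sequences (real systems compare multidimensional acoustic features such as spectral frames):

```python
# Minimal dynamic-time-warping distance between two feature sequences.
# Sequences are 1-D numbers here purely for illustration.
def dtw(a, b):
    inf = float("inf")
    # cost[i][j]: best alignment cost of a[:i] against b[:j]
    cost = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    cost[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame of a
                                 cost[i][j - 1],      # skip a frame of b
                                 cost[i - 1][j - 1])  # match both frames
    return cost[len(a)][len(b)]

template = [1, 3, 4, 3, 1]       # stored reference pattern
utterance = [1, 3, 3, 4, 3, 1]   # same pattern, spoken more slowly
print(dtw(template, utterance))  # 0.0, a perfect warped match
```

The warping step is what makes the comparison robust to differences in speaking rate, a central practical problem when matching recorded speech against stored templates.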
See also
- Experimental phonetics
- Index of phonetics articles
- International Phonetic Alphabet
- Speech processing
- Biometric word list
- Phonetics departments at universities
- ICAO spelling alphabet
- Buckeye Corpus
- SaypU (Spell As You Pronounce Universally)
External links
- The website of the Phonetic Sciences Laboratory of the Université de Montréal.
- The International Society of Phonetic Sciences (ISPhS)
- A little encyclopedia of phonetics, Peter Roach, Professor of Phonetics, University of Reading, UK. (pdf)
- The sounds and sound patterns of language U Penn
- IPA handbook
- Real-time MRI video of the articulation of speech sounds, from the USC Speech Articulation and kNowledge (SPAN) Group
- Extensive collection of phonetics resources on the Web (University of North Carolina)
- Phonetics and Phonology (University of Osnabrück)
- UCLA Phonetics Laboratory Archive Audio recordings illustrating phonetic structures from over 200 languages with phonetic transcriptions, with scans of original field notes where relevant