Cognitive linguistics is an interdisciplinary branch of linguistics, combining knowledge and research from cognitive science, cognitive psychology, neuropsychology and linguistics. Models and theoretical accounts of cognitive linguistics are considered psychologically real, and research in cognitive linguistics aims to help understand cognition in general; language is seen as a road into the human mind.
The roots of cognitive linguistics are in Noam Chomsky's 1959 critical review of B. F. Skinner's Verbal Behavior. Chomsky's rejection of behavioural psychology and his subsequent anti-behaviourist activity helped bring about a shift of focus from empiricism to mentalism in psychology under the new concepts of cognitive psychology and cognitive science.
Chomsky considered linguistics a subfield of cognitive science in the 1970s, but called his model transformational or generative grammar. Having been engaged with Chomsky in the linguistic wars, George Lakoff united in the early 1980s with Ronald Langacker and other advocates of neo-Darwinian linguistics in a so-called "Lakoff–Langacker agreement". It is suggested that they picked the name "cognitive linguistics" for their new framework to undermine the reputation of generative grammar as a cognitive science.
Consequently, there are three competing approaches that today consider themselves true representatives of cognitive linguistics. One is the Lakoffian–Langackerian brand with capitalised initials (Cognitive Linguistics). The second is generative grammar, while the third approach is proposed by scholars whose work falls outside the scope of the other two. They argue that cognitive linguistics should not be taken as the name of a specific selective framework, but as a whole field of scientific research that is assessed by its evidential rather than theoretical value.
Generative grammar functions as a source of hypotheses about language computation in the mind and brain. It is argued to be the study of 'the cognitive neuroscience of language'. Generative grammar studies behavioural instincts and the biological nature of cognitive-linguistic algorithms, providing a computational–representational theory of mind.
This in practice means that sentence analysis by linguists is taken as a way to uncover cognitive structures. It is argued that a random genetic mutation in humans caused syntactic structures to appear in the mind. On this view, the fact that people have language does not depend on its communicative purpose.
For a famous example, linguist Noam Chomsky argued that sentences of the type "Is the man who is hungry ordering dinner?" are so rare that it is unlikely that children will have heard them. Since they can nonetheless produce them, he further argued that the structure is not learned but acquired from an innate cognitive language component. Generative grammarians then took it as their task to discover the innate structures through introspection, in order to form a picture of the hypothesised language faculty.
Generative grammar promotes a modular view of the mind, considering language as an autonomous mind module. Thus, language is separated from mathematical logic to the extent that inference plays no role in language acquisition. The generative conception of human cognition is also influential in cognitive psychology and computer science.
Cognitive Linguistics (linguistics framework)
One of the approaches to cognitive linguistics is called Cognitive Linguistics, with capital initials, but it is also often spelled cognitive linguistics with all lowercase letters. This movement began in the early 1980s, when George Lakoff's metaphor theory was united with Ronald Langacker's Cognitive Grammar, with subsequent models of Construction Grammar developed by various authors. The union entails two different approaches to linguistic and cultural evolution: that of the conceptual metaphor, and that of the construction.
Cognitive Linguistics defines itself in opposition to generative grammar, arguing that language functions in the brain according to general cognitive principles. Lakoff's and Langacker's ideas are applied across sciences. In addition to linguistics and translation theory, Cognitive Linguistics is influential in literary studies, education, sociology, musicology, computer science and theology.
Conceptual metaphor theory
According to American linguist George Lakoff, metaphors are not just figures of speech, but modes of thought. Lakoff hypothesises that principles of abstract reasoning may have evolved from visual thinking and mechanisms for representing spatial relations that are present in lower animals. Conceptualisation is regarded as being based on the embodiment of knowledge, building on physical experience of vision and motion. For example, the 'metaphor' of emotion builds on downward motion while the metaphor of reason builds on upward motion, as in saying "The discussion fell to the emotional level, but I raised it back up to the rational plane." It is argued that language does not form an independent cognitive function but fully relies on other cognitive skills, which include perception, attention, motor skills, and visual and spatial processing. The same is said of various other cognitive phenomena, such as the sense of time:
- "In our visual systems, we have detectors for motion and detectors for objects/locations. We do not have detectors for time (whatever that could mean). Thus, it makes good biological sense that time should be understood in terms of things and motion." —George Lakoff
In Cognitive Linguistics, thinking is argued to be mainly automatic and unconscious. Cognitive linguists study the embodiment of knowledge by seeking expressions which relate to modal schemas. For example, in the expression "It is quarter to eleven", the preposition to represents a modal schema which is manifested in language as a visual or sensorimotoric 'metaphor'.
Cognitive and construction grammar
Constructions, as the basic units of grammar, are conventionalised form–meaning pairings which are comparable to memes as units of linguistic evolution. These are considered multi-layered. For example, idioms are higher-level constructions which contain words as middle-level constructions, and these may contain morphemes as lower-level constructions. It is argued that humans not only share the same body type, which provides common ground for embodied representations, but that constructions provide common ground for uniform expressions within a speech community. Like biological organisms, constructions have life cycles which are studied by linguists.
According to the cognitive and constructionist view, there is no grammar in the traditional sense of the word. What is commonly perceived as grammar is an inventory of constructions; a complex adaptive system; or a population of constructions. Constructions are studied in all fields of language research from language acquisition to corpus linguistics.
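The layered inventory described above can be illustrated with a toy sketch. The entries and glosses below are invented for illustration and do not follow any published constructionist formalism:

```python
# A toy inventory of constructions as form–meaning pairings at several
# levels. All entries and glosses are invented for illustration.
INVENTORY = {
    # lower-level constructions: morphemes
    "-ed": "past tense",
    # middle-level constructions: words
    "kick": "strike with the foot",
    "bucket": "open container",
    # higher-level constructions: idioms, whose meaning is not
    # composed from the meanings of their parts
    "kick the bucket": "die",
}

def interpret(phrase):
    """Prefer the largest stored construction; otherwise fall back to
    composing word-level constructions one by one."""
    if phrase in INVENTORY:
        return INVENTORY[phrase]
    return [INVENTORY.get(word, "?") for word in phrase.split()]

print(interpret("kick the bucket"))  # the idiom is stored as a unit
print(interpret("kick a bucket"))    # word-by-word composition
```

The point of the sketch is only that the idiom is stored whole in the inventory: looking it up as a unit yields a meaning that could not be assembled from its parts.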
Integrative cognitive linguistics
There is also a third approach to cognitive linguistics, which neither directly supports the modular (Generative Grammar) nor the anti-modular (Cognitive Linguistics) view of the mind. Proponents of the third view argue that, according to brain research, language processing is specialized although not autonomous from other types of information processing. Language is thought of as one of the human cognitive abilities, along with perception, attention, memory, motor skills, and visual and spatial processing, rather than being subordinate to them. Emphasis is laid on a cognitive semantics that studies the contextual–conceptual nature of meaning.
Cognitive perspective on natural language processing
Cognitive linguistics offers a scientific, first-principles direction for quantifying states of mind through natural language processing. As mentioned earlier, Cognitive Linguistics approaches grammar with a nontraditional view. Traditionally, grammar has been defined as a set of structural rules governing the composition of clauses, phrases and words in a natural language. From the perspective of Cognitive Linguistics, grammar is seen as the rules of arrangement of language which best serve communication of the experience of the human organism through its cognitive skills, which include perception, attention, motor skills, and visual and spatial processing. Such rules are derived from observing conventionalized pairings of form and meaning in order to understand sub-context in the evolution of language patterns. The cognitive approach of identifying sub-context, by observing what comes before and after each linguistic construct, grounds meaning in sensorimotoric embodied experience. Taken together, these two perspectives form the basis of approaches in computational linguistics designed to work through the symbol grounding problem, which posits that, for a computer, a word is merely a symbol, which stands for another symbol, and so on in an unending chain without grounding in human experience. The broad set of tools and methods of computational linguistics is available as natural language processing, or NLP. Cognitive linguistics adds a new set of capabilities to NLP: cognitive NLP methods enable software to analyze sub-context in terms of internal embodied experience.
The goal of natural language processing (NLP) is to enable a computer to "understand" the contents of text and documents, including the contextual nuances of the language within them. The perspective of traditional Chomskyan linguistics offers NLP three approaches to identifying and quantifying the literal contents, the who, what, where and when in text – in linguistic terms, the semantic meaning or semantics of the text. The perspective of cognitive linguistics offers NLP a direction for identifying and quantifying the contextual nuances, the why and how in text – in linguistic terms, the implied pragmatic meaning or pragmatics of the text.
The three NLP approaches to understanding literal semantics in text based on traditional linguistics are symbolic NLP, statistical NLP, and neural NLP. The first, symbolic NLP (1950s – early 1990s), is based on first principles and rules of traditional linguistics. The second, statistical NLP (1990s–2010s), builds on the first with a layer of human-curated and machine-assisted corpora for multiple contexts. The third, neural NLP (2010 onwards), builds on the earlier methods by leveraging advances in deep neural networks to automate the tabulation of corpora and parse models for multiple contexts in shorter periods of time. All three methods are used to power NLP techniques such as stemming and lemmatisation, in order to obtain statistically relevant listings of the who, what, where and when in text through named-entity recognition and topic modelling programs. The same methods have been applied with NLP techniques such as the bag-of-words model to obtain statistical measures of emotional context through sentiment analysis programs. The accuracy of a sentiment analysis system is, in principle, how well it agrees with human judgments. Because evaluation of sentiment analysis has become increasingly specialty-based, each implementation needs a separate training model and specialized human verification, raising inter-rater reliability issues. However, the accuracy is considered generally acceptable for use in evaluating emotional context at a statistical or group level.
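The bag-of-words approach to sentiment analysis mentioned above can be sketched as follows. This is a minimal illustration: the two word lists stand in for a sentiment lexicon and are invented assumptions, not a real resource.

```python
# Minimal bag-of-words sentiment scorer: each word is treated as an
# isolated symbol; word order and context are ignored, which is exactly
# what the bag-of-words model entails. The tiny lexicon is illustrative.
POSITIVE = {"good", "great", "excellent", "happy", "love"}
NEGATIVE = {"bad", "poor", "terrible", "sad", "hate"}

def bag_of_words(text):
    """Tokenise crudely and count word occurrences (the 'bag')."""
    counts = {}
    for token in text.lower().split():
        word = token.strip(".,!?;:")
        counts[word] = counts.get(word, 0) + 1
    return counts

def sentiment_score(text):
    """Positive minus negative word counts; the sign gives the polarity."""
    bag = bag_of_words(text)
    pos = sum(n for w, n in bag.items() if w in POSITIVE)
    neg = sum(n for w, n in bag.items() if w in NEGATIVE)
    return pos - neg

print(sentiment_score("The food was great and the service was excellent."))  # 2
print(sentiment_score("Terrible plot, bad acting."))  # -2
```

Because the model discards order, "not good" and "good" score identically; this is the kind of contextual nuance the cognitive approaches below aim to recover.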
Cognitive NLP is a developmental trajectory of NLP aimed at understanding contextual pragmatics in text by emulating intelligent behavior and apparent comprehension of natural language. It is a rules-based approach that assigns meaning to a word, phrase, sentence or piece of text based on the information presented before and after the piece of text being analyzed.
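A minimal sketch of that rules-based idea, assuming an invented sense inventory: the meaning assigned to an ambiguous word is chosen from the words that appear before and after it.

```python
# Toy rules-based disambiguation: the sense assigned to a word is the one
# whose cue words best overlap the context before and after it.
# The sense inventory below is an invented assumption, not a real resource.
SENSES = {
    "bank": {
        "financial institution": {"money", "loan", "deposit", "account"},
        "river edge": {"river", "water", "fishing", "shore"},
    },
}

def disambiguate(tokens, index, window=4):
    """Score each sense of tokens[index] by overlap with its context window."""
    senses = SENSES.get(tokens[index])
    if senses is None:
        return None  # word is not ambiguous in this toy inventory
    before = tokens[max(0, index - window):index]
    after = tokens[index + 1:index + 1 + window]
    context = set(before + after)
    return max(senses, key=lambda sense: len(senses[sense] & context))

tokens = "deposit money at the bank".split()
print(disambiguate(tokens, tokens.index("bank")))  # financial institution
```

With "he fished from the river bank" the same function picks "river edge", because the cue word "river" appears in the preceding window.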
The specific meaning of cognitive linguistics, the rightful use of the name, and the scientific status of the enterprise have all been called into question. Criticism includes an overreliance on introspective data, a lack of experimental testing of hypotheses, and little integration of findings from other fields of cognitive science. Some researchers go as far as to consider calling the field 'cognitive' at all a misnomer.
"It would seem to me that [cognitive linguistics] is the sort of linguistics that uses findings from cognitive psychology and neurobiology and the like to explore how the human brain produces and interprets language. In other words, cognitive linguistics is a cognitive science, whereas Cognitive Linguistics is not. Most of generative linguistics, to my mind, is not truly cognitive either."— Bert Peeters
There has been criticism regarding the brain-related claims of both Chomsky's generative grammar and Lakoff's Cognitive Linguistics. Both are said to advocate overly extreme views on the axis of modular versus general processing, while the empirical evidence points to language being partially specialized and interacting with other systems. To counter behaviorism, Chomsky postulated that language acquisition occurs inside an autonomous module, which he calls the language faculty, thus suggesting a very high degree of specialization of language in the brain. To offer an alternative to his view, Lakoff in turn postulated the opposite: that language acquisition is not specialized at all, because language does not constitute a cognitive capacity of its own but occurs in sensory domains such as vision and kinesthesis. According to the critical view, these ideas were motivated not by brain research but by a struggle for power in linguistics. Members of these frameworks are also said to have presented other researchers' findings as their own work. While this criticism is accepted for the most part, it is claimed that some of the research has nonetheless produced useful insights.
- Robinson, Peter (2008). Handbook of Cognitive Linguistics and Second Language Acquisition. Routledge. pp. 3–8. ISBN 978-0-805-85352-0.
- Peeters, Bert (1998). "Cognitive musings". Word. 49 (2): 225–237. doi:10.1080/00437956.1998.11673884.
- Schwarz-Friesel, Monika (2012). "On the status of external evidence in the theories of cognitive linguistics". Language Sciences. 34 (6): 656–664. doi:10.1016/j.langsci.2012.04.007.
- Greenwood, John D (1999). "Understanding the 'cognitive revolution' in psychology". Journal of the History of the Behavioral Sciences. 35 (1): 1–22. doi:10.1002/(SICI)1520-6696(199924)35:1<1::AID-JHBS1>3.0.CO;2-4. Retrieved 2020-02-22.
- Harris, Randy Allen (1995). The Linguistics Wars. Oxford: OUP. ISBN 978-0-19-983906-3.
- Peeters, Bert (2001). "Does cognitive linguistics live up to its name?". In Dirven, René (ed.). Language and Ideology, Vol.1: Theoretical Cognitive Approaches. John Benjamins. pp. 83–106. ISBN 978-90-272-9954-3.
- Marantz, Alec (2005). "Generative linguistics within the cognitive neuroscience of language". The Linguistic Review. 22 (2–4): 429–445. CiteSeerX 10.1.1.718.940. doi:10.1515/tlir.2005.22.2-4.429. S2CID 8727463.
- Boeckx, Cedric (2005). "Generative Grammar and modern cognitive science" (PDF). Journal of Cognitive Science. 6: 45–54. Retrieved 2020-06-01.
- Hauser, Mark D.; Yang, Charles; Berwick, Robert C.; Tattersall, Ian; Ryan, Michael J.; Watumull, Jeffrey; Chomsky, Noam; Lewontin, Richard C. (2014). "The mystery of language evolution". Frontiers in Psychology. 5: 401. doi:10.3389/fpsyg.2014.00401. PMC 4019876. PMID 24847300.
- Berwick, Robert C.; Chomsky, Noam (2015). Why Only Us: Language and Evolution. MIT Press. ISBN 978-0-262-03424-1.
- Pullum, Geoffrey; Scholz, Barbara (2002). "Empirical assessment of stimulus poverty arguments" (PDF). The Linguistic Review. 19 (1–2): 9–50. doi:10.1515/tlir.19.1-2.9. S2CID 143735248. Archived from the original (PDF) on 2021-02-03. Retrieved 2020-02-28.
- Perfors, Amy; Tenenbaum, Joshua; Regier, Terry (2006). "Poverty of the stimulus? A rational approach" (PDF). Proceedings of the Annual Meeting of the Cognitive Science Society. 28. ISSN 1069-7977. Archived from the original (PDF) on 2020-11-11. Retrieved 2020-02-28.
- Smith, Neil (2002). Chomsky: Ideas and Ideals (2nd ed.). Cambridge University Press. ISBN 0-521-47517-1.
- Sudkamp, Thomas A. (1997). Languages and Machines: An Introduction to the Theory of Computer Science. Addison-Wesley Longman. p. 569. ISBN 978-0-201-82136-9. Retrieved 2020-06-01.
- Croft, William; Cruse, Alan (2004). Cognitive Linguistics. Cambridge University Press. ISBN 978-0-511-80386-4.
- Harrison, Chloe; Nuttall, Louise; Stockwell, Peter; Yuan, Wenjuan (2014). "Introduction". In Harrison, Chloe; Nuttall, Louise; Stockwell, Peter; Yuan, Wenjuan (eds.). Cognitive Grammar in Literature. John Benjamins. pp. 1–16. ISBN 978-90-272-7056-6.
- Corni, F; Fuchs, H U; Dumont, E (2019). "Conceptual metaphor in physics education: roots of analogy, visual metaphors, and a primary physics course for student teachers". Journal of Physics: Conference Series. 1286 (GIREP-ICPE-EPEC 2017 Conference 3–7 July 2017): 012059. Bibcode:2019JPhCS1286a2059C. doi:10.1088/1742-6596/1286/1/012059.
- Cerulo, Karen A. (2019). "Embodied cognition: sociology's role in bridging mind, brain, and body". In Brekhus, Wayne H.; Ignatow, Gabe (eds.). The Oxford Handbook of Cognitive Sociology. Oxford University Press. pp. 81–100. doi:10.1093/oxfordhb/9780190273385.013.5. Retrieved 2020-05-31.
- Spitzer, Michael (2004). Metaphor and Musical Thought. University of Chicago Press. ISBN 0-226-76972-0.
- Mondal, Prakash (2009). "How language processing constrains (computational) natural language processing: a cognitive perspective" (PDF). 23rd Pacific Asia Conference on Language, Information and Computation: 365–374. Retrieved 2020-05-31.
- Feyaerts, Kurt; Boeve, Lieven (2018). "Religious metaphors at the crossroads between apophatical theology and Cognitive Linguistics: an interdisciplinary study". In Chilton, Paul; Kopytowska, Monika (eds.). Religion, Language, and the Human Mind. Oxford University Press. ISBN 978-0-19-063664-7.
- Lakoff, George (1990). "Invariance hypothesis: is abstract reasoning based on image-schemas?". Cognitive Linguistics. 1 (1): 39–74. doi:10.1515/cogl.1990.1.1.39. S2CID 144380802.
- Lakoff, George; Johnson, Mark (1980). Metaphors We Live By. University of Chicago Press. ISBN 978-0-226-46801-3.
- Lakoff, George; Johnson, Mark (1999). Philosophy in the flesh : the embodied mind and its challenge to Western thought. Basic Books. ISBN 0-465-05673-3.
- Ibarretxe-Antuñano, Iraide (2002). "MIND-AS-BODY as a Cross-linguistic Conceptual Metaphor". Miscelánea. 25 (1): 93–119. Retrieved 2020-07-15.
- Gibbs, R. W.; Colston, H. (1995). "The cognitive psychological reality of image schemas and their transformations". Cognitive Linguistics. 6 (4): 347–378. doi:10.1515/cogl.1995.6.4.347. S2CID 144424435.
- Luodonpää-Manni, Milla; Penttilä, Esa; Viimaranta, Johanna (2017). "Introduction". In Luodonpää-Manni, Milla; Viimaranta, Johanna (eds.). Empirical Approaches to Cognitive Linguistics: Analyzing Real-Life Data. Cambridge University Press. ISBN 978-1-4438-7325-3. Archived from the original on 2020-10-23. Retrieved 2020-06-30.
- Dahl, Östen (2001). "Grammaticalization and the life cycles of constructions". RASK – Internationalt Tidsskrift for Sprog og Kommunikation. 14: 91–134.
- Kirby, Simon (2013). "Transitions: the evolution of linguistic replicators". In Binder; Smith (eds.). The Language Phenomenon (PDF). The Frontiers Collection. Springer. pp. 121–138. doi:10.1007/978-3-642-36086-2_6. ISBN 978-3-642-36085-5. Retrieved 2020-03-04.
- Zehentner, Eva (2019). Competition in Language Change: the Rise of the English Dative Alternation. De Gruyter Mouton. ISBN 978-3-11-063385-6.
- MacWhinney, Brian (2015). "Introduction – language emergence". In MacWhinney, Brian; O'Grady, William (eds.). Handbook of Language Emergence. Wiley. pp. 1–31. ISBN 978-1-118-34613-6.
- Clark, Eve (2015). "Common ground". In MacWhinney, Brian; O'Grady, William (eds.). Handbook of Language Emergence. Wiley. pp. 1–31. ISBN 978-1-118-34613-6.
- Ellis, Nick C. (2011). "The emergence of language as a Complex Adaptive System". In Simpson, James (ed.). Routledge Handbook of Applied Linguistics. pp. 666–679. CiteSeerX 10.1.1.456.3740. ISBN 978-0-203-83565-4.
- Arbib, Michael A. (2008). "Holophrasis and the protolanguage spectrum". In Arbib, Michael A.; Bickerton, Derek (eds.). The Emergence of Protolanguage. pp. 666–679. ISBN 978-90-272-8782-3.
- Schwarz-Friesel, Monika (2008). Einführung in die Kognitive Linguistik. Dritte, aktualisierte und erweiterte Auflage. Francke. ISBN 978-3-8252-1636-8.
- Kjell (2019). "Semantic measures: Using natural language processing to measure, differentiate, and describe psychological constructs". Psychological Methods. 24 (1): 92–115. doi:10.1037/met0000191. PMID 29963879. S2CID 49642731.
- Vogt, Paul (2007). "Language evolution and robotics: issues on symbol grounding and language acquisition". In Artificial Cognition Systems. IGI Global. pp. 176–209.
- Goldberg, Yoav (2016). "A Primer on Neural Network Models for Natural Language Processing". Journal of Artificial Intelligence Research. 57: 345–420. arXiv:1807.10854. doi:10.1613/jair.4992. S2CID 8273530.
- Goodfellow, Ian; Bengio, Yoshua; Courville, Aaron (2016). Deep Learning. MIT Press.
- Roebuck, K. (2012-10-24). Sentiment Analysis: High-impact Strategies - What You Need to Know: Definitions, Adoptions, Impact, Benefits, Maturity, Vendors. ISBN 9781743049457.
- Karlgren, Jussi; Sahlgren, Magnus; Olsson, Fredrik; Espinoza, Fredrik; Hamfors, Ola (2012). "Usefulness of sentiment analysis". In European Conference on Information Retrieval. Springer Berlin Heidelberg. pp. 426–435.
- Karlgren, Jussi (2009). "Affect, appeal, and sentiment as factors influencing interaction with multimedia information". In Proceedings of Theseus/Image CLEF Workshop on Visual Information Retrieval Evaluation. pp. 8–11.
- Dąbrowska, Ewa (2016). "Cognitive Linguistics' seven deadly sins" (PDF). Cognitive Linguistics. 27 (4): 479–491. doi:10.1515/cog-2016-0059.
- Gibbs, Raymond W. Jr. (2013). "The real complexities of psycholinguistic research on metaphor". Language Sciences. 40: 45–52. doi:10.1016/j.langsci.2013.03.001.