Constraint-based grammars are perhaps best understood in contrast to generative grammars. Whereas a generative grammar enumerates the transformations, merges, movements, and deletions that can produce all well-formed sentences, a constraint-based grammar takes the opposite approach: it allows any structure that is not explicitly ruled out by a constraint.
"The grammar is nothing but a set of constraints that structures are required to satisfy in order to be considered well-formed." "A constraint-based grammar is more like a database or a knowledge representation system than it is like a collection of algorithms."
Examples of such grammars include:
- the non-procedural variant of Transformational grammar (TG) of George Lakoff, which formulates constraints on potential tree sequences
- Johnson and Postal’s formalization of Relational grammar (RG) (1980)
- Generalized phrase structure grammar (GPSG) in the variants developed by Gazdar et al. (1988), Blackburn et al. (1993), and Rogers (1997)
- Lexical functional grammar (LFG) in the formalization of Ronald Kaplan (1995)
- Head-driven phrase structure grammar (HPSG) in the formalization of King (1999)
- Constraint Handling Rules (CHR) grammars
- Pollard, Carl. "The nature of constraint-based grammar" (PDF). 11th Pacific Asian conference on language, information and computation.
- Müller, Stefan (2016). Grammatical theory: From transformational grammar to constraint-based approaches. Berlin: Language Science Press. pp. 490–491.
- Christiansen, Henning (2004). "CHR Grammars with multiple constraint stores". First Workshop on Constraint Handling Rules: Selected Contributions. Universität Ulm, Fakultät für Informatik.