# Dynamic syntax

Dynamic Syntax (DS) is a grammar formalism and linguistic theory whose overall aim is to explain the real-time twin processes of language understanding and production. Under the DS approach, syntactic knowledge is understood as the ability to incrementally analyse the structure and content of spoken and written language in context and in real time. While it posits representations similar to those used in Combinatory Categorial Grammar (CCG), it builds those representations left-to-right, word by word. It thus differs from other syntactic models, which generally abstract away from features of everyday conversation such as interruption, backtracking and self-correction. Moreover, it differs from other approaches in that it does not postulate an independent level of syntactic structure over words.

DS emerged in the late 1990s and early 2000s through the work of prominent figures such as Ruth Kempson, Ronnie Cann, Wilfried Meyer-Viol and Dov Gabbay. The first monograph-length work in the framework was released in 2001 under the title Dynamic Syntax: The Flow of Language Understanding. It was embedded in wider trends in linguistic thinking of the 20th century, especially in syntax, semantics, pragmatics and phonology. The Dynamics of Language by Ronnie Cann, Ruth Kempson and Lutz Marten followed on from the 2001 title and expanded the discussion and empirical coverage of the framework.

Subsequent years saw an expansion of the empirical coverage of the framework to modelling structures in Japanese, Korean, dialects of Modern Greek, Medieval Spanish and a variety of Bantu languages including Swahili, Rangi and siSwati. More recent work has also explored the way in which the framework can naturally be expanded to model dialogue.

## Theoretical assumptions

While most grammar formalisms characterise properties of strings of words, in Dynamic Syntax it is propositional structure which is characterised. Propositional structure is modelled through recourse to binary semantic trees. It is built up in a strictly incremental manner on a left-to-right basis and is represented through processes of tree growth. Under this framework, syntactic knowledge is considered to be the ability to parse and process strings in context. A consequence of this is that it is not just the final tree that is important for representational purposes, but all of the intermediate stages of the parsing/production process. Similarly, since the trees represent propositions, the same tree represents the same proposition across different languages, even though it may be expressed through different sentences.

The framework assumes a single level of representation. This contrasts with frameworks such as Lexical Functional Grammar in which multiple structures are posited. Similarly, no movement operations are considered necessary, unlike in Minimalism or other Generative approaches.

## Parts of the formalism

Dynamic Syntax comprises several core components: semantic formulae and a compositional calculus (epsilon calculus within a typed lambda calculus), trees (which determine the order of lambda application), and tree-building actions (lexical and computational actions).

### Semantic formulae and compositional calculus

The semantic formulae which classical Dynamic Syntax generates are a combination of epsilon calculus formulae and lambda calculus terms. (In recent years, DS-TTR has been developed alongside DS, in which Record Types from the formalism Type Theory with Records (TTR) are used; see Purver et al. (2011).[1])

The formulae are either simple first-order logic constants such as $john'$, predicate terms such as $run'(john')$, or functions such as $\lambda x.run'(x)$. Ordinary lambda calculus substitution ($\beta$-reduction) means a function can be applied to a simple term to return a predicate, such that $(\lambda x.run'(x))~john' = run'(john')$. The epsilon calculus extension to first-order logic is used to implement quantifiers, where $(\exists x)A(x) \equiv A(\epsilon x\,A)$; e.g. the string "a boy" may result in the formula $boy'(\epsilon x\,boy')$ being generated.
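The composition of these formulae can be illustrated with a minimal sketch, assuming a deliberately simplified representation (constants as strings, functions as Python lambdas, application as $\beta$-reduction); this is an illustration of the calculus, not an implementation of the DS formalism itself:

```python
# Constants are plain strings; predicates are built by function application.
john = "john'"

def run(x):
    """The function lambda x. run'(x): calling it performs beta-reduction."""
    return f"run'({x})"

# Beta-reduction: (lambda x. run'(x)) john' = run'(john')
print(run(john))  # run'(john')

# Epsilon term for "a boy": the witness εx boy' is substituted into the
# predicate, yielding boy'(εx boy') rather than an explicit quantifier.
boy_witness = "\u03b5x boy'"        # εx boy'
a_boy = f"boy'({boy_witness})"
print(a_boy)  # boy'(εx boy')
```

The point of the epsilon step is that the existential statement is carried by a term (the witness) inside the formula, rather than by a quantifier scoping over it.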

### Tree growth

One of the basic assumptions behind DS is that natural language syntax can be seen as the progressive accumulation of transparent semantic representations, with the overall goal being the construction of a logical propositional formula (a formula of type t). This process is driven by means of monotonic tree growth, representing the attempt to model the way information is processed in a time-linear, incremental, word-by-word manner. Tree growth is driven by means of requirements (indicated by the question mark (?)).[2]

Tree growth can take place in three ways: through computational rules, lexical input and pragmatic enrichment.

Computational rules involve an input and an output. These are considered to be universally available across languages. Given the right input, the corresponding computational rule can – although need not – apply. This contrasts with lexical input, which is supplied by individual lexical items and is therefore language-specific.
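Requirement-driven growth can be sketched in a few lines of code. This is a hypothetical toy model, not the DS rule system: node and requirement labels (`?Ty(t)`, `?Ty(e)`, `?Ty(e>t)`) follow the framework's usual type notation, but the rule shown here is a simplified stand-in for a computational rule that, given a node requiring a proposition, introduces argument and functor daughters:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    requirements: list = field(default_factory=list)   # e.g. ["?Ty(t)"]
    formula: Optional[str] = None                      # filled by lexical input
    daughters: dict = field(default_factory=dict)      # 0 = argument, 1 = functor

def introduce(node):
    """Toy computational rule: a node requiring Ty(t) grows an argument
    daughter requiring Ty(e) and a functor daughter requiring Ty(e>t)."""
    if "?Ty(t)" in node.requirements and not node.daughters:
        node.daughters[0] = Node(requirements=["?Ty(e)"])
        node.daughters[1] = Node(requirements=["?Ty(e>t)"])
    return node

root = introduce(Node(requirements=["?Ty(t)"]))
print(sorted(root.daughters))  # [0, 1]
```

The rule is optional in the sense the text describes: it fires only when its input condition (an outstanding `?Ty(t)` requirement) is met, and applying it monotonically extends the tree without destroying existing structure.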

### The language of trees

The language of representation in Dynamic Syntax consists of binary trees. These trees are underpinned by the Logic of Finite Trees (LOFT, Blackburn & Meyer-Viol 1994). LOFT is an expressive modal language that allows statements to be made about any tree node from the perspective of any tree node. LOFT uses two basic tree modalities, the up and down arrow relations, which correspond to the mother and daughter relations. Left nodes are addressed as 0 nodes and right nodes as 1 nodes. By convention, nodes on the left correspond to argument nodes, i.e. nodes in which arguments are represented, whereas right nodes correspond to functor nodes, i.e. nodes in which the various types of predicates are represented. The root node is given the tree node address 0 and is defined as the sole node that does not have a mother node.[2]
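The addressing scheme can be sketched as follows, under the assumption (consistent with the conventions above) that an address is the string of 0/1 choices from the root, with the root itself addressed "0"; the two functions stand in for the down- and up-arrow modalities:

```python
def daughter(address, which):
    """Down-arrow modality: move to the 0 (argument) or 1 (functor) daughter."""
    assert which in ("0", "1")
    return address + which

def mother(address):
    """Up-arrow modality: move to the mother node (undefined at the root)."""
    if address == "0":
        raise ValueError("the root node has no mother")
    return address[:-1]

# From the root, go to the functor daughter, then to its argument daughter.
addr = daughter(daughter("0", "1"), "0")
print(addr)          # 010
print(mother(addr))  # 01
```

Because every non-root address ends in the digit that records which daughter it is, the mother relation is simply the inverse of the daughter relation, mirroring the way the two LOFT modalities are duals of one another.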

## Conference series

The First Dynamic Syntax Conference was held at SOAS University of London in April 2017. Prior to this there was a meeting of mainly Dynamic Syntax practitioners at Ghent University in Belgium. The Second Dynamic Syntax Conference was held at the University of Edinburgh in 2018. The Third Dynamic Syntax Conference was held at the University of Malta in May 2019. The Fourth Dynamic Syntax Conference will be held at the University of Oxford in May 2020.

A Dynamic Syntax course was held at ESSLLI in 2019. A PhD-level course will be held at Bergen University in May 2020. Dynamic Syntax has been taught at institutions around the world, including SOAS, King's College London and the University of Essex in the UK, as well as at institutions in China.

## References

1. ^ Purver, M., Eshghi, A., & Hough, J. (2011, January). Incremental semantic construction in a dialogue system. In Proceedings of the Ninth International Conference on Computational Semantics (pp. 365-369). Association for Computational Linguistics.
2. ^ a b Chatzikyriakidis, Stergios; Gibson, Hannah (3 February 2017). "The Bantu-Romance-Greek connection revisited: Processing constraints in auxiliary and clitic placement from a cross-linguistic perspective". Glossa: A Journal of General Linguistics. 2 (1): 4. doi:10.5334/gjgl.135. Material was copied from this source, which is available under a Creative Commons Attribution 4.0 International License.

## Sources

• Blackburn, Patrick and Wilfried Meyer-Viol. 1994. Linguistics, Logic and Finite Trees. Logic Journal of the IGPL. 2(1): 3–29.
• Cann R, R Kempson, L Marten (2005) The dynamics of language. Oxford: Elsevier.
• Kempson R, W Meyer-Viol, D Gabbay (2001) Dynamic syntax. Oxford: Blackwell.
• Kempson, Ruth M., Eleni Gregoromichelaki, and Christine Howes, eds (2011). The dynamics of lexical interfaces. CSLI Publications/Center for the Study of Language and Information.