Concept

Model-theoretic grammar

Model-theoretic grammars, also known as constraint-based grammars, contrast with generative grammars in the way they define sets of sentences: they state constraints on syntactic structure rather than providing operations for generating syntactic objects. A generative grammar provides a set of operations such as rewriting, insertion, deletion, movement, or combination, and is interpreted as a definition of the set of all and only the objects that these operations are capable of producing through iterative application. A model-theoretic grammar simply states a set of conditions that an object must meet, and can be regarded as defining the set of all and only the structures of a certain sort that satisfy all of the constraints. The approach applies the mathematical techniques of model theory to the task of syntactic description: a grammar is a theory in the logician's sense (a consistent set of statements), and the well-formed structures are the models that satisfy the theory. David E. Johnson and Paul M. Postal introduced the idea of model-theoretic syntax in their 1980 book Arc Pair Grammar.

The following is a sample of grammars falling under the model-theoretic umbrella:

- the non-procedural variant of Transformational grammar (TG) of George Lakoff, which formulates constraints on potential tree sequences
- Johnson and Postal's formalization of Relational grammar (RG) (1980)
- Generalized phrase structure grammar (GPSG) in the variants developed by Gazdar et al. (1988), Blackburn et al. (1993) and Rogers (1997)
- Lexical functional grammar (LFG) in the formalization of Ronald Kaplan (1995)
- Head-driven phrase structure grammar (HPSG) in the formalization of King (1999)
- Constraint Handling Rules (CHR) grammars
- the implicit model underlying The Cambridge Grammar of the English Language

One benefit of model-theoretic grammars over generative grammars is that they allow for gradience in grammaticality. A structure may deviate only slightly from a theory, or it may be highly deviant.
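To make the contrast concrete, here is a minimal sketch in Python (not taken from any of the formalisms listed above; the toy tree encoding, rule set, and constraint names are all hypothetical). A generative-style definition enumerates whatever the rewriting rules can derive, while a model-theoretic definition checks candidate structures against a set of constraints; counting the violated constraints gives a simple picture of gradient grammaticality.

```python
# A minimal sketch (not from the article above) contrasting a generative
# definition with a model-theoretic one.  The toy tree encoding, rule
# set, and constraint names are all hypothetical illustrations.

from itertools import product

# --- Generative style: the language is whatever the rewriting rules can
#     derive from the start symbol through iterated application.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["dogs"], ["cats"]],
    "VP": [["bark"], ["sleep"]],
}

def generate(symbol="S"):
    """Enumerate the strings derivable from `symbol` by rewriting."""
    if symbol not in RULES:                      # terminal symbol
        yield (symbol,)
        return
    for expansion in RULES[symbol]:
        for parts in product(*(generate(s) for s in expansion)):
            yield tuple(word for part in parts for word in part)

# --- Model-theoretic style: the language is the set of structures that
#     satisfy every constraint.  A tree is a (label, children) pair.
def has_subject(tree):
    return any(child[0] == "NP" for child in tree[1])

def has_predicate(tree):
    return any(child[0] == "VP" for child in tree[1])

def subject_precedes_predicate(tree):
    labels = [child[0] for child in tree[1]]
    return ("NP" not in labels or "VP" not in labels
            or labels.index("NP") < labels.index("VP"))

CONSTRAINTS = [has_subject, has_predicate, subject_precedes_predicate]

def violations(tree):
    """Gradience: list the constraints a candidate structure fails."""
    return [c.__name__ for c in CONSTRAINTS if not c(tree)]

good = ("S", [("NP", [("dogs", [])]), ("VP", [("bark", [])])])
bad  = ("S", [("VP", [("bark", [])])])           # missing its subject

print(sorted(generate()))    # all four derivable strings
print(violations(good))      # []               -> fully well-formed
print(violations(bad))       # ['has_subject']  -> mildly deviant
```

Running the sketch prints the four strings the rewriting rules can derive, an empty violation list for the well-formed tree, and a single violated constraint for the tree lacking a subject, illustrating how a constraint-based grammar can grade degrees of deviance rather than giving a yes/no verdict.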

Related publications (9)

Quand les artistes font forme en habitant ensemble. Usages, présences, imaginaires. [When artists give form by living together: uses, presences, imaginaries.]

Mathilde Coline Chénin

Taking root within a historical and transversal perspective of communities of artists at work, and of the relationship that they nourish with the places that host their creative activity, the present research intends to bring to light the grammars of "the ...
EPFL, 2022

Gramatron: Effective Grammar-Aware Fuzzing

Mathias Josef Payer

Fuzzers aware of the input grammar can explore deeper program states using grammar-aware mutations. Existing grammar-aware fuzzers are ineffective at synthesizing complex bug triggers due to: (i) grammars introducing a sampling bias during input generation ...
Association for Computing Machinery, 2021

On Linear Interpolation in the Latent Space of Deep Generative Models

Quentin Christian Becker, Mike Yan Michelis

The underlying geometrical structure of the latent space in deep generative models is in most cases not Euclidean, which may lead to biases when comparing interpolation capabilities of two models. Smoothness and plausibility of linear interpolations in lat ...
2021
Related concepts (1)
Transformational grammar
In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations (called transformations) to produce new sentences from existing ones. The method is commonly associated with American linguist Noam Chomsky.
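As a hypothetical illustration of the "defined operations" mentioned above (not drawn from any particular transformational analysis), the sketch below applies a single transformation, subject-auxiliary inversion, to a flat word list; actual transformational grammars operate on phrase-structure trees.

```python
# Hypothetical sketch of a single "transformation": a rule mapping one
# sentence to another (declarative -> yes/no question) by fronting the
# first auxiliary.  Real transformational analyses manipulate trees,
# not flat word lists; this is only an illustration of the idea.

AUXILIARIES = {"will", "can", "must"}

def subject_aux_inversion(words):
    """Return a new sentence with the first auxiliary moved to the front."""
    for i, w in enumerate(words):
        if w in AUXILIARIES:
            return [w] + words[:i] + words[i + 1:]
    return list(words)  # no auxiliary: the transformation does not apply

print(subject_aux_inversion(["the", "dog", "will", "bark"]))
# -> ['will', 'the', 'dog', 'bark']
```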
