Generative grammar, or generativism (/ˈdʒɛnərətɪvɪzəm/), is a linguistic theory that regards linguistics as the study of a hypothesised innate grammatical structure. It is a biological or biologistic modification of earlier structuralist theories of linguistics, deriving ultimately from glossematics. Generative grammar considers grammar to be a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. It is a system of explicit rules that may apply repeatedly to generate an indefinite number of sentences, which can be as long as one wants them to be. The difference from structural and functional models is that in generative grammar the object is base-generated within the verb phrase. This purportedly cognitive structure is thought of as being part of a universal grammar, a syntactic structure which is caused by a genetic mutation in humans.
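To make the idea of rule-governed generation concrete, here is a minimal sketch in Python. The rule set, toy lexicon, and function names are invented for illustration and are not taken from any particular published grammar; the point is only that the recursion through NP and PP lets the same rules apply repeatedly, so sentences of unbounded length can be generated, with the object NP expanded inside the VP.

```python
import random

# A toy set of rewrite (phrase-structure) rules in the generative style.
# Note the recursion in NP -> Det N PP and PP -> P NP, which allows the
# rules to apply repeatedly and produce sentences of unbounded length.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"], ["Det", "N", "PP"]],
    "VP":  [["V", "NP"]],          # the object is generated inside the VP
    "PP":  [["P", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["linguist"], ["sentence"], ["grammar"]],
    "V":   [["studies"], ["generates"]],
    "P":   [["about"], ["near"]],
}

def generate(symbol="S"):
    """Expand a symbol by repeatedly applying rewrite rules."""
    if symbol not in RULES:          # terminal word
        return [symbol]
    expansion = random.choice(RULES[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

if __name__ == "__main__":
    for _ in range(3):
        print(" ".join(generate()))
```

Running the sketch prints sentences such as "the linguist studies a grammar about the sentence"; longer outputs arise whenever the recursive NP and PP rules are chosen more than once.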
Generativists have created numerous theories to make the NP VP (NP) analysis work in natural-language description, that is, with the subject and the verb phrase appearing as independent constituents and the object placed within the verb phrase. A main point of interest remains how to appropriately analyse wh-movement and other cases in which the subject appears to separate the verb from its object. Although generativists claim this structure is cognitively real, neuroscience has found no evidence for it. In other words, generative grammar encompasses proposed models of linguistic cognition, but there is still no specific indication that these are correct. It has recently been argued that the success of large language models undermines key claims of generative syntax, because such models rest on markedly different assumptions, including gradient probability and memorized constructions, and outperform generative theories both in accounting for syntactic structure and in integration with cognition and neuroscience.
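The contrast at issue can be shown with labelled bracketings. The tuples below are hand-written for illustration only: in the declarative clause the object sits inside the VP, while in a wh-question the object NP surfaces at the front and the VP-internal position is a gap (written "__") that generative analyses link to the fronted phrase.

```python
# Hand-written labelled bracketings, (category, children...), for illustration.
declarative = ("S",
               ("NP", "Pat"),
               ("VP", ("V", "loves"), ("NP", "Chris")))

# In the wh-question the object appears at the front, separated from the
# verb; the VP-internal gap "__" marks where it would be base-generated.
wh_question = ("S",
               ("NP", "who"),
               ("S", ("NP", "Pat"),
                     ("VP", ("V", "loves"), ("NP", "__"))))

def show(node, depth=0):
    """Pretty-print a tree as indented labelled brackets."""
    if isinstance(node, str):
        print("  " * depth + node)
        return
    label, *children = node
    print("  " * depth + "[" + label)
    for child in children:
        show(child, depth + 1)
    print("  " * depth + "]")

show(declarative)
show(wh_question)
```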
There are a number of different approaches to generative grammar.
Linguistics is the scientific study of language. The modern-day scientific study of linguistics takes all aspects of language into account — i.e., the cognitive, the social, the cultural, the psychological, the environmental, the biological, the literary, the grammatical, the paleographical, and the structural. Linguistics is based on a theoretical as well as descriptive study of language, and is also interlinked with the applied fields of language studies and language learning, which entails the study of specific languages.
Lexical functional grammar (LFG) is a constraint-based grammar framework in theoretical linguistics. It posits two separate levels of syntactic structure: a phrase structure grammar representation of word order and constituency, and a representation of grammatical functions such as subject and object, similar to dependency grammar. The development of the theory was initiated by Joan Bresnan and Ronald Kaplan in the 1970s, in reaction to the theory of transformational grammar that was current in the late 1970s.
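As a rough illustration of the two levels, here is a sketch for the sentence "Pat loves Chris". The attribute names (PRED, SUBJ, OBJ, TENSE) follow common LFG practice, but the Python encoding is only a schematic rendering of the idea, not an implementation of the formalism.

```python
# c-structure: a phrase-structure tree recording word order and constituency
c_structure = ("S",
               ("NP", "Pat"),
               ("VP", ("V", "loves"), ("NP", "Chris")))

# f-structure: grammatical functions (subject, object), closer in spirit
# to a dependency-style analysis
f_structure = {
    "PRED": "love<SUBJ, OBJ>",
    "TENSE": "present",
    "SUBJ": {"PRED": "Pat"},
    "OBJ":  {"PRED": "Chris"},
}

print(c_structure)
print(f_structure)
```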
Deep structure and surface structure (also D-structure and S-structure, although those abbreviated forms are sometimes used with distinct meanings) are concepts used in linguistics, specifically in the study of syntax in the Chomskyan tradition of transformational generative grammar. The deep structure of a linguistic expression is a theoretical construct that seeks to unify several related structures. For example, the sentences "Pat loves Chris" and "Chris is loved by Pat" mean roughly the same thing and use similar words.
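To make that example concrete, the sketch below (hypothetical names, deliberately simplified) pairs one underlying predicate-argument structure with two surface realisations, an active and a passive one. It is meant only to illustrate the idea that distinct surface strings can share a deep structure, not to model any actual transformational machinery.

```python
# One shared underlying structure for the pair of sentences in the text.
deep_structure = {"PRED": "love", "AGENT": "Pat", "PATIENT": "Chris"}

def active(ds):
    """Surface realisation without the passive 'transformation'."""
    return f'{ds["AGENT"]} loves {ds["PATIENT"]}'

def passive(ds):
    """Surface realisation with the passive 'transformation'."""
    return f'{ds["PATIENT"]} is loved by {ds["AGENT"]}'

print(active(deep_structure))   # Pat loves Chris
print(passive(deep_structure))  # Chris is loved by Pat
```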
Generative language models (LMs) have become omnipresent across data science. For a wide variety of tasks, inputs can be phrased as natural language prompts for an LM, from whose output the solution can then be extracted. LM performance has consistently be ...
Association for Computing Machinery, 2024
This paper investigates the potential impact of deep generative models on the work of creative professionals. We argue that current generative modeling tools lack critical features that would make them useful creativity support tools, and introduce our own ...
2024
We present syntax rewriting rules that translate Scala 2 code into Scala 3. Two major syntactic changes are introduced: new control structure syntax and significant indentation. We describe the design and the implementation of these rules and evaluate thei ...