Dynamic semantics
Dynamic semantics is a framework in logic and natural language semantics that treats the meaning of a sentence as its potential to update a context. In static semantics, knowing the meaning of a sentence amounts to knowing when it is true; in dynamic semantics, knowing the meaning of a sentence means knowing "the change it brings about in the information state of anyone who accepts the news conveyed by it." Sentences are accordingly mapped to functions called context change potentials, which take an input context and return an output context.
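For concreteness, here is a minimal sketch of a context change potential, assuming contexts are modeled as sets of possible worlds (a common update-semantics simplification); the names `World`, `Context`, and `ccp` are illustrative, not drawn from any particular formalism.

```python
from typing import Callable, FrozenSet

World = str                         # a possible world, named by a label (illustrative)
Context = FrozenSet[World]          # an information state: the worlds still "live"
CCP = Callable[[Context], Context]  # a context change potential: context in, context out

def ccp(truth_set: FrozenSet[World]) -> CCP:
    """CCP of a simple (non-modal, non-anaphoric) sentence:
    keep only the worlds in which the sentence is true."""
    return lambda context: context & truth_set

# Example: start from an information state spanning three worlds, then accept
# "it is raining", which is true in w1 and w2 only.
initial: Context = frozenset({"w1", "w2", "w3"})
it_is_raining = ccp(frozenset({"w1", "w2"}))
updated = it_is_raining(initial)
print(sorted(updated))  # ['w1', 'w2'] -- the information state has shrunk
```

On this toy picture, accepting a sentence never adds worlds back; it only narrows the hearer's information state, which is what "updating a context" amounts to in the simplest cases.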
Discourse representation theory
In formal linguistics, discourse representation theory (DRT) is a framework for exploring meaning under a formal semantics approach. One of the main differences between DRT-style approaches and traditional Montagovian approaches is that DRT includes a level of abstract mental representations (discourse representation structures, DRS) within its formalism, which gives it an intrinsic ability to handle meaning across sentence boundaries. DRT was created by Hans Kamp in 1981.
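A minimal sketch of the idea behind a discourse representation structure, assuming a DRS is simply a list of discourse referents plus a list of conditions; the `DRS` class, its `merge` method, and the hand-resolved pronouns are illustrative simplifications, not Kamp's construction algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class DRS:
    """A discourse representation structure: discourse referents plus conditions."""
    referents: list = field(default_factory=list)   # e.g. ['x', 'y']
    conditions: list = field(default_factory=list)  # e.g. ['farmer(x)', 'owns(x, y)']

    def merge(self, other: "DRS") -> "DRS":
        """Fold the DRS of a new sentence into the discourse-level DRS."""
        return DRS(self.referents + other.referents,
                   self.conditions + other.conditions)

# Sentence 1: "A farmer owns a donkey." introduces referents x and y.
s1 = DRS(['x', 'y'], ['farmer(x)', 'donkey(y)', 'owns(x, y)'])
# Sentence 2: "He beats it." introduces no new referents; its pronouns are
# resolved (here by hand) to the accessible referents x and y.
s2 = DRS([], ['beats(x, y)'])

discourse = s1.merge(s2)
print(discourse.referents)   # ['x', 'y']
print(discourse.conditions)  # ['farmer(x)', 'donkey(y)', 'owns(x, y)', 'beats(x, y)']
```

Because the referents introduced by one sentence remain available when the next sentence is processed, the representation naturally carries meaning across sentence boundaries.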
Categorial grammar
Categorial grammar is a family of formalisms in natural language syntax that share the central assumption that syntactic constituents combine as functions and arguments. Categorial grammar posits a close relationship between syntax and semantic composition, since it typically treats syntactic categories as corresponding to semantic types. Categorial grammars were developed in the 1930s by Kazimierz Ajdukiewicz and in the 1950s by Yehoshua Bar-Hillel and Joachim Lambek.
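A minimal sketch of the basic function-application rules in the Ajdukiewicz/Bar-Hillel style, assuming categories are encoded as nested tuples of the form (result, slash, argument); the encoding and the lexical category assignments are illustrative.

```python
# Categories are either atomic strings ('NP', 'S') or triples
# (result, slash, argument), e.g. ('S', '\\', 'NP') for an intransitive verb.
def forward_apply(left, right):
    """A/B  B  =>  A   (a function looking rightward consumes its argument)."""
    if isinstance(left, tuple) and left[1] == '/' and left[2] == right:
        return left[0]
    return None

def backward_apply(left, right):
    """B  A\\B  =>  A   (a function looking leftward consumes its argument)."""
    if isinstance(right, tuple) and right[1] == '\\' and right[2] == left:
        return right[0]
    return None

# "Sam" : NP,   "sleeps" : S\NP,   "saw" : (S\NP)/NP
NP = 'NP'
sleeps = ('S', '\\', NP)
saw = (('S', '\\', NP), '/', NP)

vp = forward_apply(saw, NP)        # "saw Sam"   => S\NP
print(vp)                          # ('S', '\\', 'NP')
print(backward_apply(NP, sleeps))  # "Sam sleeps" => 'S'
print(backward_apply(NP, vp))      # "Sam saw Sam" => 'S'
```

The parallel with semantics is direct: a category of the form A/B or A\B corresponds to a function type, and syntactic cancellation mirrors semantic function application.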
Linguistics
Linguistics is the scientific study of language. The modern scientific study of language takes all its aspects into account — i.e., the cognitive, the social, the cultural, the psychological, the environmental, the biological, the literary, the grammatical, the paleographical, and the structural. Linguistics is based on a theoretical as well as descriptive study of language, and is also interlinked with the applied fields of language studies and language learning, which entail the study of specific languages.
Argument (linguistics)
In linguistics, an argument is an expression that helps complete the meaning of a predicate, the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments.
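As a toy illustration only (the `PredicateArgumentStructure` class and the example sentence are assumptions, not a standard representation), a predicate-argument structure can be written out as a predicate paired with its ordered arguments.

```python
from dataclasses import dataclass

@dataclass
class PredicateArgumentStructure:
    predicate: str   # the predicate, typically a content verb
    arguments: list  # its arguments, in order

# "Sam gave Mary a book": the three-place predicate 'give' takes three arguments.
pas = PredicateArgumentStructure('give', ['Sam', 'Mary', 'a book'])
print(f"{pas.predicate}({', '.join(pas.arguments)})")  # give(Sam, Mary, a book)
```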
Modality (linguistics)
In linguistics and philosophy, modality refers to the ways language can express various relationships to reality or truth. For instance, a modal expression may convey that something is likely, desirable, or permissible. Quintessential modal expressions include modal auxiliaries such as "could", "should", or "must"; modal adverbs such as "possibly" or "necessarily"; and modal adjectives such as "conceivable" or "probable".
Logic
Logic is the study of correct reasoning. It includes both formal and informal logic. Formal logic is the science of deductively valid inferences or logical truths. It studies how conclusions follow from premises due to the structure of arguments alone, independent of their topic and content. Informal logic is associated with informal fallacies, critical thinking, and argumentation theory. It examines arguments expressed in natural language, whereas formal logic uses a formal language.
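A small sketch of what "valid in virtue of structure alone" amounts to for propositional argument forms, assuming a brute-force truth-table check; the `valid` helper and the encoding of premises as Python lambdas are illustrative.

```python
from itertools import product

def valid(premises, conclusion, variables):
    """An argument form is deductively valid iff no assignment of truth values
    to its variables makes every premise true and the conclusion false."""
    for values in product([True, False], repeat=len(variables)):
        v = dict(zip(variables, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False
    return True

# Modus ponens: from "p -> q" and "p", infer "q" -- valid.
premises = [lambda v: (not v['p']) or v['q'],   # p -> q (material conditional)
            lambda v: v['p']]                   # p
conclusion = lambda v: v['q']                   # q
print(valid(premises, conclusion, ['p', 'q']))  # True

# Affirming the consequent: from "p -> q" and "q", infer "p" -- invalid.
bad_premises = [lambda v: (not v['p']) or v['q'], lambda v: v['q']]
print(valid(bad_premises, lambda v: v['p'], ['p', 'q']))  # False
```

The check never looks at what "p" or "q" are about; validity depends only on the pattern of the argument, which is the point of the contrast with informal logic.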
Anaphora (linguistics)
In linguistics, anaphora (/əˈnæfərə/) is the use of an expression whose interpretation depends upon another expression in context (its antecedent or postcedent). In a narrower sense, anaphora is the use of an expression that depends specifically upon an antecedent expression and thus is contrasted with cataphora, which is the use of an expression that depends upon a postcedent expression. The anaphoric (referring) term is called an anaphor. For example, in the sentence Sally arrived, but nobody saw her, the pronoun her is an anaphor, referring back to the antecedent Sally.
Conditional sentence
Conditional sentences are natural language sentences that express that one thing is contingent on something else, e.g. "If it rains, the picnic will be cancelled." They are so called because the impact of the main clause of the sentence is conditional on the dependent clause. A full conditional thus contains two clauses: a dependent clause called the antecedent (or protasis or if-clause), which expresses the condition, and a main clause called the consequent (or apodosis or then-clause) expressing the result.
Denotation
In linguistics and philosophy, the denotation of an expression is its literal meaning. For instance, the English word "warm" denotes the property of being warm. Denotation is contrasted with other aspects of meaning including connotation. For instance, the word "warm" may evoke calmness or coziness, but these associations are not part of the word's denotation. Similarly, an expression's denotation is separate from pragmatic inferences it may trigger.
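A toy sketch of one common way to model a denotation extensionally, as a function from entities to truth values; the temperature data and the 30-degree threshold are assumptions made up purely for illustration.

```python
# The denotation of "warm" modeled extensionally as a function from entities
# to truth values (the characteristic function of the set of warm things).
temperatures = {'coffee': 60, 'ice': 0, 'soup': 45}  # toy data, in degrees Celsius

def warm(entity: str) -> bool:
    """[[warm]]: true of an entity iff it counts as warm (here: above 30 degrees)."""
    return temperatures[entity] > 30

print(warm('coffee'))  # True
print(warm('ice'))     # False
# Connotations such as coziness are not represented anywhere in this model:
# only the literal condition for the predicate to hold of an entity is captured.
```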