Natural-language understanding
Natural-language understanding (NLU) or natural-language interpretation (NLI) is a subtopic of natural-language processing in artificial intelligence that deals with machine reading comprehension. Natural-language understanding is considered an AI-hard problem. There is considerable commercial interest in the field because of its application to automated reasoning, machine translation, question answering, news-gathering, text categorization, voice-activation, archiving, and large-scale content analysis.
Natural language generation
Natural language generation (NLG) is a software process that produces natural language output. A widely cited survey of NLG methods describes NLG as "the subfield of artificial intelligence and computational linguistics that is concerned with the construction of computer systems that can produce understandable texts in English or other human languages from some underlying non-linguistic representation of information". While it is widely agreed that the output of any NLG process is text, there is some disagreement about whether the inputs of an NLG system need to be non-linguistic.
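As a minimal illustration of producing text from a non-linguistic input (a toy sketch only; the record fields and template are invented for this example, and real NLG systems perform content selection, sentence planning, and surface realization rather than filling a single template):

```python
# Toy template-based NLG: a structured, non-linguistic record goes in,
# an English sentence comes out. Field names and wording are illustrative.

def realize_weather_report(record: dict) -> str:
    """Render a hypothetical weather observation as an English sentence."""
    template = ("On {date}, the temperature in {city} reached "
                "{temp_c} degrees Celsius under {sky} skies.")
    return template.format(**record)

observation = {"date": "3 May", "city": "Aberdeen", "temp_c": 14, "sky": "overcast"}
print(realize_weather_report(observation))
# On 3 May, the temperature in Aberdeen reached 14 degrees Celsius under overcast skies.
```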
Relevance (information retrieval)
In information science and information retrieval, relevance denotes how well a retrieved document or set of documents meets the information need of the user. Relevance may include concerns such as timeliness, authority, or novelty of the result. Concern with the problem of finding relevant information dates back at least to the first publication of scientific journals in the 17th century. The formal study of relevance began in the 20th century with the study of what would later be called bibliometrics.
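One rough way to make "how well a document meets an information need" concrete is topical similarity between query and document; the sketch below is a toy scoring function, not a description of any particular retrieval system, and simply ranks documents by cosine similarity of raw term-frequency vectors.

```python
# Toy relevance scoring: cosine similarity between term-frequency vectors of a
# query and a document. Real systems use weightings such as TF-IDF or BM25 and
# also weigh factors like timeliness, authority, and novelty.
import math
from collections import Counter

def cosine_relevance(query: str, document: str) -> float:
    q, d = Counter(query.lower().split()), Counter(document.lower().split())
    dot = sum(q[t] * d[t] for t in q.keys() & d.keys())
    norm = (math.sqrt(sum(v * v for v in q.values()))
            * math.sqrt(sum(v * v for v in d.values())))
    return dot / norm if norm else 0.0

docs = ["relevance in information retrieval", "a history of scientific journals"]
print(sorted(docs, key=lambda doc: cosine_relevance("information retrieval", doc), reverse=True))
```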
Covariance and contravariance of vectors
In physics, especially in multilinear algebra and tensor analysis, covariance and contravariance describe how the quantitative description of certain geometric or physical entities changes with a change of basis. In modern mathematical notation, the roles of the two terms are sometimes swapped. A simple illustrative case is that of a vector. For a vector, once a set of basis vectors has been defined, the components of that vector will always vary oppositely to any change in those basis vectors. A vector is therefore a contravariant tensor.
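To see this opposite variation in one worked line (a standard illustration, with λ a nonzero scale factor): if each basis vector is rescaled by λ, the components must be rescaled by 1/λ so that the vector itself is unchanged,

$$\mathbf{e}_i' = \lambda\,\mathbf{e}_i \quad\Longrightarrow\quad v'^{\,i} = \tfrac{1}{\lambda}\,v^{i}, \qquad \mathbf{v} = \sum_i v^{i}\mathbf{e}_i = \sum_i v'^{\,i}\mathbf{e}_i'.$$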
Text mining
Text mining, text data mining (TDM) or text analytics is the process of deriving high-quality information from text. It involves "the discovery by computer of new, previously unknown information, by automatically extracting information from different written resources." Written resources may include websites, books, emails, reviews, and articles. High-quality information is typically obtained by devising patterns and trends through means such as statistical pattern learning. According to Hotho et al.
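As a schematic sketch of "automatically extracting information from written resources" (illustrative only; the stop-word list and documents are made up, and real text-mining pipelines go far beyond word counting):

```python
# Toy text-mining step: surface frequent content words across a small corpus
# as a crude stand-in for "patterns and trends". Everything here is illustrative.
from collections import Counter

STOP_WORDS = {"the", "a", "of", "and", "is", "in", "to"}

def top_terms(documents, k=3):
    counts = Counter(
        word
        for doc in documents
        for word in doc.lower().split()
        if word not in STOP_WORDS
    )
    return counts.most_common(k)

reviews = ["the battery life is great", "great screen and great battery"]
print(top_terms(reviews))  # e.g. [('great', 3), ('battery', 2), ('life', 1)]
```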
Weight (representation theory)
In the mathematical field of representation theory, a weight of an algebra A over a field F is an algebra homomorphism from A to F, or equivalently, a one-dimensional representation of A over F. It is the algebra analogue of a multiplicative character of a group. The importance of the concept, however, stems from its application to representations of Lie algebras and hence also to representations of algebraic and Lie groups. In this context, a weight of a representation is a generalization of the notion of an eigenvalue, and the corresponding eigenspace is called a weight space.
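As a concrete reading of the eigenvalue analogy (a standard illustration rather than anything specific to a particular text): when A acts on a vector space V, a weight λ and its weight space can be written as

$$V_{\lambda} \;=\; \{\, v \in V : a \cdot v = \lambda(a)\, v \ \text{ for all } a \in A \,\}, \qquad \lambda : A \to F \ \text{an algebra homomorphism},$$

so for each fixed a the scalar λ(a) is an eigenvalue of the action of a, and V_λ lies inside the corresponding eigenspace.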
Syntactic ambiguity
Syntactic ambiguity, also called structural ambiguity, amphiboly or amphibology, is a situation where a sentence may be interpreted in more than one way due to ambiguous sentence structure. Syntactic ambiguity does not come from the range of meanings of single words, but from the relationship between the words and clauses of a sentence, and the sentence structure hidden behind the word order. In other words, a sentence is syntactically ambiguous when a reader or listener can reasonably interpret one sentence as having multiple possible structures.
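A classic illustration is "I saw the man with the telescope", where "with the telescope" can attach either to the verb phrase (the seeing was done with a telescope) or to the noun phrase (the man has a telescope). The sketch below uses NLTK's chart parser with a made-up toy grammar to recover both structures; it assumes the nltk package is installed.

```python
# One sentence, two parse trees: the PP "with the telescope" can modify either
# the verb "saw" or the noun phrase "the man". The grammar is a toy example.
import nltk

grammar = nltk.CFG.fromstring("""
S  -> NP VP
NP -> 'I' | Det N | NP PP
VP -> V NP | VP PP
PP -> P NP
Det -> 'the'
N  -> 'man' | 'telescope'
V  -> 'saw'
P  -> 'with'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("I saw the man with the telescope".split()):
    print(tree)  # prints two distinct trees, one per attachment of the prepositional phrase
```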
Vector-valued function
A vector-valued function, also referred to as a vector function, is a mathematical function of one or more variables whose range is a set of multidimensional vectors or infinite-dimensional vectors. The input of a vector-valued function could be a scalar or a vector (that is, the dimension of the domain could be 1 or greater than 1); the dimension of the function's domain has no relation to the dimension of its range. A common example of a vector-valued function is one that depends on a single real parameter t, often representing time, producing a vector v(t) as the result.
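A standard concrete instance (the circular helix, used here purely as an illustration) sends a single real parameter t, often interpreted as time, to a point of three-dimensional space,

$$\mathbf{r}(t) = \bigl(\cos t,\ \sin t,\ t\bigr), \qquad t \in \mathbb{R}, \quad \mathbf{r}(t) \in \mathbb{R}^{3},$$

so the domain is one-dimensional even though the range lies in a three-dimensional space.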
Row and column vectors
In linear algebra, a column vector with m elements is an $m \times 1$ matrix consisting of a single column of m entries, for example,
$$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}.$$
Similarly, a row vector is a $1 \times n$ matrix for some n, consisting of a single row of n entries,
$$\mathbf{a} = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix}.$$
(Throughout this article, boldface is used for both row and column vectors.) The transpose (indicated by T) of any row vector is a column vector, and the transpose of any column vector is a row vector:
$$\begin{bmatrix} x_1 & x_2 & \cdots & x_m \end{bmatrix}^{\mathsf T} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}
\quad\text{and}\quad
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}^{\mathsf T} = \begin{bmatrix} x_1 & x_2 & \cdots & x_m \end{bmatrix}.$$
The set of all row vectors with n entries in a given field (such as the real numbers) forms an n-dimensional vector space; similarly, the set of all column vectors with m entries forms an m-dimensional vector space.
Definitions of knowledge
Definitions of knowledge try to determine the essential features of knowledge. Closely related terms are conception of knowledge, theory of knowledge, and analysis of knowledge. Some general features of knowledge are widely accepted among philosophers, for example, that it constitutes a cognitive success or an epistemic contact with reality and that propositional knowledge involves true belief. Most definitions of knowledge in analytic philosophy focus on propositional knowledge or knowledge-that, as in knowing that Dave is at home, in contrast to knowledge-how (know-how) expressing practical competence.