Bayesian probability
Bayesian probability (ˈbeɪziən or ˈbeɪʒən) is an interpretation of the concept of probability in which, instead of the frequency or propensity of some phenomenon, probability is interpreted as a reasonable expectation representing a state of knowledge or as a quantification of a personal belief. The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses; that is, with propositions whose truth or falsity is unknown.
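The updating of such a degree of belief in light of new evidence is governed by Bayes' theorem; a standard statement, in notation not taken from the text above, is

\[
  P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)},
\]

where P(H) is the prior degree of belief in the hypothesis H, P(E | H) is the likelihood of the evidence E under H, P(E) is the overall probability of the evidence, and P(H | E) is the posterior degree of belief after observing E.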
Knowledge-based systems
A knowledge-based system (KBS) is a computer program that reasons and uses a knowledge base to solve complex problems. The term is broad and refers to many different kinds of systems. The one common theme that unites all knowledge-based systems is an attempt to represent knowledge explicitly, together with a reasoning component that allows new knowledge to be derived from it. Thus, a knowledge-based system has two distinguishing features: a knowledge base and an inference engine.
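As a minimal sketch of these two components (the facts and rules below are invented for illustration, not drawn from any particular system), a knowledge base of if-then rules and a forward-chaining inference engine might look like this in Python:

# Minimal sketch of a knowledge-based system: an explicit knowledge base
# (facts plus if-then rules) and a forward-chaining inference engine that
# keeps applying rules until no new facts can be derived.
# All facts and rules are hypothetical examples.

knowledge_base = {
    "facts": {"has_feathers", "lays_eggs"},
    "rules": [
        # (set of premises, conclusion)
        ({"has_feathers", "lays_eggs"}, "is_bird"),
        ({"is_bird", "cannot_fly"}, "is_flightless_bird"),
    ],
}

def forward_chain(kb):
    """Derive new facts by applying every rule whose premises are all known."""
    facts = set(kb["facts"])
    changed = True
    while changed:
        changed = False
        for premises, conclusion in kb["rules"]:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain(knowledge_base))
# {'has_feathers', 'lays_eggs', 'is_bird'} -- the second rule never fires
# because 'cannot_fly' is not in the knowledge base.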
Intuitive statistics
Intuitive statistics, or folk statistics, is the cognitive phenomenon in which organisms use data to make generalizations and predictions about the world. The data may be a small set of samples or training instances, which in turn contribute to inductive inferences about population-level properties, future data, or both. These inferences can involve revising hypotheses, or beliefs, in light of probabilistic data that inform and motivate future predictions.
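The kind of inference described above, from a handful of observations to a belief about a population-level property, can be made concrete with a toy Bayesian model; in the following Python sketch the prior and the observed counts are invented purely for illustration:

# Toy illustration of revising a belief about a population proportion
# after observing a small sample, using a Beta prior with a Binomial
# likelihood (conjugate updating). All numbers are hypothetical.

def posterior_mean(prior_a, prior_b, successes, failures):
    """Mean of the Beta posterior after folding in the observed counts."""
    a = prior_a + successes
    b = prior_b + failures
    return a / (a + b)

# Start from a uniform prior Beta(1, 1) and observe 4 successes in 5 trials.
print(posterior_mean(1, 1, successes=4, failures=1))  # 0.714..., up from 0.5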
Grammar induction
Grammar induction (or grammatical inference) is the process in machine learning of learning a formal grammar (usually as a collection of rewrite rules or productions, or alternatively as a finite-state machine or automaton of some kind) from a set of observations, thus constructing a model which accounts for the characteristics of the observed objects. More generally, grammatical inference is the branch of machine learning in which the instance space consists of discrete combinatorial objects such as strings, trees and graphs.
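As an illustrative sketch (the method shown is deliberately naive, not a specific algorithm from the literature), the following Python code builds a prefix-tree acceptor, a simple finite-state machine, from a set of observed strings and then tests membership:

# Naive grammar-induction sketch: build a prefix-tree acceptor (a trie
# viewed as a finite-state automaton) from observed strings. It accepts
# exactly the training strings; real grammatical inference methods also
# generalize beyond them. The sample strings are invented.

def build_prefix_tree(samples):
    """Return a trie whose marked paths spell out the observed strings."""
    root = {}
    for s in samples:
        node = root
        for ch in s:
            node = node.setdefault(ch, {})
        node["#accept"] = True  # mark the end of an observed string
    return root

def accepts(trie, s):
    node = trie
    for ch in s:
        if ch not in node:
            return False
        node = node[ch]
    return node.get("#accept", False)

observations = ["ab", "aab", "aaab"]
automaton = build_prefix_tree(observations)
print(accepts(automaton, "aab"))    # True
print(accepts(automaton, "aaaab"))  # False (this sketch does not generalize)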
Educational neuroscience
Educational neuroscience (or neuroeducation, a component of Mind, Brain, and Education) is an emerging scientific field that brings together researchers in cognitive neuroscience, developmental cognitive neuroscience, educational psychology, educational technology, education theory and other related disciplines to explore the interactions between biological processes and education. Researchers in educational neuroscience investigate the neural mechanisms of reading, numerical cognition and attention, and their attendant difficulties, including dyslexia, dyscalculia and ADHD, as they relate to education.
Algorithmic probability
In algorithmic information theory, algorithmic probability, also known as Solomonoff probability, is a method of assigning a probability to a given observation. It was invented by Ray Solomonoff in the 1960s and is used in the theory of inductive inference and in the analysis of algorithms. In particular, in his theory of induction, Solomonoff uses such a formulation to express the prior probability in Bayes' formula.
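A common way to state this prior (the notation here is the standard one from algorithmic information theory, not taken from the text above) assigns to a finite string x the probability that a universal prefix machine U, fed uniformly random bits as its program, produces output beginning with x:

\[
  M(x) \;=\; \sum_{p \,:\, U(p) = x\ast} 2^{-|p|},
\]

where the sum runs over the (minimal) programs p whose output starts with x and |p| denotes the length of p in bits. This quantity can then play the role of the prior probability of x in Bayes' formula for inductive inference.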
Transfinite induction
In mathematics, the terms transfinite induction and transfinite recursion refer to two related but distinct principles. Definitions by transfinite recursion make it possible to construct infinite objects; they generalize the definition of a sequence by recursion on the set N of natural numbers, by considering families indexed by an arbitrary infinite ordinal rather than only by the smallest of them, namely N itself, which is called ω when regarded as an ordinal number.
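A standard worked example of a definition by transfinite recursion (not taken from the text above) is ordinal addition, given by one clause for zero, one for successor ordinals and one for limit ordinals:

\[
  \alpha + 0 = \alpha, \qquad
  \alpha + (\beta + 1) = (\alpha + \beta) + 1, \qquad
  \alpha + \lambda = \sup_{\beta < \lambda}\,(\alpha + \beta) \ \text{ for limit ordinals } \lambda,
\]

so that the value at each ordinal is determined by the values at all smaller ordinals, exactly as ordinary recursion determines each term of a sequence from the preceding ones.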