Algorithmic probability
In algorithmic information theory, algorithmic probability, also known as Solomonoff probability, is a mathematical method of assigning a prior probability to a given observation. It was invented by Ray Solomonoff in the 1960s. It is used in inductive inference theory and analyses of algorithms. In his general theory of inductive inference, Solomonoff uses the method together with Bayes' rule to obtain probabilities of prediction for an algorithm's future outputs.
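In the usual formulation (stated here as a sketch; the notation follows the standard Solomonoff convention rather than anything given in this text), the algorithmic probability of a finite binary string x is

\[
P(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-\ell(p)},
\]

where U is a universal prefix Turing machine, p ranges over programs that output x, and \ell(p) is the length of p in bits. Prefix-freeness keeps the sum bounded by Kraft's inequality, and shorter programs contribute exponentially more weight, which is the formal counterpart of Occam's razor.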
Algorithmic information theory
Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated ones), such as strings or any other data structure. A central result of the field is that computational incompressibility "mimics" (except for a constant that depends only on the chosen universal programming language) the relations or inequalities found in information theory.
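One concrete instance of this mimicry (the Kolmogorov-Levin chain rule, a standard result cited here for illustration rather than taken from this text) is the parallel between the chain rule for Shannon entropy and its algorithmic counterpart, which holds up to a logarithmic additive term:

\[
H(X,Y) = H(X) + H(Y \mid X)
\qquad\text{versus}\qquad
K(x,y) = K(x) + K(y \mid x) + O(\log K(x,y)).
\]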
Inquiry
An inquiry (also spelled enquiry in British English) is any process that has the aim of augmenting knowledge, resolving doubt, or solving a problem. A theory of inquiry is an account of the various types of inquiry and a treatment of the ways that each type of inquiry achieves its aim. Aristotle, in the Prior Analytics, described the deductive pattern at the core of classical inquiry: when three terms are so related to one another that the last is wholly contained in the middle and the middle is wholly contained in or excluded from the first, the extremes must admit of perfect syllogism.
Logical reasoning
Logical reasoning is a mental activity that aims to arrive at a conclusion in a rigorous way. It happens in the form of inferences or arguments by starting from a set of premises and reasoning to a conclusion supported by these premises. The premises and the conclusion are propositions, i.e., true or false claims about what is the case. Together, they form an argument. Logical reasoning is norm-governed in the sense that it aims to formulate correct arguments that any rational person would find convincing.
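A one-line illustration (the standard textbook schema, not drawn from this text): from the premises "if it rains, the street is wet" and "it rains", the conclusion "the street is wet" follows by modus ponens,

\[
P \rightarrow Q, \;\; P \;\;\vdash\;\; Q.
\]

The argument is valid because the conclusion cannot be false while both premises are true.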
Early modern philosophy
Early modern philosophy (also classical modern philosophy) is a period in the history of philosophy that overlaps with the beginning of the period known as modern philosophy. The early modern era of philosophy was a progressive movement of Western thought, exploring, through theory and discourse, such topics as mind and matter, the supernatural, and civil life. It succeeded the medieval era of philosophy. Early modern philosophy is usually thought to have occurred between the 16th and 18th centuries, though some philosophers and historians place the period slightly earlier.
Inductivism
Inductivism is the traditional and still commonplace philosophy of scientific method to develop scientific theories. Inductivism aims to neutrally observe a domain, infer laws from examined cases—hence, inductive reasoning—and thus objectively discover the sole naturally true theory of the observed. Inductivism's basis is, in sum, "the idea that theories can be derived from, or established on the basis of, facts".
Categorical proposition
In logic, a categorical proposition, or categorical statement, is a proposition that asserts or denies that all or some of the members of one category (the subject term) are included in another (the predicate term). The study of arguments using categorical statements (i.e., syllogisms) forms an important branch of deductive reasoning that began with the Ancient Greeks. Greek logicians such as Aristotle identified four primary distinct types of categorical proposition and gave them standard forms (now often called A, E, I, and O).
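For illustration, the four standard forms, with S the subject term and P the predicate term (the examples are generic textbook instances, not taken from this text):

A (universal affirmative): All S are P, e.g. "All birds are animals."
E (universal negative): No S are P, e.g. "No birds are reptiles."
I (particular affirmative): Some S are P, e.g. "Some birds are songbirds."
O (particular negative): Some S are not P, e.g. "Some birds are not songbirds."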
Critical rationalism
Critical rationalism is an epistemological philosophy advanced by Karl Popper on the basis that, if a statement cannot be logically deduced (from what is known), it might nevertheless be possible to logically falsify it. Following Hume, Popper rejected any inductive logic that is ampliative, i.e., any logic that can provide more knowledge than deductive logic. In other words, if we cannot assert a statement logically, we should at least try to falsify it logically, which led Popper to his falsifiability criterion.
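The underlying logical asymmetry can be stated in one line (the swan case is the standard illustration, not drawn from this text): no finite number of observed white swans verifies "all swans are white", but a single non-white swan a refutes it,

\[
S(a) \wedge \neg W(a) \;\vdash\; \neg\,\forall x\,\bigl(S(x) \rightarrow W(x)\bigr).
\]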
Base rate fallacy
The base rate fallacy, also called base rate neglect or base rate bias, is a type of fallacy in which people tend to ignore the base rate (e.g., general prevalence) in favor of the individuating information (i.e., information pertaining only to a specific case). Base rate neglect is a specific form of the more general extension neglect. It is also called the prosecutor's fallacy or defense attorney's fallacy when applied to the results of statistical tests (such as DNA tests) in the context of legal proceedings.
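A quick numerical sketch of the effect (the numbers and variable names are hypothetical, chosen only for illustration):

# Hypothetical screening scenario: a rare condition and a fairly good test.
prevalence = 0.001       # base rate: 1 in 1,000 people has the condition
sensitivity = 0.99       # P(positive test | condition)
false_positive = 0.05    # P(positive test | no condition)

# Bayes' rule: P(condition | positive test)
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive
posterior = prevalence * sensitivity / p_positive

print(f"P(condition | positive) = {posterior:.3f}")  # about 0.019, not 0.99

Reading the answer off the test's accuracy alone while ignoring the 0.001 base rate is exactly the fallacy: the posterior is under 2%, because false positives from the large unaffected population swamp the true positives.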
Grammar induction
Grammar induction (or grammatical inference) is the process in machine learning of learning a formal grammar (usually as a collection of rewrite rules or productions, or alternatively as a finite-state machine or automaton of some kind) from a set of observations, thus constructing a model which accounts for the characteristics of the observed objects. More generally, grammatical inference is the branch of machine learning where the instance space consists of discrete combinatorial objects such as strings, trees, and graphs.
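As a minimal sketch of the simplest such inference (illustrative only: this builds a prefix-tree acceptor from positive examples, the usual starting point of state-merging automaton-learning methods, and is not any specific published algorithm):

def build_prefix_tree_acceptor(positive_samples):
    """Build a trie-shaped finite automaton accepting exactly the given strings."""
    transitions = {}   # (state, symbol) -> state; state 0 is the initial state
    accepting = set()
    next_state = 1
    for word in positive_samples:
        state = 0
        for symbol in word:
            if (state, symbol) not in transitions:
                transitions[(state, symbol)] = next_state
                next_state += 1
            state = transitions[(state, symbol)]
        accepting.add(state)
    return transitions, accepting

def accepts(transitions, accepting, word):
    """Run the automaton on a word and report membership."""
    state = 0
    for symbol in word:
        if (state, symbol) not in transitions:
            return False
        state = transitions[(state, symbol)]
    return state in accepting

# Usage: infer an automaton from observed strings, then test membership.
trans, acc = build_prefix_tree_acceptor(["ab", "abb", "aab"])
print(accepts(trans, acc, "abb"))  # True: seen during learning
print(accepts(trans, acc, "ba"))   # False: no matching path

Real grammar-induction algorithms go further by merging states of this tree, generalizing the inferred grammar beyond the exact training strings.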