Feature selection
Feature selection is the process of selecting a subset of relevant features (variables, predictors) for use in model construction. Stylometry and DNA microarray analysis are two cases where feature selection is used. It should be distinguished from feature extraction. Feature selection techniques are used for several reasons: to simplify models so that they are easier for researchers and users to interpret, to shorten training times, to avoid the curse of dimensionality, to improve the compatibility of the data with a given class of learning models, and to encode inherent symmetries present in the input space.
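As an illustrative sketch only (not part of the original entry), the code below shows one common filter-style approach: scoring each feature independently and keeping the top-scoring ones with scikit-learn's SelectKBest. The iris dataset, the ANOVA F-test scoring function, and the choice of k=2 are assumptions made purely for the example.

```python
# A minimal filter-style feature-selection sketch; dataset and k are
# illustrative assumptions, not prescribed by the text above.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)           # 150 samples, 4 features

# Score each feature independently with an ANOVA F-test and keep the top 2.
selector = SelectKBest(score_func=f_classif, k=2)
X_reduced = selector.fit_transform(X, y)

print("original shape:", X.shape)           # (150, 4)
print("reduced shape:", X_reduced.shape)    # (150, 2)
print("selected feature indices:", selector.get_support(indices=True))
```

The reduced matrix keeps a subset of the original columns unchanged, which is the point of feature selection as opposed to feature extraction, where new derived features would be computed.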
Feature (computer vision)
In computer vision and image processing, a feature is a piece of information about the content of an image, typically about whether a certain region of the image has certain properties. Features may be specific structures in the image such as points, edges or objects. Features may also be the result of a general neighborhood operation or feature detection applied to the image. Other examples of features relate to motion in image sequences, or to shapes defined in terms of curves or boundaries between different image regions.
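As a hedged sketch of the idea, the code below detects edge features in a small synthetic image using only NumPy finite differences. The synthetic image, the threshold of 0.25, and the pure-NumPy gradient approach are illustrative assumptions; real pipelines typically use detectors from libraries such as OpenCV or scikit-image.

```python
# A minimal edge-feature detection sketch on a synthetic image.
import numpy as np

# Synthetic 64x64 grayscale image: a bright square on a dark background.
img = np.zeros((64, 64), dtype=float)
img[20:44, 20:44] = 1.0

# Simple gradient-based edge response: finite differences along y and x.
gy, gx = np.gradient(img)
edge_strength = np.hypot(gx, gy)

# Pixels whose gradient magnitude exceeds a threshold mark the square's
# boundary, i.e. edge features of the image.
edges = edge_strength > 0.25
print("number of edge pixels:", int(edges.sum()))
```

Here the "feature" is a local property (a strong intensity gradient) computed by a neighborhood operation, matching the entry's description of features as results of feature detection applied to the image.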
Falsifiability
Falsifiability is a deductive standard of evaluation of scientific theories and hypotheses, introduced by the philosopher of science Karl Popper in his book The Logic of Scientific Discovery (1934). A theory or hypothesis is falsifiable (or refutable) if it can be logically contradicted by an empirical test. Popper proposed falsifiability as the cornerstone solution to both the problem of induction and the problem of demarcation.
Turing test
The Turing test, originally called the imitation game by Alan Turing in 1950, is a test of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human. Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation was a machine, and all participants would be separated from one another.
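As a toy schematic only, not Turing's own procedure, the sketch below mimics the structure of the imitation game: an evaluator sees replies from two hidden respondents and must identify which one is the machine. The respondent functions, the prompt, and the random-guess evaluator are all assumptions made for illustration.

```python
# Toy schematic of the imitation-game setup described above.
import random

def machine_respondent(prompt: str) -> str:
    # Stand-in for a program designed to generate human-like replies.
    return "That's an interesting question. What do you think?"

def human_respondent(prompt: str) -> str:
    # Stand-in for the human conversation partner.
    return "Honestly, I'd have to think about that for a while."

def run_round(evaluator_guess) -> bool:
    # Hide which respondent is the machine; the evaluator sees only replies.
    respondents = {"A": machine_respondent, "B": human_respondent}
    if random.random() < 0.5:
        respondents = {"A": human_respondent, "B": machine_respondent}
    replies = {label: fn("Describe your favourite memory.")
               for label, fn in respondents.items()}
    guess = evaluator_guess(replies)          # evaluator names the machine
    return respondents[guess] is machine_respondent

# An evaluator that guesses at random is right about 50% of the time; a
# machine is said to do well on the test if evaluators cannot reliably
# beat that chance baseline.
wins = sum(run_round(lambda replies: random.choice(list(replies)))
           for _ in range(1000))
print("correct identifications:", wins, "/ 1000")
```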
Deontology
In moral philosophy, deontological ethics or deontology (from Greek: δέον + λόγος) is the normative ethical theory that the morality of an action should be based on whether that action itself is right or wrong under a series of rules and principles, rather than on the consequences of the action. It is sometimes described as duty-, obligation-, or rule-based ethics. Deontological ethics is commonly contrasted with consequentialism, utilitarianism, virtue ethics, and pragmatic ethics.
Normative ethics
Normative ethics is the study of ethical behaviour and the branch of philosophical ethics that investigates the questions that arise regarding how one ought to act, in a moral sense. Normative ethics is distinct from meta-ethics in that the former examines standards for the rightness and wrongness of actions, whereas the latter studies the meaning of moral language and the metaphysics of moral facts. Likewise, normative ethics is distinct from applied ethics in that the former is more concerned with who one ought to be rather than with the ethics of a specific issue.
Occam's razor
In philosophy, Occam's razor (also spelled Ockham's razor or Ocham's razor; novacula Occami) is the problem-solving principle that recommends searching for explanations constructed with the smallest possible set of elements. It is also known as the principle of parsimony or the law of parsimony (lex parsimoniae). Attributed to William of Ockham, a 14th-century English philosopher and theologian, it is frequently cited as Entia non sunt multiplicanda praeter necessitatem, which translates as "Entities must not be multiplied beyond necessity", although Occam never used these exact words.