Agent (economics)
In economics, an agent is an actor (more specifically, a decision maker) in a model of some aspect of the economy. Typically, every agent makes decisions by solving a well- or ill-defined optimization or choice problem. For example, buyers (consumers) and sellers (producers) are two common types of agents in partial equilibrium models of a single market. Macroeconomic models, especially dynamic stochastic general equilibrium models that are explicitly based on microfoundations, often distinguish households, firms, and governments or central banks as the main types of agents in the economy.
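The idea of an agent solving a choice problem can be made concrete with a minimal sketch. The consumer below, with a Cobb-Douglas utility function and a linear budget constraint, is a standard textbook illustration; the specific functional form and parameter values are assumptions for the example, not taken from the text above.

```python
# A consumer "agent" choosing quantities of two goods to maximize
# Cobb-Douglas utility u(x, y) = x**a * y**(1 - a) subject to the budget
# constraint px*x + py*y <= m. The closed-form solution spends the
# fraction a of income on good x and the rest on good y.

def consumer_choice(a, px, py, m):
    """Return the utility-maximizing bundle (x, y)."""
    x = a * m / px          # spend fraction a of income m on good x
    y = (1 - a) * m / py    # spend the remaining income on good y
    return x, y

x, y = consumer_choice(a=0.3, px=2.0, py=1.0, m=100.0)
print(x, y)  # 15.0 70.0
```

A producer agent would be modeled analogously, maximizing profit instead of utility.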
Herd behavior
Herd behavior is the behavior of individuals in a group acting collectively without centralized direction. Herd behavior occurs in animals in herds, packs, bird flocks, and fish schools, as well as in humans. Voting, demonstrations, riots, general strikes, sporting events, religious gatherings, everyday decision-making, judgement, and opinion-forming are all forms of human-based herd behavior. Raafat, Chater, and Frith proposed an integrated approach to herding, describing two key issues: the mechanisms of transmission of thoughts or behavior between individuals, and the patterns of connections between them.
Phase rule
In thermodynamics, the phase rule is a general principle governing "pVT" systems, whose thermodynamic states are completely described by the variables pressure (p), volume (V) and temperature (T), in thermodynamic equilibrium. If F is the number of degrees of freedom, C the number of components, and P the number of phases, then F = C − P + 2. The rule was derived by the American physicist Josiah Willard Gibbs in his landmark paper On the Equilibrium of Heterogeneous Substances, published in parts between 1875 and 1878.
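The rule F = C − P + 2 can be checked against the familiar case of pure water, a standard illustration (the example system is an assumption for illustration, not from the text above):

```python
# Gibbs phase rule: F = C - P + 2, where C is the number of components
# and P is the number of coexisting phases.

def degrees_of_freedom(components, phases):
    return components - phases + 2

# Pure water (C = 1): a single liquid phase leaves F = 2, so both T and p
# can vary independently. At the triple point three phases coexist and
# F = 0: the triple point is a unique (T, p) pair.
print(degrees_of_freedom(1, 1))  # 2
print(degrees_of_freedom(1, 3))  # 0
```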
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. A branch of applied mathematics, it lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
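Entropy, the key measure named above, can be computed directly from Shannon's definition H = −Σ p·log₂(p). A minimal sketch:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin toss
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely outcomes
print(entropy([0.9, 0.1]))   # ~0.47 bits: a heavily biased coin carries less information
```

Higher entropy means the outcome is less predictable, so each observation conveys more information on average.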
Herd
A herd is a social group of certain animals of the same species, either wild or domestic. The form of collective animal behavior associated with this is called herding. These animals are known as gregarious animals. The term herd is generally applied to mammals, and most particularly to the grazing ungulates that classically display this behaviour. Different terms are used for similar groupings in other species; in the case of birds, for example, the word is flocking, but flock may also be used for mammals, particularly sheep or goats.
Social choice theory
Social choice theory, or social choice, is a theoretical framework for analyzing how individual opinions, preferences, interests, or welfares can be combined to reach a collective decision or social welfare in some sense. Whereas choice theory is concerned with individuals making choices based on their preferences, social choice theory is concerned with how to translate the preferences of individuals into the preferences of a group. A non-theoretical example of a collective decision is enacting a law or set of laws under a constitution.
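One classic rule for translating individual rankings into a group ranking is the Borda count, sketched below (the Borda count is a standard example from the social choice literature; the specific ballots are hypothetical):

```python
from collections import defaultdict

def borda(ballots):
    """Borda count: with m candidates, a ballot awards m-1 points to its
    first choice, m-2 to its second, ..., and 0 to its last."""
    scores = defaultdict(int)
    for ballot in ballots:
        m = len(ballot)
        for rank, candidate in enumerate(ballot):
            scores[candidate] += m - 1 - rank
    return dict(scores)

# Three voters ranking candidates A, B, and C:
ballots = [["A", "B", "C"], ["B", "C", "A"], ["B", "A", "C"]]
print(borda(ballots))  # {'A': 3, 'B': 5, 'C': 1} -- B wins
```

Different aggregation rules (plurality, Borda, pairwise majority) can pick different winners from the same ballots, which is precisely the kind of question social choice theory studies.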
Information economics
Information economics or the economics of information is the branch of microeconomics that studies how information and information systems affect an economy and economic decisions. One application considers information embodied in certain types of commodities that are "expensive to produce but cheap to reproduce." Examples include computer software (e.g., Microsoft Windows), pharmaceuticals, and technical books. Once information is recorded "on paper, in a computer, or on a compact disc, it can be reproduced and used by a second person essentially for free."
Deep learning
Deep learning is part of a broader family of machine learning methods based on artificial neural networks with representation learning. The adjective "deep" in deep learning refers to the use of multiple layers in the network. Methods used can be supervised, semi-supervised, or unsupervised.
Google DeepMind
DeepMind Technologies Limited, doing business as Google DeepMind, is a British-American artificial intelligence research laboratory which serves as a subsidiary of Google. Founded in the UK in 2010, it was acquired by Google in 2014, becoming a wholly owned subsidiary of Google parent company Alphabet Inc. after Google's corporate restructuring in 2015. The company is based in London, with research centres in Canada, France, and the United States.
Fisher information
In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information. The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth).
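The definition "variance of the score" can be verified numerically. The sketch below uses a Bernoulli(p) model as an illustrative assumption: its score is d/dp log f(x; p) = x/p − (1 − x)/(1 − p), and the closed-form Fisher information is I(p) = 1/(p(1 − p)).

```python
import random

def score(x, p):
    """Score of a Bernoulli(p) observation: derivative of the log-likelihood."""
    return x / p - (1 - x) / (1 - p)

def fisher_information_mc(p, n=100_000, seed=0):
    """Monte Carlo estimate of Var[score] under X ~ Bernoulli(p)."""
    rng = random.Random(seed)
    samples = [score(1 if rng.random() < p else 0, p) for _ in range(n)]
    mean = sum(samples) / n
    return sum((s - mean) ** 2 for s in samples) / n

p = 0.3
print(fisher_information_mc(p))  # close to 1 / (0.3 * 0.7), about 4.76
```

The estimate converges to 1/(p(1 − p)) as n grows, matching the closed form; a larger Fisher information means the data constrain θ more tightly, which is the content of the Cramér-Rao bound.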