Neoclassical economics is an approach to economics in which the production, consumption, and valuation (pricing) of goods and services are observed as driven by the supply and demand model. According to this line of thought, the value of a good or service is determined through a hypothetical maximization of utility by income-constrained individuals and of profits by firms facing production costs and employing available information and factors of production.
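As a toy sketch of that maximization, the code below solves a hypothetical consumer problem with a Cobb-Douglas utility function and a linear budget constraint; the preference parameter, prices, and income are made-up values, not drawn from the text.

```python
# Hypothetical consumer problem: maximize U(x, y) = x**a * y**(1 - a)
# subject to the budget constraint px * x + py * y = m.
a, px, py, m = 0.4, 2.0, 5.0, 100.0

def utility(x, y):
    return x ** a * y ** (1 - a)

# Analytic Cobb-Douglas demands: a share a of income goes to x, 1 - a to y.
x_star, y_star = a * m / px, (1 - a) * m / py

# A coarse grid search over affordable bundles agrees with the analytic optimum.
best_bundle, best_u = None, -1.0
for i in range(1, 5000):
    x = i * 0.01                  # candidate quantity of the first good
    y = (m - px * x) / py         # spend the remaining budget on the second good
    if y > 0 and utility(x, y) > best_u:
        best_bundle, best_u = (x, y), utility(x, y)

print("analytic optimum:", (x_star, y_star), utility(x_star, y_star))
print("grid-search optimum:", best_bundle, best_u)
```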
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
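As a small illustration of entropy as a quantitative measure, here is a sketch that computes the Shannon entropy of a discrete distribution in bits; the example distributions (a fair and a biased coin) are made up for illustration.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = -sum p(x) * log2 p(x) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per outcome; a biased coin carries less.
print(entropy_bits([0.5, 0.5]))   # 1.0
print(entropy_bits([0.9, 0.1]))   # ~0.469
```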
Information economics or the economics of information is the branch of microeconomics that studies how information and information systems affect an economy and economic decisions. One application considers information embodied in certain types of commodities that are "expensive to produce but cheap to reproduce." Examples include computer software (e.g., Microsoft Windows), pharmaceuticals, and technical books. Once information is recorded "on paper, in a computer, or on a compact disc, it can be reproduced and used by a second person essentially for free."
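A toy arithmetic sketch of this cost structure, with made-up numbers: a large fixed cost for producing the first copy and a near-zero marginal cost for each further copy, so the average cost per copy falls rapidly with volume.

```python
# Hypothetical information good: high first-copy cost, near-zero reproduction cost.
fixed_cost = 1_000_000.0      # cost of producing the first copy
marginal_cost = 0.01          # cost of each additional copy

for copies in (1, 1_000, 1_000_000):
    average_cost = (fixed_cost + marginal_cost * copies) / copies
    print(copies, round(average_cost, 2))   # 1000000.01, 1000.01, 1.01
```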
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
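To make the link between mutual information and entropy concrete, here is a sketch that computes I(X; Y) for two binary variables from an assumed joint distribution, both directly from the definition and as H(X) + H(Y) - H(X, Y); the joint probabilities are made up for illustration.

```python
import math

# Assumed joint distribution p(x, y) for two binary variables (rows: x, columns: y).
p_xy = [[0.4, 0.1],
        [0.1, 0.4]]

p_x = [sum(row) for row in p_xy]            # marginal distribution of X
p_y = [sum(col) for col in zip(*p_xy)]      # marginal distribution of Y

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Direct definition: I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
mi = sum(
    p_xy[i][j] * math.log2(p_xy[i][j] / (p_x[i] * p_y[j]))
    for i in range(2) for j in range(2) if p_xy[i][j] > 0
)

# Equivalent entropy form: I(X;Y) = H(X) + H(Y) - H(X,Y).
joint = [p for row in p_xy for p in row]
mi_from_entropy = H(p_x) + H(p_y) - H(joint)

print(mi, mi_from_entropy)   # both ~0.278 bits
```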
Game studies, also known as ludology (from ludus, "game", and -logia, "study", "research"), is the study of games, the act of playing them, and the players and cultures surrounding them. It is a field of cultural studies that deals with all types of games throughout history. This field of research draws on the methods of, at least, folkloristics and cultural heritage, sociology, and psychology, while examining aspects of the design of the game, the players in the game, and the role the game plays in its society or culture.
A convolutional neural network (CNN) is a regularized type of feed-forward neural network that learns features on its own through filter (or kernel) optimization. Vanishing and exploding gradients, seen during backpropagation in earlier neural networks, are prevented by using regularized weights over fewer connections. For example, each neuron in a fully connected layer would require 10,000 weights to process an image sized 100 × 100 pixels.
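As a rough sketch of why this matters, the arithmetic below compares the per-neuron weight count of a fully connected layer on a 100 × 100 image with a single shared convolution kernel; the 5 × 5 kernel size is an assumed value for illustration.

```python
# Fully connected: every neuron sees every pixel of the 100 x 100 input.
height, width = 100, 100
fc_weights_per_neuron = height * width               # 10,000 weights

# Convolutional: one 5 x 5 kernel is shared across all spatial positions,
# so a filter needs only kernel_size**2 weights (plus one bias).
kernel_size = 5
conv_weights_per_filter = kernel_size * kernel_size  # 25 weights

print(fc_weights_per_neuron, conv_weights_per_filter)  # 10000 25
```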
Dependence logic is a logical formalism, created by Jouko Väänänen, which adds dependence atoms to the language of first-order logic. A dependence atom is an expression of the form =(t1, ..., tn), where t1, ..., tn are terms, and corresponds to the statement that the value of tn is functionally dependent on the values of t1, ..., tn-1. Dependence logic is a logic of imperfect information, like branching quantifier logic or independence-friendly logic (IF logic): in other words, its game-theoretic semantics can be obtained from that of first-order logic by restricting the availability of information to the players, thus allowing for non-linearly ordered patterns of dependence and independence between variables.
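A small sketch of what a dependence atom asserts, using team semantics: a team is a set of variable assignments, and an atom such as =(x, y) holds in a team exactly when any two assignments that agree on x also agree on y. The teams below are made up for illustration.

```python
def dependence_atom_holds(team, xs, y):
    """Check =(xs, y) in a team: assignments agreeing on all of xs agree on y."""
    seen = {}
    for assignment in team:
        key = tuple(assignment[x] for x in xs)
        if key in seen and seen[key] != assignment[y]:
            return False
        seen[key] = assignment[y]
    return True

# y is functionally determined by x in this team ...
team_ok = [{"x": 0, "y": 1}, {"x": 1, "y": 2}, {"x": 0, "y": 1}]
# ... but not in this one: the same x value appears with two different y values.
team_bad = [{"x": 0, "y": 1}, {"x": 0, "y": 2}]

print(dependence_atom_holds(team_ok, ["x"], "y"))   # True
print(dependence_atom_holds(team_bad, ["x"], "y"))  # False
```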
Deep reinforcement learning (deep RL) is a subfield of machine learning that combines reinforcement learning (RL) and deep learning. RL considers the problem of a computational agent learning to make decisions by trial and error. Deep RL incorporates deep learning into the solution, allowing agents to make decisions from unstructured input data without manual engineering of the state space. Deep RL algorithms are able to take in very large inputs (e.g., every pixel rendered to the screen in a video game) and decide what actions to perform to optimize an objective (e.g., maximizing the game score).
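A minimal sketch of the kind of value-based update used in deep RL, assuming PyTorch is available; the network size, the batch of synthetic transitions, and all hyperparameters are illustrative stand-ins, and a real agent would gather transitions from an environment and typically add a replay buffer and a target network.

```python
# A toy DQN-style temporal-difference update on one synthetic batch of transitions.
import torch
import torch.nn as nn

state_dim, n_actions, gamma = 4, 2, 0.99   # illustrative sizes and discount factor

# Small fully connected Q-network mapping a state vector to one Q-value per action.
q_net = nn.Sequential(nn.Linear(state_dim, 32), nn.ReLU(), nn.Linear(32, n_actions))
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)

# Synthetic (state, action, reward, next_state, done) batch standing in for
# transitions an agent would collect by trial and error in an environment.
batch = 8
s = torch.randn(batch, state_dim)
a = torch.randint(0, n_actions, (batch,))
r = torch.randn(batch)
s_next = torch.randn(batch, state_dim)
done = torch.zeros(batch)

# TD target: r + gamma * max_a' Q(s', a'), cut off at terminal states.
with torch.no_grad():
    target = r + gamma * (1 - done) * q_net(s_next).max(dim=1).values

# Regress the Q-value of the action actually taken toward the TD target.
q_sa = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
loss = nn.functional.mse_loss(q_sa, target)

optimizer.zero_grad()
loss.backward()
optimizer.step()
print("TD loss:", loss.item())
```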
The Nordic model comprises the economic and social policies as well as typical cultural practices common in the Nordic countries (Denmark, Finland, Iceland, Norway, and Sweden). This includes a comprehensive welfare state and multi-level collective bargaining based on the economic foundations of social corporatism, and a commitment to private ownership within a market-based mixed economy, with Norway being a partial exception due to a large number of state-owned enterprises and state ownership in publicly listed firms.
Independence-friendly logic (IF logic; proposed by Jaakko Hintikka and Gabriel Sandu in 1989) is an extension of classical first-order logic (FOL) by means of slashed quantifiers of the form (∃v/V) and (∀v/V), where V is a finite set of variables. The intended reading of (∃v/V) is "there is a v which is functionally independent from the variables in V". IF logic allows one to express more general patterns of dependence between variables than those which are implicit in first-order logic.
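As one standard illustration of such a pattern (not drawn from the text above), the IF sentence below, shown in LaTeX, mimics a branching (Henkin) quantifier prefix: the slash makes the choice of w independent of x and y, so that y depends only on x and w only on z, a dependence pattern that no linear ordering of first-order quantifiers can express.

```latex
% IF-logic sentence whose Skolem functions are y = f(x) and w = g(z):
% the slash removes x and y from the information available when w is chosen.
\forall x \, \exists y \, \forall z \, (\exists w / \{x, y\}) \; \varphi(x, y, z, w)
```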