
Category: Experimental economics

Summary

Experimental economics is the application of experimental methods to the study of economic questions. Data collected in experiments are used to estimate effect sizes, test the validity of economic theories, and illuminate market mechanisms. Economic experiments usually use cash to motivate subjects, in order to mimic real-world incentives. Experiments are used to help understand how and why markets and other exchange systems function as they do. Experimental economics has also expanded to the study of institutions and the law (experimental law and economics).
A fundamental aspect of the subject is design of experiments. Experiments may be conducted in the field or in laboratory settings, whether of individual or group behavior.
Variants of the subject outside such formal confines include natural and quasi-natural experiments.
One can loosely classify economic experiments using the following topics:
Markets
Games
Evolutionary game theory
Decision making
Bargaining
Contracts
Auctions
Coordination
Social preferences
Learning
Matching
Field experiments
Within economics education, one application involves experiments used in the teaching of economics. An alternative approach with experimental dimensions is agent-based computational modeling. When using games in this way, it is important to consider both their potential and their limitations for understanding rational behavior and resolving human conflict.
Coordination games are games with multiple pure strategy Nash equilibria. There are two general sets of questions that experimental economists typically ask when examining such games: (1) Can laboratory subjects coordinate, or learn to coordinate, on one of multiple equilibria, and if so are there general principles that can help predict which equilibrium is likely to be chosen? (2) Can laboratory subjects coordinate, or learn to coordinate, on the Pareto best equilibrium and if not, are there conditions or mechanisms which would help subjects coordinate on the Pareto best equilibrium? Deductive selection principles are those that allow predictions based on the properties of the game alone.
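The first question above can be made concrete with a small computation. The sketch below, using a hypothetical stag-hunt payoff matrix (the numbers are illustrative, not from any particular experiment), enumerates strategy profiles and keeps those where neither player can gain by a unilateral deviation, i.e. the pure-strategy Nash equilibria:

```python
# Hypothetical payoff matrix for a 2x2 stag-hunt coordination game.
# payoffs[(row, col)] = (row player's payoff, column player's payoff)
payoffs = {
    ("stag", "stag"): (4, 4),   # Pareto-best outcome
    ("stag", "hare"): (0, 3),
    ("hare", "stag"): (3, 0),
    ("hare", "hare"): (3, 3),   # safer, risk-dominant outcome
}
strategies = ["stag", "hare"]

def pure_nash_equilibria(payoffs, strategies):
    """Return strategy profiles where neither player gains by deviating."""
    equilibria = []
    for r in strategies:
        for c in strategies:
            u_r, u_c = payoffs[(r, c)]
            row_best = all(payoffs[(alt, c)][0] <= u_r for alt in strategies)
            col_best = all(payoffs[(r, alt)][1] <= u_c for alt in strategies)
            if row_best and col_best:
                equilibria.append((r, c))
    return equilibria

print(pure_nash_equilibria(payoffs, strategies))
# → [('stag', 'stag'), ('hare', 'hare')]
```

Both (stag, stag) and (hare, hare) are equilibria, which is exactly why the experimental question of *which* one subjects coordinate on is interesting: the deductive principles alone do not single one out.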

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

Related people (1)

Related concepts (3)

Related publications (6)

Related courses (2)

Related lectures (4)

Related categories (9)

Endogeneity (econometrics)

In econometrics, endogeneity broadly refers to situations in which an explanatory variable is correlated with the error term. The distinction between endogenous and exogenous variables originated in simultaneous equations models, where one separates variables whose values are determined by the model from variables which are predetermined; ignoring simultaneity in the estimation leads to biased estimates as it violates the exogeneity assumption of the Gauss–Markov theorem.
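The bias from endogeneity is easy to demonstrate by simulation. The minimal sketch below (with made-up coefficients) generates a regressor that is correlated with the structural error and shows that the ordinary least squares slope overshoots the true coefficient:

```python
import random

# Simulation sketch: x is endogenous because it is built partly from the
# structural error u, so cov(x, u) > 0 and OLS is biased upward.
random.seed(0)
true_beta = 2.0
n = 10_000

xs, ys = [], []
for _ in range(n):
    u = random.gauss(0, 1)            # structural error term
    x = 0.8 * u + random.gauss(0, 1)  # regressor correlated with u
    xs.append(x)
    ys.append(true_beta * x + u)

# OLS slope estimate: cov(x, y) / var(x)
mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
var = sum((x - mx) ** 2 for x in xs) / n
beta_hat = cov / var
print(beta_hat)  # noticeably above the true value of 2.0
```

In large samples the estimate converges to true_beta + cov(x, u)/var(x), roughly 2.49 here rather than 2.0, which is the sense in which ignoring endogeneity yields biased estimates.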

Simultaneous equations model

Simultaneous equations models are a type of statistical model in which the dependent variables are functions of other dependent variables, rather than just independent variables. This means some of the explanatory variables are jointly determined with the dependent variable, which in economics usually is the consequence of some underlying equilibrium mechanism. Take the typical supply and demand model: whilst typically one would determine the quantity supplied and demanded to be a function of the price set by the market, it is also possible for the reverse to be true, where producers observe the quantity that consumers demand and then set the price.
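A stripped-down version of that supply and demand system can be solved directly. The sketch below, with illustrative coefficients, treats demand as q = a − b·p and supply as q = c + d·p and finds the price at which the two quantities coincide:

```python
# Hypothetical linear supply and demand curves:
#   demand: q = a - b * p
#   supply: q = c + d * p
# Equilibrium: a - b*p = c + d*p, so p = (a - c) / (b + d).

def equilibrium(a, b, c, d):
    """Solve the two simultaneous equations for (price, quantity)."""
    p = (a - c) / (b + d)
    q = a - b * p
    return p, q

p, q = equilibrium(a=100, b=2, c=10, d=3)
print(p, q)  # → 18.0 64.0
```

Price and quantity are determined jointly here, which is why regressing one on the other without accounting for the system produces the endogeneity problem described above.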

Parameter identification problem

In economics and econometrics, the parameter identification problem arises when the value of one or more parameters in an economic model cannot be determined from observable variables. It is closely related to non-identifiability in statistics and econometrics, which occurs when a statistical model has more than one set of parameters that generate the same distribution of observations, meaning that multiple parameterizations are observationally equivalent.
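Observational equivalence can be illustrated with a toy model (entirely hypothetical): if the model only ever uses the product of two parameters, distinct parameter pairs generate identical observations and cannot be told apart from data:

```python
# Toy non-identified model: predictions depend only on the product a * b,
# so (a, b) = (2, 3) and (a, b) = (1, 6) are observationally equivalent.

def predict(a, b, xs):
    return [a * b * x for x in xs]

xs = [0.0, 1.0, 2.5]
print(predict(2, 3, xs) == predict(1, 6, xs))  # → True
```

Any data set generated by this model pins down the product a·b but not a and b separately, which is the identification problem in miniature.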

ME-454: Modelling and optimization of energy systems

The goal of the lecture is to present and apply techniques for the modelling and the thermo-economic optimisation of industrial process and energy systems. The lecture covers the problem statement, th ...

MGT-581: Introduction to econometrics

The course provides an introduction to econometrics. The objective is to learn how to make valid (i.e., causal) inference from economic and social data. It explains the main estimators and present met ...

Generalized Method of Moments (GMM), FIN-403: Econometrics

Introduces the Generalized Method of Moments (GMM), a versatile approach for estimation based on moment restrictions, with applications in asset pricing models.

Instrumental Variables: Part 1, FIN-403: Econometrics

Introduces instrumental variables to address endogeneity issues, using examples to illustrate practical applications and testing requirements.

Generalized Method of Moments (GMM), FIN-403: Econometrics

Introduces the Generalized Method of Moments (GMM) in econometrics, focusing on its application in instrumental variable estimation and asset pricing models.

Regression analysis

In statistical modeling, regression analysis is a set of statistical processes for estimating the relationships between a dependent variable (often called the 'outcome' or 'response' variable, or a 'label' in machine learning parlance) and one or more independent variables (often called 'predictors', 'covariates', 'explanatory variables' or 'features'). The most common form of regression analysis is linear regression, in which one finds the line (or a more complex linear combination) that most closely fits the data according to a specific mathematical criterion.
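For the simple one-variable case, the least-squares line can be computed in a few lines. The sketch below uses hypothetical data lying exactly on y = 2x + 1, so the fit recovers those coefficients:

```python
# Minimal ordinary least-squares line fit: choose slope and intercept to
# minimize the sum of squared residuals.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]        # hypothetical data: exactly y = 2x + 1
print(fit_line(xs, ys))  # → (2.0, 1.0)
```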

Time series

In mathematics, a time series is a series of data points indexed (or listed or graphed) in time order. Most commonly, a time series is a sequence taken at successive equally spaced points in time. Thus it is a sequence of discrete-time data. Examples of time series are heights of ocean tides, counts of sunspots, and the daily closing value of the Dow Jones Industrial Average. A time series is very frequently plotted via a run chart (which is a temporal line chart).
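A typical operation on such a sequence is smoothing before plotting. The sketch below computes a trailing moving average over hypothetical daily closing values (the window size and numbers are illustrative):

```python
# Trailing moving average: the mean of each run of `window` consecutive
# observations, a common smoothing step for a run chart.

def moving_average(series, window):
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

daily_closes = [10.0, 12.0, 11.0, 13.0, 15.0]  # hypothetical daily values
print(moving_average(daily_closes, window=3))  # → [11.0, 12.0, 13.0]
```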

Data analysis

Data analysis is the process of inspecting, cleansing, transforming, and modeling data with the goal of discovering useful information, informing conclusions, and supporting decision-making. Data analysis has multiple facets and approaches, encompassing diverse techniques under a variety of names, and is used in different business, science, and social science domains. In today's business world, data analysis plays a role in making decisions more scientific and helping businesses operate more effectively.

Identification of kinetic models is an important task for monitoring, control and optimization of industrial processes. Robust kinetic models are often based on first principles, which describe the evolution of states – number of moles, temperature and vol ...

2013, Dominique Bonvin, Julien Léo Billeter, Sriniketh Srinivasan

Identification of kinetic models is an important task for monitoring, control and optimization of industrial processes. Kinetic models are often based on first principles, which describe the evolution of the states – numbers of moles, temperature and volum ...

2013, Dominique Bonvin, Julien Léo Billeter, Sriniketh Srinivasan

Kinetic models contribute greatly to cost reduction during the process development phase and are also helpful for process monitoring and control purposes. Kinetic models describe the underlying reactions, mass transport and operating conditions of the reac ...

2013