
# Utility

Summary

In economics, utility is used to model worth or value. Its usage has evolved significantly over time. The term was originally introduced as a measure of pleasure or happiness within the theory of utilitarianism by moral philosophers such as Jeremy Bentham and John Stuart Mill. In neoclassical economics, which dominates modern economic theory, the term has been adapted to mean a utility function that represents a consumer's ordinal preferences over a choice set; such a function is not necessarily comparable across consumers, nor does it necessarily carry a cardinal interpretation. This concept of utility is personal and based on choice rather than on pleasure received, and so requires fewer behavioral assumptions than the original concept.
Utility function
Consider a set of alternatives among which a person has a preference ordering. A utility function represents that ordering if it is possible to assign a real number to each alternative such that alternative a is assigned a number greater than alternative b if, and only if, the person prefers alternative a to alternative b.
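As a sketch of this definition, the snippet below checks whether an assignment of real numbers represents a given strict preference ordering; the alternatives, numbers, and function names are illustrative, not part of this page.

```python
# Sketch: checking that a utility function represents a preference ordering.
# All names and values here are illustrative assumptions.

from itertools import combinations

# A strict preference ordering over three alternatives, most preferred first.
ranking = ["apple", "banana", "cherry"]

def prefers(a, b):
    """True if alternative a is strictly preferred to b."""
    return ranking.index(a) < ranking.index(b)

# A candidate utility function: one real number per alternative.
u = {"apple": 3.0, "banana": 2.0, "cherry": 1.0}

def represents(u, alternatives):
    # u represents the ordering iff u(a) > u(b) exactly when a is preferred to b.
    return all(
        (u[a] > u[b]) == prefers(a, b) and (u[b] > u[a]) == prefers(b, a)
        for a, b in combinations(alternatives, 2)
    )

print(represents(u, ranking))  # True

# Ordinality: any strictly increasing transformation preserves the ordering.
u2 = {k: 10 * v + 5 for k, v in u.items()}
print(represents(u2, ranking))  # True
```

Because only the ordering matters, the particular numbers carry no meaning beyond their ranks, which is why such a utility function is called ordinal rather than cardinal.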

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.


Related courses (32)

FIN-406: Macrofinance

This course provides students with a working knowledge of macroeconomic models that explicitly incorporate financial markets. The goal is to develop a broad and analytical framework for analyzing the interaction of financial decisions, macroeconomic events and policy decisions.

MGT-484: Applied probability & stochastic processes

This course focuses on dynamic models of random phenomena, and in particular, the most popular classes of such models: Markov chains and Markov decision processes. We will also study applications in queuing theory, finance, project management, etc.

FIN-609: Asset Pricing

This course provides an overview of the theory of asset pricing and portfolio choice theory following historical developments in the field, putting emphasis on theoretical models that help our understanding of financial decision making and financial markets.

Related people (2)


Related publications (54)


Related concepts (69)

Economics

Economics is a social science that studies the production, distribution, and consumption of goods and services. Economics focuses on the behaviour and interactions of economic agents.

Marginal utility

In economics, utility refers to the satisfaction or benefit that consumers derive from consuming a product or service. Marginal utility, by contrast, describes the change in pleasure or satisfaction that results from consuming one additional unit of a product or service.
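A short sketch of this idea, under the illustrative assumption of the concave utility function u(x) = sqrt(x), shows the classic pattern of diminishing marginal utility:

```python
# Sketch: diminishing marginal utility for the illustrative choice u(x) = sqrt(x).
import math

def u(x):
    return math.sqrt(x)

# Marginal utility of the n-th unit: the extra utility from one more unit.
marginal = [u(n) - u(n - 1) for n in range(1, 6)]
print([round(m, 3) for m in marginal])
# Each additional unit adds less utility than the one before.
```

Any strictly concave utility function would produce the same qualitative pattern; the square root is just a convenient example.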

Risk aversion

In economics and finance, risk aversion is the tendency of people to prefer outcomes with low uncertainty to those outcomes with high uncertainty, even if the average outcome of the latter is equal to or higher in monetary value than the more certain outcome.
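Risk aversion follows from a concave utility function: the expected utility of a gamble is below the utility of its expected value. The sketch below illustrates this with logarithmic utility and an invented 50/50 gamble; the numbers are illustrative assumptions.

```python
# Sketch: risk aversion under the illustrative concave utility u(x) = ln(x).
import math

def u(x):
    return math.log(x)

# A 50/50 gamble between 50 and 150 has expected value 100 ...
expected_value = 0.5 * 50 + 0.5 * 150
# ... but its expected utility is below the utility of 100 for sure:
expected_utility = 0.5 * u(50) + 0.5 * u(150)
print(expected_utility < u(expected_value))  # True for any strictly concave u

# Certainty equivalent: the sure amount with the same utility as the gamble.
certainty_equivalent = math.exp(expected_utility)
print(round(certainty_equivalent, 2))
# Below 100: the agent would accept less than the expected value to avoid risk.
```

The gap between the expected value and the certainty equivalent is the risk premium the agent would pay to avoid the gamble.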

Related units (2)

Related lectures (113)

Dynamic Programming Optimization (DPOP) is an algorithm proposed to solve distributed constraint optimization problems. To represent the multi-valued functions manipulated by this algorithm, a data structure called the Hypercube was implemented. A more efficient data structure, the Utility Diagram, was then proposed as an alternative to the Hypercube. DPOP also required the implementation of several operations (such as join, project, slice, split and reorder) on these two data structures. This project is a follow-up to Nacereddine Ouaret's master's thesis, which consisted of implementing all of these data structures and their associated operations. As DPOP may have to work on very large decision diagrams and perform many successive operations on them, implementations that are efficient in terms of speed and memory are critical. The aim of this project was therefore to seek new ways to improve the already implemented functions for hypercubes and utility diagrams, in terms of both execution time and memory consumption. This report first presents a quick overview of hypercubes, utility diagrams, and their associated operations (a more complete description of these objects, as well as the details of their original implementation, can be found in Nacereddine Ouaret's report on Efficient Data Structures for Decision Diagrams). The second part covers the various improvements made to their implementation during the course of this project. Finally, a variant of the methods used by hypercubes, more economical in terms of memory as it reuses existing hypercubes rather than creating new ones, is presented.
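As a rough illustration of the join and project operations mentioned above (not the project's actual Hypercube or Utility Diagram implementation), a utility table can be sketched as a dictionary from variable assignments to utilities:

```python
# Sketch of two core DPOP operations on utility tables ("hypercubes"):
# join (pointwise addition over the union of variables) and
# project (maximizing a variable out). Representation is illustrative.

from itertools import product

def make_cube(variables, domains, utility_fn):
    """A 'hypercube': ordered variables, domains, and a utility per assignment."""
    table = {}
    for values in product(*(domains[v] for v in variables)):
        table[values] = utility_fn(dict(zip(variables, values)))
    return {"vars": variables, "domains": domains, "table": table}

def join(c1, c2):
    """Pointwise sum over the union of the two cubes' variables."""
    variables = list(dict.fromkeys(c1["vars"] + c2["vars"]))
    domains = {**c1["domains"], **c2["domains"]}
    def combined(assign):
        v1 = c1["table"][tuple(assign[v] for v in c1["vars"])]
        v2 = c2["table"][tuple(assign[v] for v in c2["vars"])]
        return v1 + v2
    return make_cube(variables, domains, combined)

def project(cube, var):
    """Maximize `var` out of the cube, keeping the best utility per assignment."""
    variables = [v for v in cube["vars"] if v != var]
    domains = {v: cube["domains"][v] for v in variables}
    def best(assign):
        return max(
            cube["table"][tuple(x if v == var else assign[v] for v in cube["vars"])]
            for x in cube["domains"][var]
        )
    return make_cube(variables, domains, best)

# Two binary constraints sharing variable "b":
c1 = make_cube(["a", "b"], {"a": [0, 1], "b": [0, 1]}, lambda s: s["a"] + s["b"])
c2 = make_cube(["b", "c"], {"b": [0, 1], "c": [0, 1]}, lambda s: 2 * s["b"] * s["c"])
joined = join(c1, c2)       # cube over a, b, c
msg = project(joined, "c")  # message with c maximized out
print(msg["table"])         # {(0, 0): 0, (0, 1): 3, (1, 0): 1, (1, 1): 4}
```

This naive table enumerates every assignment, which is exactly why the report's efficiency concerns arise: hypercubes grow exponentially in the number of variables, motivating compressed structures such as the Utility Diagram.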

2009

Michael Ingram, Mario Paolone, Panagiotis Papadopoulos

Nowadays, transmission system operators require a higher degree of real-time observability to gain situational awareness and improve decision-making, so as to guarantee safe and reliable operation. The digitalization of energy systems allows utilities to monitor the system's dynamic performance in real time at fast time scales. The use of such technologies has unlocked new opportunities to introduce data-driven algorithms for improving the stability assessment and control of the system. Motivated by these challenges, a group of experts worked together to highlight and establish a baseline set of these common concerns, which can be used as motivation to propose innovative analytics and data-driven solutions. This document presents the results of a survey of 10 transmission system operators around the world, which aims to understand the current practices of the participating companies in terms of data acquisition, handling, storage, modelling, and analytics. The overall objective of this document is to capture the actual needs of the interviewed utilities, thereby laying the groundwork for setting valid assumptions for the development of advanced algorithms in this field.


We study supervised learning problems for predicting properties of individuals who belong to one of two demographic groups, and we seek predictors that are fair according to statistical parity. This means that the distributions of the predictions within the two groups should be close with respect to the Kolmogorov distance, and fairness is achieved by penalizing the dissimilarity of these two distributions in the objective function of the learning problem. In this paper, we showcase conceptual and computational benefits of measuring unfairness with integral probability metrics (IPMs) other than the Kolmogorov distance. Conceptually, we show that the generator of any IPM can be interpreted as a family of utility functions and that unfairness with respect to this IPM arises if individuals in the two demographic groups have diverging expected utilities. We also prove that the unfairness-regularized prediction loss admits unbiased gradient estimators if unfairness is measured by the squared L2-distance or by a squared maximum mean discrepancy. In this case the fair learning problem is amenable to efficient stochastic gradient descent (SGD) algorithms. Numerical experiments on real data show that these SGD algorithms outperform state-of-the-art methods for fair learning in that they achieve superior accuracy-unfairness trade-offs, sometimes orders of magnitude faster. Finally, we identify conditions under which statistical parity can improve prediction accuracy.

2022
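As an illustration of one of the dissimilarity measures discussed in the abstract above, the sketch below computes a plain (biased, V-statistic) estimate of the squared maximum mean discrepancy between the predictions of two groups; the Gaussian kernel, bandwidth, and synthetic data are illustrative choices, not those of the paper.

```python
# Sketch: squared maximum mean discrepancy (MMD) between the predictions of
# two demographic groups, usable as a fairness penalty. Kernel, bandwidth,
# and data are illustrative assumptions.

import math
import random

def gaussian_kernel(x, y, bandwidth=1.0):
    return math.exp(-((x - y) ** 2) / (2 * bandwidth ** 2))

def squared_mmd(preds_a, preds_b, bandwidth=1.0):
    """Biased (V-statistic) estimate of MMD^2 between two prediction samples."""
    k = lambda x, y: gaussian_kernel(x, y, bandwidth)
    def mean_kernel(xs, ys):
        return sum(k(x, y) for x in xs for y in ys) / (len(xs) * len(ys))
    return (mean_kernel(preds_a, preds_a) + mean_kernel(preds_b, preds_b)
            - 2 * mean_kernel(preds_a, preds_b))

random.seed(0)
same = [random.gauss(0, 1) for _ in range(200)]
shifted = [x + 2 for x in same]

# Identical prediction distributions give a zero penalty;
# a shifted group distribution gives a clearly positive one.
print(round(squared_mmd(same, same), 6))  # 0.0
print(squared_mmd(same, shifted) > 0.1)   # True
```

In a learning setup, a term like this would be added to the prediction loss with a regularization weight, penalizing predictors whose output distributions differ across the two groups.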