# Stochastic control

Summary

Stochastic control or stochastic optimal control is a subfield of control theory that deals with the existence of uncertainty either in observations or in the noise that drives the evolution of the system. The system designer assumes, in a Bayesian probability-driven fashion, that random noise with a known probability distribution affects the evolution and observation of the state variables. Stochastic control aims to design the time path of the controlled variables that performs the desired control task with minimum cost, suitably defined, despite the presence of this noise. The context may be either discrete time or continuous time.
An extremely well-studied formulation in stochastic control is that of linear quadratic Gaussian control. Here the model is linear, the objective function is the expected value of a quadratic form, and the disturbances are purely additive. A basic result for discrete-time centralized systems with only additive uncertainty is the certainty equivalence property: that the optimal control solution in this case is the same as would be obtained in the absence of the additive disturbances. This property is applicable to all centralized systems with linear equations of evolution, quadratic cost function, and noise entering the model only additively; the quadratic assumption allows for the optimal control laws, which follow the certainty-equivalence property, to be linear functions of the observations of the controllers.
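The certainty-equivalence property can be illustrated with a short numerical sketch (not part of this page's sources; the system matrices below are illustrative values). For the linear model x_{t+1} = A x_t + B u_t + w_t with quadratic stage cost x'Qx + u'Ru, the finite-horizon optimal gains come from a backward Riccati recursion, and the covariance of the additive noise w_t never enters the computation:

```python
import numpy as np

def lqr_gains(A, B, Q, R, horizon):
    """Finite-horizon discrete-time LQR gains via backward Riccati recursion.

    The gains K_t are independent of the covariance of any additive noise
    w_t in x_{t+1} = A x_t + B u_t + w_t: this is the certainty-equivalence
    property, since the same gains solve the noise-free problem.
    """
    P = Q.copy()
    gains = []
    for _ in range(horizon):
        # K = (R + B' P B)^{-1} B' P A
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]  # ordered t = 0 .. horizon-1

# Illustrative double-integrator-like system
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])
Ks = lqr_gains(A, B, Q, R, horizon=20)
# The optimal control is u_t = -K_t x_t (or -K_t x_hat_t with noisy
# observations); the noise statistics never appear in the gain computation.
```

Note that the recursion is run backward from the terminal cost, so the earliest gains have effectively converged to the stationary (infinite-horizon) gain for a sufficiently long horizon.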
Any deviation from the above assumptions—a nonlinear state equation, a non-quadratic objective function, noise in the multiplicative parameters of the model, or decentralization of control—causes the certainty equivalence property not to hold. For example, its failure to hold for decentralized control was demonstrated in Witsenhausen's counterexample.
In a discrete-time context, the decision-maker observes the state variable, possibly with observational noise, in each time period.
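When the state is observed through noise, the controller typically acts on a state estimate rather than the state itself; under the standard linear-Gaussian assumptions that estimate is produced by a Kalman filter. A minimal sketch of one predict/update step (all model matrices below are illustrative, not taken from this page's sources):

```python
import numpy as np

def kalman_step(x_hat, P, u, y, A, B, C, W, V):
    """One predict/update step of a Kalman filter for the model
    x_{t+1} = A x_t + B u_t + w_t,   y_t = C x_t + v_t,
    with process-noise covariance W and observation-noise covariance V."""
    # Predict the next state and its error covariance
    x_pred = A @ x_hat + B @ u
    P_pred = A @ P @ A.T + W
    # Update using the noisy observation y of the new state
    S = C @ P_pred @ C.T + V               # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x_hat)) - K @ C) @ P_pred
    return x_new, P_new

# Example: a two-state random walk observed directly through noise
x_hat, P = np.zeros(2), np.eye(2)
A, C = np.eye(2), np.eye(2)
B = np.zeros((2, 1))
W, V = 0.1 * np.eye(2), 0.1 * np.eye(2)
x_hat, P = kalman_step(x_hat, P, np.zeros(1), np.array([1.0, 0.0]),
                       A, B, C, W, V)
# The estimate moves toward the observation and the uncertainty P shrinks.
```

In the LQG setting, feeding this estimate into the certainty-equivalent gains (the separation principle) yields the optimal controller.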

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

ME-422: Multivariable control

This course covers methods for the analysis and control of systems with multiple inputs and outputs, which are ubiquitous in modern technology and industry. Special emphasis will be given to discrete-

ME-324: Discrete-time control of dynamical systems

This course introduces the basics of discrete-time linear control, which consists of applying a control input over uniformly spaced intervals. The associated sampling rate plays a role

FIN-615: Dynamic Asset Pricing

This course provides an advanced introduction to the methods and results of continuous time asset pricing

Optimal Control: NMPC

Covers Nonlinear Model Predictive Control (NMPC) principles, including setpoint stabilization and Pontryagin's Maximum Principle.

Model predictive control with multi-region MFDs

Covers model predictive control for multi-region Macroscopic Fundamental Diagrams in traffic flow modeling and its application in handling non-linear control problems.

Planetary Boundaries: Novel Entities

Delves into planetary boundaries, emphasizing novel entities' impact and the need for precautionary control variables.

Control theory

Control theory is a field of control engineering and applied mathematics that deals with the control of dynamical systems in engineered processes and machines. The objective is to develop a model or algorithm governing the application of system inputs to drive the system to a desired state, while minimizing any delay, overshoot, or steady-state error and ensuring a level of control stability, often with the aim of achieving a degree of optimality. To do this, a controller with the requisite corrective behavior is required.
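The feedback idea described above can be sketched in a few lines: a proportional controller drives a scalar plant toward a desired state by applying an input proportional to the current error (the plant and gain here are purely illustrative):

```python
# Minimal illustration of feedback control: a scalar plant
# x_{t+1} = x_t + u_t driven to a target by proportional feedback.
target = 1.0
k = 0.5          # proportional gain (illustrative value)
x = 0.0          # initial state
for _ in range(20):
    u = k * (target - x)   # corrective input proportional to the error
    x = x + u
# For 0 < k < 1 the error shrinks by a factor (1 - k) each step, so x
# converges to the target without overshoot.
```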

Intro to Traffic Flow Modeling and Intelligent Transport Systems

Learn how to describe, model and control urban traffic congestion in simple ways and gain insight into advanced traffic management schemes that improve mobility in cities and highways.

Robert Dalang, Laura Vinckenbosch

We solve two stochastic control problems in which a player tries to minimize or maximize the exit time from an interval of a Brownian particle, by controlling its drift. The player can change from one

Alexandre Massoud Alahi, Mohammadhossein Bahari, Ismail Nejjar

Vehicle trajectory prediction tasks have been commonly tackled from two distinct perspectives: either with knowledge-driven methods or more recently with data-driven ones. On the one hand, we can expl

2021

The topic of this thesis is the study of several stochastic control problems motivated by sailing races. The goal is to minimize the travel time between two locations, by selecting the fastest route i