
Lecture: Optimal Control: NMPC

Description

This lecture covers Nonlinear Model Predictive Control (NMPC), with topics including setpoint stabilization, terminal constraints, different NMPC formulations, turnpikes, dissipativity, economic NMPC, and Pontryagin's Maximum Principle. The instructor discusses the principle of optimality, singular problems, and stability of NMPC schemes.
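
The receding-horizon idea behind NMPC can be sketched in a few lines: at each step, solve a finite-horizon optimal control problem from the current state and apply only the first input. The dynamics, costs, input grid, and setpoint below are illustrative assumptions, not taken from the lecture; real NMPC solvers use gradient-based NLP methods rather than the brute-force search used here.

```python
# Minimal receding-horizon (NMPC) sketch for a scalar nonlinear system.
# All numbers and model choices are hypothetical, for illustration only.
import itertools

def f(x, u):
    # nonlinear dynamics: x+ = x + 0.1 * (-x^3 + u)
    return x + 0.1 * (-x**3 + u)

def horizon_cost(x, us, x_ref=1.0):
    # quadratic stage cost plus a terminal penalty, which stands in
    # for the terminal constraint/cost used to guarantee stability
    J = 0.0
    for u in us:
        J += (x - x_ref) ** 2 + 0.1 * u ** 2
        x = f(x, u)
    return J + 10.0 * (x - x_ref) ** 2  # terminal penalty

def nmpc_step(x, horizon=4, grid=(-2.0, -1.0, 0.0, 1.0, 2.0)):
    # brute-force the finite-horizon OCP over a coarse input grid,
    # then apply only the first input (receding-horizon principle)
    best = min(itertools.product(grid, repeat=horizon),
               key=lambda us: horizon_cost(x, us))
    return best[0]

x = 0.0
for _ in range(30):
    x = f(x, nmpc_step(x))  # state is driven toward the setpoint
```

The terminal penalty here plays the role that terminal constraints and terminal costs play in the stability arguments the lecture covers.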


In course

EE-715: Optimal control

This doctoral course provides an introduction to optimal control covering fundamental theory, numerical implementation and problem formulation for applications.

Instructors (2)

Related concepts (95)

Related lectures (67)

Stochastic control

Stochastic control or stochastic optimal control is a subfield of control theory that deals with the existence of uncertainty either in observations or in the noise that drives the evolution of the system. The system designer assumes, in a Bayesian probability-driven fashion, that random noise with a known probability distribution affects the evolution and observation of the state variables. Stochastic control aims to design the time path of the controlled variables that performs the desired control task with minimum cost, suitably defined, despite the presence of this noise.
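
A toy version of this setup, with illustrative numbers not taken from the course: a scalar linear system driven by Gaussian noise with a known distribution, where a feedback law is judged by its expected quadratic cost, estimated here by Monte Carlo.

```python
# Stochastic control sketch: x+ = 0.9 x + u + w, w ~ N(0, 0.1^2).
# All parameters are hypothetical, chosen for illustration.
import random

def expected_cost(K, steps=50, trials=200, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = 1.0
        for _ in range(steps):
            u = -K * x                    # linear state feedback
            w = rng.gauss(0.0, 0.1)       # known noise distribution
            total += x * x + 0.1 * u * u  # quadratic stage cost
            x = 0.9 * x + u + w
    return total / trials

cost_fb = expected_cost(K=0.5)    # with feedback
cost_open = expected_cost(K=0.0)  # no control
```

Because the noise keeps perturbing the state, the feedback law cannot drive the cost to zero, but it achieves a much lower expected cost than leaving the system uncontrolled.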

Model predictive control

Model predictive control (MPC) is an advanced method of process control that is used to control a process while satisfying a set of constraints. It has been in use in the process industries in chemical plants and oil refineries since the 1980s. In recent years it has also been used in power system balancing models and in power electronics. Model predictive controllers rely on dynamic models of the process, most often linear empirical models obtained by system identification.
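
The core mechanism can be sketched with a scalar linear model (illustrative numbers, not a production controller): at each step a finite-horizon linear-quadratic problem is solved by a backward Riccati recursion and only the first input is applied. Industrial MPC additionally enforces input and state constraints by solving a quadratic program instead.

```python
# Receding-horizon sketch for a scalar linear model x+ = a x + b u.
# Parameters are hypothetical; the plant here is open-loop unstable (a > 1).
def first_lq_input(x, a=1.2, b=1.0, q=1.0, r=0.1, N=10):
    p = q                # terminal cost weight P_N = q
    gains = []
    for _ in range(N):   # backward recursion over the horizon
        k = (b * p * a) / (r + b * p * b)   # stage gain
        p = q + a * p * a - a * p * b * k   # Riccati update
        gains.append(k)
    return -gains[-1] * x  # last computed gain belongs to the first stage

x = 5.0
for _ in range(20):
    u = first_lq_input(x)  # re-solve the horizon problem every step
    x = 1.2 * x + u
```

Re-solving the horizon problem at every step is what makes this "predictive" control rather than a fixed precomputed feedback law.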

Optimal control

Optimal control theory is a branch of mathematical optimization that deals with finding a control for a dynamical system over a period of time such that an objective function is optimized. It has numerous applications in science, engineering and operations research. For example, the dynamical system might be a spacecraft with controls corresponding to rocket thrusters, and the objective might be to reach the moon with minimum fuel expenditure.

Bang–bang control

In control theory, a bang–bang controller (also known as a hysteresis, two-step, or on–off controller) is a feedback controller that switches abruptly between two states. These controllers may be realized in terms of any element that provides hysteresis. They are often used to control a plant that accepts a binary input, for example a furnace that is either completely on or completely off. Most common residential thermostats are bang–bang controllers. The Heaviside step function in its discrete form is an example of a bang–bang control signal.
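
The thermostat case can be sketched directly (all numbers are illustrative): the furnace is either fully on or fully off, and hysteresis is provided by two switching thresholds around the setpoint, which prevents rapid on/off chattering.

```python
# Bang-bang thermostat sketch with hysteresis (hypothetical numbers):
# switch fully on below 19.5, fully off above 20.5.
def simulate(steps=500, lo=19.5, hi=20.5):
    temp, furnace = 15.0, False
    history = []
    for _ in range(steps):
        if temp < lo:
            furnace = True     # binary input: fully on
        elif temp > hi:
            furnace = False    # binary input: fully off
        heat = 1.0 if furnace else 0.0
        # simple heat balance: furnace heat vs. loss to a 10-degree ambient
        temp += 0.1 * (heat * 30.0 - (temp - 10.0))
        history.append(temp)
    return history

temps = simulate()  # settles into a limit cycle around the setpoint
```

The temperature never converges to a single value; it oscillates in a band around the setpoint, which is the characteristic behavior of bang–bang control with hysteresis.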

Control theory

Control theory is a field of control engineering and applied mathematics that deals with the control of dynamical systems in engineered processes and machines. The objective is to develop a model or algorithm governing the application of system inputs to drive the system to a desired state, while minimizing any delay, overshoot, or steady-state error and ensuring a level of control stability, often with the aim of achieving a degree of optimality. To do this, a controller with the requisite corrective behavior is required.
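
The steady-state-error notion can be made concrete with a first-order plant (illustrative model and gains): pure proportional feedback leaves a persistent offset from the setpoint, while adding integral action drives that offset to zero.

```python
# Feedback sketch (hypothetical plant dx/dt = -x + u, Euler-discretized):
# compare proportional-only control with proportional-integral control.
def simulate(kp, ki=0.0, steps=300, setpoint=1.0, dt=0.1):
    x, integ = 0.0, 0.0
    for _ in range(steps):
        err = setpoint - x
        integ += err * dt
        u = kp * err + ki * integ   # PI control law
        x += dt * (-x + u)          # plant update
    return x

x_p = simulate(kp=2.0)             # steady-state error of 1/3 remains
x_pi = simulate(kp=2.0, ki=1.0)    # integral action removes the offset
```

For this plant the proportional-only steady state satisfies x = 2(1 - x), i.e. x = 2/3, so a third of the setpoint is never reached without integral action.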

Nonlinear Model Predictive Control (EE-715: Optimal control)

Explores Nonlinear Model Predictive Control, covering stability, optimality, pitfalls, and examples.

Optimal Control: OCPs (EE-715: Optimal control)

Covers Optimal Control Problems focusing on necessary conditions, existence of optimal controls, and numerical solutions.

Optimal Control Theory: Basics (EE-715: Optimal control)

Covers the fundamentals of optimal control theory, focusing on defining OCPs, existence of solutions, performance criteria, physical constraints, and the principle of optimality.

Linear Quadratic (LQ) Optimal Control: Proof of Theorem (ME-422: Multivariable control)

Covers the proof of the recursive formula for the optimal gains in LQ control over a finite horizon.

Optimal Control Theory: OCPs (EE-715: Optimal control)

Covers Optimal Control Theory, focusing on Optimal Control Problems (OCPs) and the calculus of variations.