Hamiltonian (control theory)

Summary
The Hamiltonian is a function used to solve a problem of optimal control for a dynamical system. It can be understood as the instantaneous increment of the Lagrangian expression of the problem that is to be optimized over a certain time period. Inspired by, but distinct from, the Hamiltonian of classical mechanics, the Hamiltonian of optimal control theory was developed by Lev Pontryagin as part of his maximum principle. Pontryagin proved that a necessary condition for solving the optimal control problem is that the control should be chosen so as to optimize the Hamiltonian.

Problem statement and definition of the Hamiltonian

Consider a dynamical system of n first-order differential equations

\dot{\mathbf{x}}(t) = \mathbf{f}(\mathbf{x}(t), \mathbf{u}(t), t)

where \mathbf{x}(t) = \left[ x_{1}(t), x_{2}(t), \ldots, x_{n}(t) \right]^{\mathsf{T}} denotes the vector of state variables and \mathbf{u}(t) = \left[ u_{1}(t), u_{2}(t), \ldots, u_{r}(t) \right]^{\mathsf{T}} the vector of control variables.
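As a minimal sketch of how the Hamiltonian is used in practice, consider the hypothetical problem (not taken from the source) of minimizing J = ∫₀¹ u(t)² dt subject to x'(t) = u(t), x(0) = 0, x(1) = 1. The Hamiltonian is H(x, u, λ) = u² + λu; the stationarity condition ∂H/∂u = 0 gives u = -λ/2, and the costate equation λ' = -∂H/∂x = 0 makes λ constant, so a simple shooting loop on the initial costate recovers the optimal control:

```python
# Hypothetical worked example: Pontryagin's maximum principle for
#   minimize J = integral of u(t)^2 over [0, 1]
#   subject to x'(t) = u(t), x(0) = 0, x(1) = 1.
# Hamiltonian: H(x, u, lam) = u**2 + lam*u (running cost + costate * dynamics).
# dH/du = 2u + lam = 0  =>  u = -lam/2;  lam' = -dH/dx = 0  =>  lam is constant.

def shoot(lam0, steps=1000):
    """Integrate x' = u = -lam0/2 forward with Euler steps; return x(1)."""
    x, dt = 0.0, 1.0 / steps
    for _ in range(steps):
        u = -lam0 / 2.0      # control that minimizes the Hamiltonian
        x += u * dt          # Euler step of the state equation
    return x

def solve(target=1.0, tol=1e-10):
    """Bisect on the initial costate so the terminal state hits the target."""
    lo, hi = -10.0, 10.0     # shoot(lo) > target > shoot(hi); shoot is decreasing
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if shoot(mid) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = solve()
u_opt = -lam / 2.0           # optimal control implied by dH/du = 0
print(lam, u_opt)            # lam close to -2, u_opt close to 1
```

The analytic answer for this toy problem is λ = -2 and u*(t) ≡ 1, which the shooting loop reproduces numerically; the same pattern (minimize H over u, integrate state forward and costate backward) carries over to problems where no closed form exists.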