Costate equation

The costate equation is related to the state equation used in optimal control. It is also referred to as the auxiliary, adjoint, influence, or multiplier equation. It is stated as a vector of first-order differential equations whose right-hand side is the vector of partial derivatives of the negative of the Hamiltonian with respect to the state variables. The costate variables can be interpreted as Lagrange multipliers associated with the state equations.
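In conventional notation (not given in the text above), with state $x$, control $u$, costate $\lambda$, and Hamiltonian $H$, the equation reads roughly as:

\[
\dot{\lambda}(t) = -\frac{\partial H}{\partial x}\bigl(x(t), u(t), \lambda(t), t\bigr).
\]

Each component of $\dot{\lambda}$ is the negative partial derivative of $H$ with respect to the corresponding state variable, matching the description above.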
Hessian matrix

In mathematics, the Hessian matrix, Hessian, or (less commonly) Hesse matrix is a square matrix of second-order partial derivatives of a scalar-valued function, or scalar field. It describes the local curvature of a function of many variables. The Hessian matrix was developed in the 19th century by the German mathematician Ludwig Otto Hesse and later named after him. Hesse originally used the term "functional determinants".
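A minimal sketch of the definition, for a twice-differentiable scalar field $f(x_1, \dots, x_n)$ (notation assumed, not taken from the text):

\[
(\mathbf{H}_f)_{ij} = \frac{\partial^2 f}{\partial x_i \, \partial x_j}, \qquad i, j = 1, \dots, n.
\]

When the second partial derivatives are continuous, this matrix is symmetric.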
Constrained optimization

In mathematical optimization, constrained optimization (in some contexts called constraint optimization) is the process of optimizing an objective function with respect to some variables in the presence of constraints on those variables. The objective function is either a cost function or energy function, which is to be minimized, or a reward function or utility function, which is to be maximized.
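One common way to write such a problem, with illustrative symbols $f$ for the objective and $g_i$, $h_j$ for the constraints:

\[
\min_{x} \; f(x) \quad \text{subject to} \quad g_i(x) \le 0 \;(i = 1, \dots, m), \qquad h_j(x) = 0 \;(j = 1, \dots, p).
\]

A maximization problem fits the same template with $\max$ in place of $\min$.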
Nonlinear programming

In mathematics, nonlinear programming (NLP) is the process of solving an optimization problem in which some of the constraints or the objective function are nonlinear. An optimization problem is the calculation of the extrema (maxima, minima, or stationary points) of an objective function over a set of unknown real variables, subject to a system of equalities and inequalities, collectively termed constraints. It is the subfield of mathematical optimization that deals with problems that are not linear.
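As a small illustrative instance (not taken from the text), the following problem is nonlinear because of its quadratic constraint:

\[
\max \; x_1 + x_2 \quad \text{subject to} \quad x_1^2 + x_2^2 \le 1,
\]

whose optimum is attained at $x_1 = x_2 = 1/\sqrt{2}$ with value $\sqrt{2}$.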
Derivative test

In calculus, a derivative test uses the derivatives of a function to locate its critical points and to determine whether each such point is a local maximum, a local minimum, or a saddle point. Derivative tests can also give information about the concavity of a function. The usefulness of derivatives for finding extrema is proved mathematically by Fermat's theorem on stationary points. The first-derivative test examines a function's monotonic properties (where the function is increasing or decreasing), focusing on a particular point in its domain.
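As a sketch of the standard one-variable tests (notation assumed): the first-derivative test checks whether $f'$ changes sign at a critical point $c$, while the second-derivative test uses

\[
f'(c) = 0,\ f''(c) > 0 \;\Rightarrow\; \text{local minimum at } c, \qquad f'(c) = 0,\ f''(c) < 0 \;\Rightarrow\; \text{local maximum at } c,
\]

with the case $f''(c) = 0$ left inconclusive.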
Critical point (mathematics)

Critical point is a term used in many branches of mathematics. For a function of a real variable, a critical point is a point in the domain of the function where the function is either not differentiable or its derivative is equal to zero. For a function of a complex variable, a critical point is, similarly, a point in the function's domain where it is either not holomorphic or its derivative is equal to zero. Likewise, for a function of several real variables, a critical point is a point in its domain where the gradient is undefined or is equal to zero.
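A small worked example (not from the text): for $f(x) = x^3 - 3x$,

\[
f'(x) = 3x^2 - 3 = 0 \;\Rightarrow\; x = \pm 1,
\]

so the critical points are $x = -1$ (a local maximum, since $f''(-1) = -6 < 0$) and $x = 1$ (a local minimum, since $f''(1) = 6 > 0$); for $g(x) = |x|$, the point $x = 0$ is a critical point because $g$ is not differentiable there.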
Feasible region

In mathematical optimization, a feasible region, feasible set, search space, or solution space is the set of all possible points (sets of values of the choice variables) of an optimization problem that satisfy the problem's constraints, potentially including inequalities, equalities, and integer constraints. This is the initial set of candidate solutions to the problem, before the set of candidates has been narrowed down.
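A minimal illustrative example (symbols assumed): with choice variables $x$ and $y$ and constraints $x \ge 0$, $y \ge 0$, $x + y \le 1$, the feasible region is the triangle

\[
\{(x, y) \in \mathbb{R}^2 : x \ge 0,\ y \ge 0,\ x + y \le 1\}
\]

with vertices $(0,0)$, $(1,0)$, and $(0,1)$.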