We show that the celebrated least-mean squares (LMS) adaptive algorithm is H∞ optimal. The LMS algorithm has long been regarded as an approximate solution to either a stochastic or a deterministic least-squares problem, and it essentially amounts to updating the weight vector estimates along the direction of the instantaneous gradient of a quadratic cost function. We show that LMS can be regarded as the exact solution to a minimization problem in its own right. Namely, we establish that it is a minimax filter: it minimizes the maximum energy gain from the disturbances to the predicted errors, whereas the closely related normalized LMS algorithm minimizes the maximum energy gain from the disturbances to the filtered errors. Moreover, since these algorithms are central H∞ filters, they minimize a certain exponential cost function and are thus also risk-sensitive optimal. We discuss the various implications of these results and show how they provide theoretical justification for the widely observed excellent robustness properties of the LMS filter.
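As a concrete illustration of the weight update described above, the following is a minimal Python sketch of the standard LMS and normalized LMS recursions. It is not taken from the paper; the variable names (w, u, d, mu, eps) and the synthetic-data usage example are illustrative assumptions.

```python
import numpy as np

def lms_update(w, u, d, mu):
    """One LMS step: move w along the instantaneous gradient of the
    squared prediction error (d - u @ w)**2, with step size mu."""
    e = d - u @ w              # a priori (predicted) error
    return w + mu * e * u

def nlms_update(w, u, d, mu, eps=1e-8):
    """One normalized LMS step: the step size is scaled by the input
    energy ||u||^2 (eps guards against division by zero)."""
    e = d - u @ w
    return w + (mu / (eps + u @ u)) * e * u

# Tiny usage example on synthetic data (illustrative only).
rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5, 0.25])   # unknown weight vector to identify
w = np.zeros(3)
for _ in range(500):
    u = rng.standard_normal(3)                          # input regressor
    d = u @ w_true + 0.01 * rng.standard_normal()       # noisy desired output
    w = nlms_update(w, u, d, mu=0.5)
print(w)   # should approach w_true
```

The only difference between the two updates is the normalization by the input energy, which is the distinction the abstract draws between predicted-error and filtered-error optimality.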