We propose a modified leaky-LMS filter that ensures stability of the estimates w(k) in the presence of bounded noise, without introducing a bias term and at the added cost of only one comparison and one multiplication per iteration relative to the classical LMS algorithm. The new algorithm is further shown to converge for ℓ_p noise and persistently exciting regressors, and it provides bounded estimates even in finite-precision arithmetic. The stability and convergence properties of the new algorithm are established through a deterministic analysis based on Lyapunov theory for the stability of nonlinear difference equations.
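To make the cost comparison concrete, the sketch below contrasts the classical LMS and leaky-LMS updates with one plausible switched-leakage variant in which the leakage factor is applied only when the estimate norm exceeds a prescribed bound, which adds exactly one comparison and one multiplication per iteration. The switching rule, the threshold `bound`, and the parameter names are illustrative assumptions; the precise modification proposed in the paper is not specified in this abstract.

```python
import numpy as np

def lms_update(w, u, d, mu):
    """Classical LMS: w(k+1) = w(k) + mu * u(k) * e(k)."""
    e = d - u @ w                      # a priori estimation error
    return w + mu * u * e

def leaky_lms_update(w, u, d, mu, alpha):
    """Classical leaky LMS: w(k+1) = (1 - mu*alpha) * w(k) + mu * u(k) * e(k).
    The constant leakage alpha keeps w(k) bounded but introduces a bias."""
    e = d - u @ w
    return (1.0 - mu * alpha) * w + mu * u * e

def switched_leaky_lms_update(w, u, d, mu, alpha, bound):
    """Hypothetical switched-leakage update (illustrative assumption only):
    leakage is engaged only while ||w(k)|| exceeds `bound`, so the bias of
    constant leakage is avoided in normal operation."""
    e = d - u @ w
    leak = (1.0 - mu * alpha) if np.linalg.norm(w) > bound else 1.0  # one comparison
    return leak * w + mu * u * e                                     # one extra multiplication

# Toy system-identification run with bounded noise (for illustration only).
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.8])
w = np.zeros(3)
for _ in range(500):
    u = rng.standard_normal(3)                    # persistently exciting regressor
    d = u @ w_true + 0.01 * rng.uniform(-1, 1)    # bounded measurement noise
    w = switched_leaky_lms_update(w, u, d, mu=0.05, alpha=0.1, bound=10.0)
print(w)  # close to w_true
```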