NSM Converges to a k-NN Regressor Under Loose Lipschitz Estimates

Abstract

Although accurate Lipschitz estimates are known to be essential for certain models to deliver good predictive performance, refining this constant in practice can be difficult, especially when the input dimension is high. In this letter, we shed light on the consequences of employing loose Lipschitz bounds in the Nonlinear Set Membership (NSM) framework, showing that, as the bound grows, the model converges to a nearest-neighbor regressor (k-NN with k = 1). This convergence is moreover not uniform, although it is monotonic in the univariate case. An intuitive geometric interpretation of the result is then given, and its practical implications are discussed.
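The abstract's claim can be illustrated numerically. Below is a minimal univariate sketch, assuming the standard NSM central estimator (the midpoint of the tightest upper and lower envelopes implied by the data and a Lipschitz constant L); the data set, query point, and function names here are hypothetical choices for illustration, not taken from the letter.

```python
import numpy as np

def nsm_predict(x, X, Y, L):
    """Central NSM estimate at query x: the midpoint of the optimal
    upper bound min_i (y_i + L|x - x_i|) and the optimal lower bound
    max_i (y_i - L|x - x_i|) consistent with Lipschitz constant L."""
    d = np.abs(X - x)            # distances to the data points (univariate)
    upper = np.min(Y + L * d)    # tightest Lipschitz upper envelope at x
    lower = np.max(Y - L * d)    # tightest Lipschitz lower envelope at x
    return 0.5 * (upper + lower)

def nn_predict(x, X, Y):
    """1-nearest-neighbor prediction: the label of the closest sample."""
    return Y[np.argmin(np.abs(X - x))]

# Hypothetical toy data set and query point
X = np.array([0.0, 1.0, 2.0, 3.0])
Y = np.sin(X)
x = 1.3

# As L grows (i.e., the Lipschitz bound becomes looser), the NSM
# prediction approaches the 1-NN prediction, as stated in the abstract.
for L in (1.0, 10.0, 1e3, 1e6):
    print(f"L = {L:>9.1f}  NSM = {nsm_predict(x, X, Y, L):.6f}")
print(f"1-NN prediction = {nn_predict(x, X, Y):.6f}")
```

Intuitively, for very large L both envelopes are dominated by the nearest data point, so their midpoint collapses onto that point's label, which is exactly the 1-NN prediction.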
