
Publication

Performance limits of LMS-based adaptive networks

Abstract

In this work, we analyze the mean-square performance of different strategies for adaptation over two-node least-mean-squares (LMS) networks. The results highlight some interesting properties of adaptive networks in comparison to centralized solutions. The analysis reveals that the adapt-then-combine (ATC) adaptive network algorithm can achieve a lower excess mean-square error (EMSE) than a centralized solution based on either block or incremental LMS strategies with the same convergence rate.
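The ATC strategy mentioned above can be sketched for a two-node network: each node first performs a local LMS adaptation step on its own data, then combines its intermediate estimate with its neighbor's through a combination matrix. The step size, noise level, and uniform combination weights below are illustrative assumptions, not the parameters studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 4                                  # filter length
w_true = rng.standard_normal(M)        # common parameter vector to estimate
mu = 0.01                              # LMS step size (same at both nodes)
A = np.array([[0.5, 0.5],              # combination matrix: a[l, k] is the
              [0.5, 0.5]])             # weight node k gives to node l

w = np.zeros((2, M))                   # local estimates at nodes 0 and 1
for _ in range(5000):
    psi = np.empty_like(w)
    for k in range(2):                 # adapt: local LMS update at each node
        u = rng.standard_normal(M)                    # regressor at node k
        d = u @ w_true + 0.1 * rng.standard_normal()  # noisy measurement
        e = d - u @ w[k]                              # a priori error
        psi[k] = w[k] + mu * e * u
    w = A.T @ psi                      # combine: weighted neighbor average

print(np.linalg.norm(w[0] - w_true))   # small steady-state deviation
```

The combine step couples the nodes, so each local estimate benefits from both measurement streams; this cooperation is what lets diffusion strategies such as ATC compete with, and in the regime analyzed here outperform, centralized LMS processing.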


Related publications (34)

Related concepts (20)

Mean squared error

In statistics, the mean squared error (MSE) or mean squared deviation (MSD) of an estimator (that is, of a procedure for estimating an unobserved quantity) measures the average of the squares of the errors—the average squared difference between the estimated values and the actual value. MSE is a risk function, corresponding to the expected value of the squared error loss. The fact that MSE is almost always strictly positive (and not zero) is because of randomness or because the estimator does not account for information that could produce a more accurate estimate.
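The definition above reduces to a one-line computation; the toy values here are purely illustrative:

```python
import numpy as np

y_true = np.array([2.0, 4.0, 6.0])   # actual values
y_est  = np.array([2.5, 3.5, 6.0])   # estimated values

# MSE: average of the squared differences between estimates and actuals
mse = np.mean((y_est - y_true) ** 2)
print(mse)  # (0.25 + 0.25 + 0.0) / 3 ≈ 0.1667
```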

Root-mean-square deviation

The root-mean-square deviation (RMSD) or root-mean-square error (RMSE) is a frequently used measure of the differences between values (sample or population values) predicted by a model or an estimator and the values observed. The RMSD represents the square root of the second sample moment of the differences between predicted values and observed values or the quadratic mean of these differences. These deviations are called residuals when the calculations are performed over the data sample that was used for estimation and are called errors (or prediction errors) when computed out-of-sample.
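As a quadratic mean of the differences, the RMSD is just the square root of the MSE of those differences. A minimal in-sample sketch with made-up values:

```python
import numpy as np

observed  = np.array([1.0, 2.0, 3.0, 4.0])
predicted = np.array([1.1, 1.9, 3.2, 3.8])

# In-sample, these differences are residuals; out-of-sample they
# would be called (prediction) errors.
residuals = predicted - observed
rmse = np.sqrt(np.mean(residuals ** 2))   # quadratic mean of the differences
print(rmse)
```

Because of the square root, the RMSD is expressed in the same units as the observed quantity, which is often why it is reported instead of the MSE.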

Mean squared prediction error

In statistics, the mean squared prediction error (MSPE), also known as the mean squared error of the predictions, of a smoothing, curve fitting, or regression procedure is the expected value of the squared prediction errors (PE)—the squared differences between the fitted values implied by the predictive function and the values of the (unobservable) true function g. It is an inverse measure of the explanatory power of the predictive function and can be used in the process of cross-validation of an estimated model.
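The MSPE is typically estimated on held-out data, which keeps prediction errors distinct from in-sample residuals. A sketch under assumed toy settings (a linear true function g, Gaussian noise, and an ordinary least-squares line fit):

```python
import numpy as np

rng = np.random.default_rng(1)

# True (unobservable) function g, observed through additive noise
g = lambda x: 2.0 * x + 1.0
x_train = rng.uniform(0, 1, 50)
y_train = g(x_train) + 0.1 * rng.standard_normal(50)

# Least-squares line fit as the regression procedure
coeffs = np.polyfit(x_train, y_train, 1)

# Estimate the MSPE on held-out points (prediction errors, not residuals)
x_test = rng.uniform(0, 1, 200)
y_test = g(x_test) + 0.1 * rng.standard_normal(200)
mspe = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
print(mspe)
```

With a well-specified model, the estimated MSPE approaches the noise variance; a larger value signals lower explanatory power of the fitted predictor.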

Purpose: To develop a scan-specific model that estimates and corrects k-space errors made when reconstructing accelerated MRI data. Methods: Scan-specific artifact reduction in k-space (SPARK) trains a convolutional-neural-network to estimate and correct k ...

Michaël Unser, Pakshal Narendra Bohra

We present a statistical framework to benchmark the performance of reconstruction algorithms for linear inverse problems, in particular, neural-network-based methods that require large quantities of training data. We generate synthetic signals as realizati ...

Nicolas Henri Bernard Flammarion, Etienne Patrice Boursier, Loucas Pillaud-Vivien

The training of neural networks by gradient descent methods is a cornerstone of the deep learning revolution. Yet, despite some recent progress, a complete theory explaining its success is still missing. This article presents, for orthogonal input vectors, ...

2022