Publication

Diffusion LMS Strategies for Distributed Estimation

Ali H. Sayed
2010
Journal paper
Abstract

We consider the problem of distributed estimation, where a set of nodes is required to collectively estimate some parameter of interest from noisy measurements. The problem is useful in several contexts, including wireless and sensor networks, where scalability, robustness, and low power consumption are desirable features. Diffusion cooperation schemes have been shown to provide good performance and robustness to node and link failure, and they are amenable to distributed implementation. In this work we focus on diffusion-based adaptive solutions of the LMS type. We motivate and propose new versions of the diffusion LMS algorithm that outperform previous solutions. We provide performance and convergence analysis of the proposed algorithms, together with simulation results comparing them with existing techniques. We also discuss optimization schemes for designing the diffusion LMS weights.
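As a concrete illustration of the kind of algorithm the abstract describes, here is a minimal sketch of an adapt-then-combine diffusion LMS recursion: each node first performs a local LMS update on its own streaming data, then replaces its estimate with a weighted average of its neighbours' intermediate estimates. The ring topology, uniform combination weights, step size, and noise level below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: N nodes cooperate to estimate a common M-dimensional
# vector w_o from noisy linear measurements d_k(i) = u_k(i)^T w_o + v_k(i).
N, M, mu, iters = 10, 4, 0.02, 2000
w_o = rng.standard_normal(M)

# Ring topology: each node combines with itself and its two neighbours,
# using uniform combination weights (each row of A sums to one).
A = np.zeros((N, N))
for k in range(N):
    for l in (k - 1, k, (k + 1) % N):
        A[k, l] = 1.0 / 3.0

w = np.zeros((N, M))  # current estimates, one row per node
for _ in range(iters):
    psi = np.empty_like(w)
    for k in range(N):
        # Adapt step: a local LMS update driven by node k's own data.
        u = rng.standard_normal(M)
        d = u @ w_o + 0.1 * rng.standard_normal()
        psi[k] = w[k] + mu * u * (d - u @ w[k])
    # Combine step: each node averages its neighbours' intermediate estimates.
    w = A @ psi

print("mean estimation error:", np.linalg.norm(w - w_o, axis=1).mean())
```

Reversing the order of the two steps gives a combine-then-adapt variant, and the abstract's mention of optimizing the diffusion LMS weights corresponds to designing the combination matrix A above.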

Related concepts (31)
Maximum likelihood estimation
In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
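As a minimal numerical sketch of this idea, assuming an i.i.d. Gaussian model with synthetic data (the true parameters and starting point are arbitrary choices), the likelihood can be maximized with a generic optimizer and checked against the known closed-form solution:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=2.0, size=500)  # synthetic observations

def nll(theta):
    # Negative log-likelihood of i.i.d. Gaussian samples; the standard
    # deviation is parametrized through its log to keep the search unconstrained.
    mu, log_sigma = theta
    sigma2 = np.exp(2.0 * log_sigma)
    return 0.5 * np.sum(np.log(2.0 * np.pi * sigma2) + (data - mu) ** 2 / sigma2)

res = minimize(nll, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
# For the Gaussian, the MLE has a closed form: the sample mean and the
# (biased, ddof=0) sample standard deviation; the optimizer should agree.
print(mu_hat, data.mean())
print(sigma_hat, data.std())
```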
Estimation theory
Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component. The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
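A short Monte Carlo sketch of this idea, assuming exponentially distributed measurements: the sample mean serves as an estimator of the unknown distribution mean, and repeated trials probe its bias and variance.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative check of an estimator's quality: the sample mean of n noisy
# exponential measurements, used to estimate the true distribution mean.
true_mean, n, trials = 2.0, 50, 10000
estimates = rng.exponential(scale=true_mean, size=(trials, n)).mean(axis=1)

print("bias:    ", estimates.mean() - true_mean)  # ~0: the estimator is unbiased
print("variance:", estimates.var())               # ~ true_mean**2 / n
```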
Genetic algorithm
In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems by relying on biologically inspired operators such as mutation, crossover and selection. Some examples of GA applications include optimizing decision trees for better performance, solving sudoku puzzles, hyperparameter optimization, causal inference, etc.
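A toy sketch may make the selection/crossover/mutation loop concrete; the fitness function, population size, and rates below are arbitrary illustrative choices rather than a reference implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy GA maximizing f(x) = -sum(x**2) over real vectors (optimum at zero).
pop_size, genes, generations = 40, 5, 200

def fitness(pop):
    return -np.sum(pop ** 2, axis=1)

pop = rng.uniform(-5, 5, size=(pop_size, genes))
for _ in range(generations):
    fit = fitness(pop)
    # Selection: tournament of two, keeping the fitter of two random individuals.
    a, b = rng.integers(pop_size, size=(2, pop_size))
    parents = np.where((fit[a] > fit[b])[:, None], pop[a], pop[b])
    # Crossover: uniform gene mixing between consecutive parent pairs.
    mask = rng.random((pop_size, genes)) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    # Mutation: small Gaussian perturbation with low per-gene probability.
    mutate = rng.random((pop_size, genes)) < 0.1
    children += mutate * rng.normal(0.0, 0.3, size=(pop_size, genes))
    pop = children

best = pop[np.argmax(fitness(pop))]
print("best individual:", best)  # should be close to the zero vector
```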
Related publications (41)

Statistical Emulation of Neural Simulators: Application to Neocortical L2/3 Large Basket Cells

Werner Alfons Hilda Van Geit, Oren Amsalem, Idan Segev

Many scientific systems are studied using computer codes that simulate the phenomena of interest. Computer simulation enables scientists to study a broad range of possible conditions, generating large quantities of data at a faster rate than the laboratory ...
FRONTIERS MEDIA SA, 2022

A Modular Workflow for Model Building, Analysis, and Parameter Estimation in Systems Biology and Neuroscience

Daniel Keller, Andrii Stepaniuk

Neuroscience incorporates knowledge from a range of scales, from single molecules to brain wide neural networks. Modeling is a valuable tool in understanding processes at a single scale or the interactions between two adjacent scales and researchers use a ...
HUMANA PRESS INC, 2021

Learning Hawkes Processes from a Handful of Events

Patrick Thiran, Matthias Grossglauser, William Trouleau, Farnood Salehi

Learning the causal-interaction network of multivariate Hawkes processes is a useful task in many applications. Maximum-likelihood estimation is the most common approach to solve the problem in the presence of long observation sequences. However, when only ...
2019
Related MOOCs (18)
Digital Signal Processing I
Basic signal processing concepts, Fourier analysis, and filters. This module can be used as a starting point or a basic refresher in elementary DSP.
Digital Signal Processing II
Adaptive signal processing, A/D and D/A. This module provides the basic tools for adaptive filtering and a solid mathematical framework for sampling and quantization.
Digital Signal Processing III
Advanced topics: this module covers real-time audio processing (with examples on a hardware board), image processing and communication system design.
