Publication

Regularization for distributionally robust state estimation and prediction

Abstract

Simulation script for the paper "Regularization for distributionally robust state estimation and prediction". Run tests/test_cdc.py to reproduce the results. Extended versions can be found at https://github.com/DecodEPFL/.

Related concepts (8)
Robust statistics
Robust statistics are statistics with good performance for data drawn from a wide range of probability distributions, especially for distributions that are not normal. Robust statistical methods have been developed for many common problems, such as estimating location, scale, and regression parameters. One motivation is to produce statistical methods that are not unduly affected by outliers. Another motivation is to provide methods with good performance when there are small departures from a parametric distribution.
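A minimal sketch of the outlier-resistance idea described above: the median (a robust location estimator) barely moves when one observation is grossly corrupted, while the sample mean shifts substantially. The data values are illustrative, not from the paper.

```python
from statistics import mean, median

clean = [9.8, 10.1, 10.0, 9.9, 10.2]
corrupted = clean[:-1] + [100.0]  # replace one point with a gross outlier

# The mean of the corrupted sample jumps far from 10,
# while the median stays at 10.0 in both cases.
print(mean(clean), median(clean))          # 10.0 10.0
print(mean(corrupted), median(corrupted))  # 27.96 10.0
```

The same contrast motivates robust estimators for scale and regression: a single bad data point should not dominate the estimate.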
Maximum likelihood estimation
In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
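The maximization described above can be sketched numerically. This illustrative example (not from the paper) estimates a Bernoulli parameter p by grid search over the log-likelihood; the analytical MLE is the sample mean, so the search should land on it.

```python
import math

data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # 7 successes in 10 trials

def log_likelihood(p, xs):
    # Sum of log Bernoulli(x | p) over the observed data.
    return sum(x * math.log(p) + (1 - x) * math.log(1 - p) for x in xs)

# Search the open interval (0, 1) for the maximizer of the log-likelihood.
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=lambda p: log_likelihood(p, data))
print(p_hat)  # 0.7, the sample mean
```

In practice one maximizes the log-likelihood (rather than the likelihood itself) for numerical stability, and uses gradient-based optimizers instead of a grid.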
Bayes estimator
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss). Equivalently, it maximizes the posterior expectation of a utility function. An alternative way of formulating an estimator within Bayesian statistics is maximum a posteriori estimation. Suppose an unknown parameter is known to have a prior distribution.
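A hedged sketch of the definition above: under squared-error loss, the Bayes estimator is the posterior mean. Here the unknown parameter has a two-point discrete prior and we observe a single Bernoulli draw; all numbers are illustrative assumptions, not from the paper.

```python
# Prior P(theta): theta is 0.2 or 0.8 with equal probability.
prior = {0.2: 0.5, 0.8: 0.5}

def likelihood(x, theta):
    # Bernoulli observation model: P(x | theta).
    return theta if x == 1 else 1 - theta

def bayes_estimate(x):
    # Bayes' rule: P(theta | x) is proportional to likelihood(x, theta) * P(theta);
    # the posterior mean minimizes the posterior expected squared-error loss.
    unnorm = {t: likelihood(x, t) * p for t, p in prior.items()}
    z = sum(unnorm.values())
    return sum(t * w / z for t, w in unnorm.items())

print(bayes_estimate(1))  # 0.68: observing a success shifts mass toward theta = 0.8
```

With a different loss function the Bayes estimator changes: absolute-error loss gives the posterior median, and 0-1 loss recovers the maximum a posteriori estimate mentioned above.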
Related publications (31)

Generalized Bradley-Terry Models for Score Estimation from Paired Comparisons

Julien René Pierre Fageot, Sadegh Farhadkhani, Oscar Jean Olivier Villemaud, Le Nguyen Hoang

Many applications, e.g. in content recommendation, sports, or recruitment, leverage the comparisons of alternatives to score those alternatives. The classical Bradley-Terry model and its variants have been widely used to do so. The historical model conside ...
AAAI Press, 2024

Regularization for distributionally robust state estimation and prediction

Giancarlo Ferrari Trecate, Florian Dörfler, Jean-Sébastien Hubert Brouillon

The increasing availability of sensing techniques provides a great opportunity for engineers to design state estimation methods, which are optimal for the system under observation and the observed noise patterns. However, these patterns often do not fulfil ...
2023

Data-Driven Unknown-Input Observers and State Estimation

Giancarlo Ferrari Trecate, Mustafa Sahin Turan

Unknown-input observers (UIOs) allow for estimation of the states of an LTI system without knowledge of all inputs. In this letter, we provide a novel data-driven UIO based on behavioral system theory and the result known as Fundamental Lemma proposed by J ...
2022