
# General state-space models

Abstract

In time series analysis, state-space models provide a wide and flexible class of models. The basic idea is to describe an unobservable phenomenon of interest on the basis of noisy data. The first constituent of such a model is the so-called state equation, which characterises the unobserved state of the system. The second part is the so-called observation equation, which describes the observable variables as a function of the unobserved state. The purpose of the analysis of state-space models is to infer the relevant properties of the unobserved phenomenon from the observed data. A powerful tool for doing so in the linear Gaussian setting is the Kalman filter. It provides a simple recursive computational scheme for the conditional expectation of the unobserved state given the observed data. However, since the Kalman filter is linear in the data, it is sensitive to outlying observations. A certain robustness of the procedure can easily be achieved by abandoning the Gaussianity assumption. The assumption of linearity, on the other hand, is frequently found to be inadequate in practical situations and therefore needs to be generalised as well. Without linearity and Gaussianity, however, the simple recursive formulae of the Kalman filter are lost, and the recursive computation of the conditional expectation must be replaced by approximations of ratios of high-dimensional integrals. Numerical and Monte Carlo integration techniques have been proposed for this purpose. The former yield powerful tools in low-dimensional problems but suffer from the resulting computational burden in high-dimensional settings. Monte Carlo integration techniques, on the other hand, lead to so-called particle filters, which provide computationally attractive tools applicable even in high-dimensional problems.
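The linear Gaussian setting and the recursive scheme described above can be sketched as follows. This is an illustrative implementation of the standard Kalman prediction and filtering steps, not the algorithms developed in this work; all matrix names (`A`, `C`, `Q`, `R`) are the usual textbook conventions.

```python
# Sketch of the linear Gaussian state-space model and Kalman recursion:
#   state eq.:       x_t = A x_{t-1} + w_t,   w_t ~ N(0, Q)
#   observation eq.: y_t = C x_t + v_t,       v_t ~ N(0, R)
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    """Return the filtered state means E[x_t | y_1, ..., y_t] for each t."""
    x, P = x0, P0
    means = []
    for yt in y:
        # prediction step: propagate mean and covariance through the state eq.
        x = A @ x
        P = A @ P @ A.T + Q
        # filtering (update) step: correct the prediction with the observation
        S = C @ P @ C.T + R              # innovation covariance
        K = P @ C.T @ np.linalg.inv(S)   # Kalman gain
        x = x + K @ (yt - C @ x)
        P = P - K @ C @ P
        means.append(x.copy())
    return np.array(means)
```

Because the update `x + K @ (yt - C @ x)` is linear in the observation `yt`, a single outlying observation shifts the estimate proportionally to its size, which is the sensitivity the abstract refers to.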
Although particle filters have some appealing properties, they are not entirely convincing: the approximation error is relatively large; outliers may significantly affect the computational efficiency; and in practical situations it is often very difficult to tune them adequately. Hence, we propose to adopt another integration method, so-called scrambled net integration – a hybrid of numerical and Monte Carlo integration techniques. For the non-iterative approximation of integrals, this method has often proven much more accurate than Monte Carlo methods and copes well with the computational burden in high dimensions. We implement scrambled net integration in the general Kalman recursions and derive new prediction, filtering and smoothing algorithms. Further, we show that the approximation error remains stable during iterations and that the accuracy of the resulting state estimates is higher than that of particle filters. The resulting algorithms are more robust than the particle filter algorithm, since their computational complexity is not influenced by outlying observations. In addition, we prove a central limit theorem for the estimator of the filter's expectation. Scrambled nets can also be used for hyper-parameter estimation, since the likelihood of a general state-space model is a byproduct of the general Kalman recursions. They not only provide an approximation to the likelihood function but also a suitable grid over which it is maximised. The resulting estimation procedure is particularly attractive when dealing with high dimensional hyper-parameter spaces. Finally, one of the main difficulties when studying the properties of an unobserved phenomenon is designing a good general state-space model. Besides the determination of state and observation equations, the problem of choosing appropriate noise distributions is crucial, since they express the inherent uncertainty of the model used.
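The contrast between plain Monte Carlo and scrambled net integration can be illustrated with a randomised quasi-Monte Carlo point set. The sketch below uses SciPy's scrambled Sobol' sequence as a stand-in for the scrambled nets of the thesis; the integrand, dimension, and sample sizes are illustrative choices, not taken from this work.

```python
# Equal-weight cubature over [0,1]^d with two point sets:
# plain Monte Carlo points vs. a scrambled Sobol' net.
import numpy as np
from scipy.stats import qmc

def integrate(points, f):
    """Equal-weight estimate: the mean of f over points in [0,1]^d."""
    return f(points).mean()

d = 5
f = lambda x: np.prod(x, axis=1)      # exact integral over [0,1]^d is 2**-d

rng = np.random.default_rng(0)
mc_points = rng.random((2**12, d))    # 4096 plain Monte Carlo points
sobol = qmc.Sobol(d=d, scramble=True, seed=0)
net_points = sobol.random_base2(m=12) # 4096 scrambled net points

mc_err = abs(integrate(mc_points, f) - 2.0**-d)
net_err = abs(integrate(net_points, f) - 2.0**-d)
```

For smooth integrands like this one, the scrambled net estimate is typically far more accurate at the same sample size, which is the accuracy advantage the abstract claims for non-iterative approximation.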
From a practical point of view, it is frequently unclear how to choose these noise distributions. Rather than relying on an uneducated guess of a single distributional model, we propose to use several fundamentally different distributions and derive an adaptive filter providing a data-driven approach, which is naturally robust with respect to deviations from a distributional model.


Related concepts

State-space representation

In control engineering, model-based fault detection and system identification, a state-space representation is a mathematical model of a physical system specified as a set of input, output and state variables related by first-order (not involving second derivatives) differential equations or difference equations. Such variables, called state variables, evolve over time in a way that depends on the values they have at any given instant and on the externally imposed values of input variables.
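A discrete-time state-space representation can be sketched in a few lines. The matrices below describe an illustrative double integrator (position and velocity driven by an acceleration input), chosen only to make the first-order difference-equation structure concrete.

```python
# Discrete-time state-space representation:
#   x_{t+1} = A x_t + B u_t,   y_t = C x_t
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # state transition (position, velocity)
B = np.array([[0.0],
              [1.0]])        # input matrix (acceleration enters velocity)
C = np.array([[1.0, 0.0]])   # observation matrix (only position is observed)

def simulate(x0, inputs):
    """Evolve the state under the inputs and collect outputs y_t = C x_t."""
    x, ys = x0, []
    for u in inputs:
        ys.append((C @ x).item())
        x = A @ x + B @ np.array([u])
    return ys
```

The state `x` carries everything the system needs to remember: the next output depends only on the current state and the current input, which is the defining property of the representation.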

Kalman filter

In statistics and control theory, Kalman filtering, also known as linear quadratic estimation (LQE), is an algorithm that uses a series of measurements observed over time, including statistical noise and other inaccuracies, and produces estimates of unknown variables that tend to be more accurate than those based on a single measurement alone, by estimating a joint probability distribution over the variables for each timeframe. The filter is named after Rudolf E. Kálmán, who was one of the primary developers of its theory.
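Why combining measurements beats any single one can be seen in the scalar special case of the Kalman update: two noisy estimates of the same quantity are fused with a variance-weighted gain, and the fused variance is smaller than either input variance. This is an illustrative sketch, not the full filter.

```python
# Scalar measurement fusion: the one-step special case of the Kalman update.
def fuse(m1, var1, m2, var2):
    """Combine two noisy estimates (mean, variance) of the same quantity."""
    K = var1 / (var1 + var2)      # scalar Kalman gain
    mean = m1 + K * (m2 - m1)     # variance-weighted average
    var = (1 - K) * var1          # fused variance: smaller than var1 and var2
    return mean, var
```

With two equally uncertain measurements, the result is their plain average at half the variance; with unequal variances, the gain `K` trusts the more precise source more.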

Monte Carlo method

Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches. Monte Carlo methods are mainly used in three problem classes: optimization, numerical integration, and generating draws from a probability distribution.
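A minimal example of the idea, estimating a deterministic quantity (here pi) by repeated random sampling: the fraction of uniform points falling inside the unit quarter-circle converges to pi/4.

```python
# Monte Carlo estimate of pi from the quarter-circle area.
import random

def estimate_pi(n, seed=0):
    """Fraction of n uniform points in [0,1]^2 with x^2 + y^2 <= 1, times 4."""
    rng = random.Random(seed)
    inside = sum(rng.random()**2 + rng.random()**2 <= 1.0 for _ in range(n))
    return 4.0 * inside / n
```

The error of such an estimate shrinks like 1/sqrt(n), which is why quasi-random alternatives such as the scrambled nets discussed above can be attractive when higher accuracy per sample is needed.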