
# Continuous-Time AR Model Identification: Does Sampling Rate Really Matter?

Abstract

We address the problem of identifying continuous-time autoregressive (CAR) models from sampled data. The exponential nature of CAR autocorrelation functions is taken into account by means of exponential B-spline modelling, allowing one to associate the available digital data with a CAR model. A maximum-likelihood (ML) estimator is then derived for identifying the optimal parameters; it relies on an exact discretization of the sampled version of the continuous-time model. We provide both time- and frequency-domain interpretations of the proposed estimator, while introducing a weighting function that describes the CAR power spectrum by means of discrete Fourier transform values. We present experimental results demonstrating that the proposed exponential-based ML estimator outperforms currently available polynomial-based methods, while attaining the Cramér-Rao lower bound even for relatively low sampling rates.
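To make the exact-discretization idea concrete: for a first-order CAR process dx = −αx dt + σ dW, the samples taken every Δ seconds form an AR(1) process with coefficient exp(−αΔ), so α can be recovered from the discrete ML estimate. The following is a minimal sketch, with all parameter values hypothetical and only the CAR(1) case covered (the paper treats general CAR models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical CAR(1) process dx = -alpha*x dt + sigma dW, sampled every dt.
alpha, sigma, dt, n = 2.0, 1.0, 0.1, 50_000

# Exact discretization: the sampled process is AR(1) with coefficient
# a = exp(-alpha*dt) and innovation variance sigma^2*(1 - a^2)/(2*alpha).
a = np.exp(-alpha * dt)
q = sigma**2 * (1 - a**2) / (2 * alpha)

x = np.empty(n)
x[0] = rng.normal(0, np.sqrt(sigma**2 / (2 * alpha)))  # stationary start
for k in range(n - 1):
    x[k + 1] = a * x[k] + rng.normal(0, np.sqrt(q))

# Conditional ML estimate of the AR(1) coefficient, mapped back to alpha.
a_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
alpha_hat = -np.log(a_hat) / dt
print(alpha_hat)  # close to 2.0
```

Note that the mapping α = −ln(a)/Δ degrades as Δ grows (a approaches 0 and aliasing sets in), which is precisely the regime the paper's estimator is designed to handle.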


This page is automatically generated and may contain information that is not correct, complete, up to date, or relevant to your search. The same applies to all other pages on this site. Please verify the information against official EPFL sources.


Related concepts (17)

Data

A datum is something that is known and serves as the starting point for reasoning aimed at determining a solution to a problem related to that datum. It can be a descr…

Maximum likelihood

In statistics, the maximum-likelihood estimator is a statistical estimator used to infer the parameters of the probability distribution of a given sample by searching for the values …

Sampling (signal processing)

Sampling consists of taking the values of a signal at defined, usually regular, intervals. It produces a sequence of discrete values called samples.

Related publications (7)

Hagay Kirshner, Simona Maggio, Michaël Unser

The problem of estimating continuous-domain autoregressive moving-average processes from sampled data is considered. The proposed approach incorporates the sampling process into the problem formulation while introducing exponential models for both the continuous and the sampled processes. We derive an exact evaluation of the discrete-domain power spectrum using exponential B-splines and further suggest an estimation approach that is based on digitally filtering the available data. The proposed functional, which is related to Whittle's likelihood function, exhibits several local minima that originate from aliasing. The global minimum, however, corresponds to a maximum-likelihood estimator, regardless of the sampling step. Experimental results indicate that the proposed approach closely follows the Cramér-Rao bound for various aliasing configurations.
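A Whittle-type functional of the kind described can be illustrated on a purely discrete AR(1) surrogate: it sums, over the Fourier frequencies, the log of the model spectrum plus the ratio of the periodogram to that spectrum. This is only a sketch of the general principle, not the paper's exponential-B-spline construction, and the innovation variance is fixed at its true value for simplicity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete AR(1) surrogate for the sampled process (coefficient a_true).
a_true, n = 0.7, 4096
x = np.empty(n)
x[0] = rng.normal() / np.sqrt(1 - a_true**2)   # stationary start
for k in range(1, n):
    x[k] = a_true * x[k - 1] + rng.normal()

# Periodogram at the positive Fourier frequencies.
freqs = 2 * np.pi * np.arange(1, n // 2) / n
I = np.abs(np.fft.fft(x)[1:n // 2]) ** 2 / n

def whittle(a):
    # AR(1) power spectrum with unit innovation variance.
    S = 1.0 / np.abs(1 - a * np.exp(-1j * freqs)) ** 2
    return np.sum(np.log(S) + I / S)

# Minimize the Whittle functional by a simple grid search.
grid = np.linspace(-0.95, 0.95, 381)
a_hat = grid[int(np.argmin([whittle(a) for a in grid]))]
print(a_hat)  # close to 0.7
```

In the continuous-time setting of the paper, the model spectrum additionally folds in aliased replicas of the continuous spectrum, which is where the reported local minima come from; the discrete surrogate above has a single well-behaved minimum.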


In recent years, chemometric methods for the analysis of multivariate kinetic data have progressed considerably. Kinetic hard-modelling is one of these methods; it is based on the rate law and is used to determine the kinetic parameters (e.g. rate constants) of chemical reactions by non-linear optimisation. Applied to spectroscopy, kinetic hard-modelling relies on Beer's law to decompose the time- and wavelength-resolved data into the concentration profiles and the molar spectra of the pure components. In direct implicit kinetic hard-modelling, the concentration profiles are obtained by numerical integration of the rate laws, and the pure spectra are linearly estimated at each iteration using the pseudo-inverse of the concentration matrix [1]. Direct implicit kinetic hard-modelling of spectroscopic data allows the validation of the kinetic mechanism by comparing the estimated component spectra with independently measured ones. A severe drawback, however, is that this implicit method fails when the concentration profiles are linearly dependent, as the pseudo-inverse, and thus the component spectra, cannot be computed. Different strategies have been proposed as a remedy to this rank-deficiency problem, such as (1) defining some absorbing species as uncoloured, (2) providing some component spectra for the analysis, (3) dosing one or more species, or (4) analysing simultaneously several experiments recorded under different initial concentrations (three-way analysis). In the absence of a systematic method, the appropriate species to be included in these four strategies are selected by experience or by trial and error. Spectral validation of the kinetic mechanism can also be difficult when Strategy (1) is employed, as the fitted component spectra are complex linear combinations of the true pure spectra.

We have recently proposed a systematic method for the experimental and data-analytical design of kinetic data measured by spectroscopy that allows identifying the species to be incorporated in Strategies (1)–(4) and calculating the linear combinations of the true pure spectra when Strategy (1) is used, an important step for spectral validation [2]. This systematic method is based on a time-invariant matrix that avoids the numerical integration of the time-variant concentration profiles and allows the experimental design of chemical reactions, even if the associated rate constants are not yet known, i.e. before optimisation. This time-invariant matrix uses the entire set of kinetic reactions and only requires a reduction to independent reactions if linear combinations of the true pure spectra are desired (Strategy (1)). For this, a method has also been developed. In this presentation, the systematic method of species selection is presented using simulated data and, from this, appropriate experimental designs (strategies) are suggested. The method is also illustrated using experimental results obtained from the reaction of benzophenone with phenylhydrazine in THF (under catalysis by acetic acid), for which the postulated kinetic mechanism has been successfully validated via comparison between fitted and measured component spectra in mid-IR and UV-vis [3].

[1] M. Maeder, A.D. Zuberbühler, Anal. Chem., 62 (1990), 2220-2224.
[2] J. Billeter, Y.M. Neuhold, K. Hungerbühler, Chemom. Intell. Lab. Syst., 95 (2009), 170-187.
[3] J. Billeter, Y.M. Neuhold, K. Hungerbühler, Chemom. Intell. Lab. Syst., 98 (2009), 213-226.
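The pseudo-inverse step at the heart of direct implicit kinetic hard-modelling, and the rank-deficiency failure mode discussed above, can be sketched for a hypothetical first-order reaction A → B obeying Beer's law (all numbers illustrative, not from the presented experiments):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical first-order reaction A -> B; Beer's law: D = C @ S + noise.
t = np.linspace(0, 5, 50)[:, None]     # reaction times
k = 1.2                                # assumed rate constant
cA = np.exp(-k * t)                    # concentration profile of A
C = np.hstack([cA, 1 - cA])            # [A, B], closed mass balance
S = rng.uniform(0.0, 1.0, size=(2, 30))        # pure component spectra
D = C @ S + 1e-3 * rng.normal(size=(50, 30))   # measured spectra

# Linear estimation of the pure spectra via the pseudo-inverse of C.
S_hat = np.linalg.pinv(C) @ D

# Rank deficiency: a spectator with constant concentration has profile
# cA + cB, so the augmented matrix loses full column rank and the
# pseudo-inverse no longer yields identifiable component spectra.
C3 = np.hstack([C, C.sum(axis=1, keepdims=True)])
print(np.linalg.matrix_rank(C3))  # 2, not 3
```

The rank test on `C3` is exactly the situation Strategies (1)–(4) are designed to repair: either the dependent species is declared uncoloured, its spectrum is supplied, or the experiment is redesigned to break the linear dependence.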

2009

This work is concerned with the estimation of the spreading potential of the disease in the initial stages of an epidemic. A speedy and accurate estimation is important for determining whether or not interventions are necessary to prevent a major outbreak. At the same time, the information available in the early stages is scarce and data collection imperfect. We consider an epidemic in a large susceptible population, and address the estimation based on temporally aggregated counts of new cases that are subject to unknown random under-reporting. We allow for an influence of the detection process on the evolution of the epidemic. While the proportion of infectious individuals in the population is small, the role of chance in the spread of the disease may be substantial. Therefore, stochastic epidemic models are applied. As these are difficult to analyse, the time evolution of the number of infectious individuals is approximated by branching processes. We study the estimation in a partially observed Galton–Watson process and in a partially observed linear birth and death process; and in each case focus on the parameter characterising the growth of the process. We aim at estimators that perform well in the asymptotic sense where a single trajectory is observed over a long period of time, and study the asymptotics conditionally on the eventual explosion of the process. The partially observed Galton–Watson process has been recently proposed in the literature as a model for the initial stages of an epidemic. Its probabilistic structure has been explored and estimation has been partially addressed, in that consistent estimators have been constructed. However, the estimation-related uncertainty has not been evaluated. We address this issue here by constructing estimators that are motivated from the asymptotic dependence structure of the process.
We show that they are consistent and asymptotically normal, consistently estimate their asymptotic variances, and construct asymptotic confidence intervals. In addition, we evaluate their finite-sample performance in a simulation study and their practical performance on real data. The observation mechanism in the partially observed Galton–Watson process is inherently discrete. To allow for continuous-time dynamics, we incorporate partial observation in the linear birth and death process. In particular, we propose a model where the birth process is completely unobserved, while a random proportion of the death process is observed at discrete time points. We study the estimation in this model. Motivated by its counting process structure, we arrive at consistent and asymptotically normal estimators, consistently estimate their asymptotic variances, and construct asymptotic confidence intervals. We also evaluate the finite-sample and practical performance of the estimators in a simulation study and on real data.
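A minimal illustration of growth-parameter estimation in a partially observed Galton–Watson process: with binomial under-reporting at a constant (unknown) rate, the reporting probability cancels in a ratio-type estimator of the offspring mean. The offspring law, parameter values, and estimator below are illustrative assumptions, not the exact construction studied in this work:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical supercritical Galton-Watson process with Poisson(m) offspring,
# observed through binomial thinning with unknown reporting probability p.
m, p, gens = 1.5, 0.4, 25

z = 50                               # initial number of infectious individuals
Y = []                               # observed (under-reported) counts
for _ in range(gens):
    Y.append(rng.binomial(z, p))     # each case detected with probability p
    z = rng.poisson(m * z)           # next generation of the hidden process

Y = np.array(Y, dtype=float)

# Ratio-type estimator of the offspring mean m: the reporting probability
# cancels, so sum(Y[1:]) / sum(Y[:-1]) is consistent on the explosion set.
m_hat = Y[1:].sum() / Y[:-1].sum()
print(m_hat)  # close to 1.5
```

Quantifying the uncertainty of such estimators, via their asymptotic variances and confidence intervals conditional on explosion, is precisely the gap this work addresses beyond mere consistency.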