Ask any question about EPFL courses, lectures, exercises, research, news, etc., or try the example questions below.
DISCLAIMER: The Graph chatbot is not designed to provide explicit or categorical answers to your questions. Instead, it converts your questions into API requests that are dispatched to the various IT services officially administered by EPFL. Its sole purpose is to collect and recommend relevant references to content that you can explore to help answer your questions.
Functional data are typically modeled as sample paths of smooth stochastic processes in order to mitigate the fact that they are often observed discretely and noisily, occasionally irregularly and sparsely. The smoothness assumption is imposed to allow for ...
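As a loose illustration of the setting described above (smooth sample paths observed discretely, noisily and sparsely), here is a minimal sketch that reconstructs one such curve with a penalized spline. It assumes SciPy and a synthetic sample path; it is not the method of the abstract.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

# Irregular, sparse design points on [0, 1] and noisy observations of a smooth path.
t = np.sort(rng.uniform(0, 1, size=25))
x_true = np.sin(2 * np.pi * t) + 0.5 * np.cos(4 * np.pi * t)   # stand-in smooth sample path
y = x_true + rng.normal(scale=0.2, size=t.size)                 # additive measurement noise

# Penalized spline smoother; the smoothing factor s trades fidelity for smoothness.
spline = UnivariateSpline(t, y, k=3, s=t.size * 0.2**2)

grid = np.linspace(0, 1, 200)
x_hat = spline(grid)   # reconstructed smooth curve on a dense grid
```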
Covariance estimation is ubiquitous in functional data analysis. Yet, the case of functional observations over multidimensional domains introduces computational and statistical challenges, rendering the standard methods effectively inapplicable. To address ...
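To see why multidimensional domains are computationally demanding, consider the plain empirical covariance of surfaces discretized on a common grid: the covariance matrix scales quadratically in the number of grid points. A short sketch, assuming NumPy and synthetic data (not the abstract's method):

```python
import numpy as np

rng = np.random.default_rng(1)

# n replicated surfaces observed on a common 2-D grid (here 30 x 30 = 900 points).
n, g1, g2 = 50, 30, 30
X = rng.normal(size=(n, g1, g2)).cumsum(axis=1).cumsum(axis=2)  # crude smooth-ish fields

# Flatten each surface; the empirical covariance is then a 900 x 900 matrix,
# whose size grows as (g1*g2)^2 -- the bottleneck for multidimensional domains.
X_flat = X.reshape(n, -1)
X_centered = X_flat - X_flat.mean(axis=0)
cov_hat = (X_centered.T @ X_centered) / (n - 1)

print(cov_hat.shape)  # (900, 900)
```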
Correspondence pruning aims to correctly remove false matches (outliers) from an initial set of putative correspondences. The pruning process is challenging since putative matches are typically extremely unbalanced, largely dominated by outliers, and the r ...
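A common baseline for pruning outlier-dominated correspondences is geometric verification with RANSAC; the sketch below keeps only matches consistent with a single homography, using OpenCV on synthetic point sets. It illustrates the generic baseline, not the approach of the abstract.

```python
import numpy as np
import cv2

rng = np.random.default_rng(2)

# Synthetic putative correspondences: a few true matches related by a translation,
# heavily contaminated by random outliers (the typical imbalance in pruning).
n_inliers, n_outliers = 40, 160
pts1_in = rng.uniform(0, 640, size=(n_inliers, 2)).astype(np.float32)
pts2_in = pts1_in + np.float32([15.0, -7.0])                     # consistent motion
pts1_out = rng.uniform(0, 640, size=(n_outliers, 2)).astype(np.float32)
pts2_out = rng.uniform(0, 640, size=(n_outliers, 2)).astype(np.float32)

pts1 = np.vstack([pts1_in, pts1_out])
pts2 = np.vstack([pts2_in, pts2_out])

# Geometric verification: keep only matches consistent with one homography model.
H, mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, 3.0)
kept = mask.ravel().astype(bool)
print(f"kept {kept.sum()} of {kept.size} putative matches")
```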
The spectral distribution plays a key role in the statistical modelling of multivariate extremes, as it defines the dependence structure of multivariate extreme-value distributions and characterizes the limiting distribution of the relative sizes of the co ...
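For intuition, the spectral (angular) distribution of a bivariate sample is usually estimated by transforming the margins to a standard heavy-tailed scale and looking at the angles of the observations with the largest radii. A rough sketch with NumPy and synthetic data follows; it is generic, not the estimator of the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

# Bivariate sample with some upper-tail dependence (shared exponential component).
n = 5000
z = rng.exponential(size=n)
x = np.maximum(z, rng.exponential(size=n))
y = np.maximum(z, rng.exponential(size=n))

def to_pareto(u):
    # Rank-transform a margin to the standard Pareto scale.
    ranks = np.argsort(np.argsort(u)) + 1
    return 1.0 / (1.0 - ranks / (n + 1.0))

xp, yp = to_pareto(x), to_pareto(y)

# Pseudo-polar decomposition: radius R and angle W in [0, 1].
R = xp + yp
W = xp / R

# Empirical spectral distribution: angles of the k observations with the largest radii.
k = 200
angles = W[np.argsort(R)[-k:]]
print(angles.mean())   # ~0.5 by symmetry; the spread of `angles` reflects the dependence
```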
Wasserstein distances are metrics on probability distributions inspired by the problem of optimal mass transportation. Roughly speaking, they measure the minimal effort required to reconfigure the probability mass of one distribution in order to recover th ...
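In one dimension the Wasserstein-1 distance has a closed form in terms of quantile functions and is available directly in SciPy; a minimal sketch on two empirical samples (illustrative only):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(4)

# Two empirical distributions; the distance measures the effort to move one mass onto the other.
a = rng.normal(loc=0.0, scale=1.0, size=10_000)
b = rng.normal(loc=1.0, scale=1.0, size=10_000)

# For 1-D samples, Wasserstein-1 equals the L1 distance between quantile functions;
# here it is close to the mean shift of 1.
print(wasserstein_distance(a, b))
```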
The classical multivariate extreme-value theory concerns the modeling of extremes in a multivariate random sample, suggesting the use of max-stable distributions. In this work, the classical theory is extended to the case where aggregated data, such as max ...
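As a univariate reference point for the classical block-maxima approach mentioned above, the sketch below fits a generalized extreme-value distribution to synthetic annual maxima with SciPy. It illustrates the classical theory only, not the paper's extension to aggregated data.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)

# Daily-like observations grouped into blocks; classical theory models the block maxima.
daily = rng.gumbel(loc=0.0, scale=1.0, size=(100, 365))   # 100 "years" of 365 values
block_maxima = daily.max(axis=1)

# Fit a generalized extreme-value distribution to the annual maxima.
# SciPy's shape parameter c corresponds to -xi in the usual GEV parameterization.
c, loc, scale = genextreme.fit(block_maxima)

# 100-year return level: the 0.99 quantile of the fitted annual-maximum distribution.
print(genextreme.ppf(0.99, c, loc=loc, scale=scale))
```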
The efficiency of stochastic particle schemes for large scale simulations relies on the ability to preserve a uniform distribution of particles in the whole physical domain. While simple particle split and merge algorithms have been considered previously, ...
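The basic split-and-merge mechanism alluded to above can be sketched in a few lines: in each spatial cell, over-populated sets of particles are subsampled (with weights rescaled to conserve mass) and under-populated sets are duplicated with reduced weights. This 1-D toy version with NumPy is purely illustrative, not the scheme of the abstract.

```python
import numpy as np

rng = np.random.default_rng(6)

# 1-D particles with weights; we want roughly `target` particles per spatial cell.
positions = rng.beta(2, 5, size=2000)           # non-uniform spatial density
weights = np.full(positions.size, 1.0)
edges = np.linspace(0, 1, 21)                   # 20 cells
target = 100

new_pos, new_w = [], []
for lo, hi in zip(edges[:-1], edges[1:]):
    in_cell = (positions >= lo) & (positions < hi)
    p, w = positions[in_cell], weights[in_cell]
    if p.size > target:                         # merge: subsample, rescale to conserve mass
        keep = rng.choice(p.size, size=target, replace=False)
        scale = w.sum() / w[keep].sum()
        p, w = p[keep], w[keep] * scale
    elif 0 < p.size < target:                   # split: duplicate particles, divide weights
        reps = int(np.ceil(target / p.size))
        p, w = np.repeat(p, reps), np.repeat(w / reps, reps)
    new_pos.append(p)
    new_w.append(w)

positions, weights = np.concatenate(new_pos), np.concatenate(new_w)
print(positions.size, weights.sum())            # counts balanced, total weight preserved
```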
Most current risk assessment for complex extreme events relies on catalogues of similar events, either historical or generated artificially. In the latter, no existing methods produce completely new events with mathematically justified extrapolation above ...
Extreme value analysis is concerned with the modelling of extreme events such as floods and heatwaves, which can have large impacts. Statistical modelling can be useful to better assess risks even if, due to scarcity of measurements, there is inherently ver ...
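A standard ingredient of such analyses is the peaks-over-threshold method, where exceedances above a high threshold are modelled with a generalized Pareto distribution. The sketch below, using SciPy on synthetic data, is a generic illustration of that kind of modelling, not the method of the abstract.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)

# Heavy-ish tailed measurements, e.g. daily river flows.
data = rng.lognormal(mean=0.0, sigma=0.8, size=10_000)

# Peaks-over-threshold: model exceedances above a high threshold with a GPD.
u = np.quantile(data, 0.95)
exceedances = data[data > u] - u
xi, loc, beta = genpareto.fit(exceedances, floc=0.0)   # fix the location at 0

# Estimated probability of exceeding a level far above the threshold.
level = 2 * u
p_exceed = (exceedances.size / data.size) * genpareto.sf(level - u, xi, loc=0.0, scale=beta)
print(p_exceed)
```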
The increasing interest in using statistical extreme value theory to analyse environmental data is mainly driven by the large impact extreme events can have. A difficulty with spatial data is that most existing inference methods for asymptotically justifie ...