A key challenge across many disciplines is to extract meaningful information from data that is often obscured by noise. These datasets are typically represented as large matrices. Given the current trend of ever-increasing data volumes, with datasets grow ...
In inverse problems, the task is to reconstruct an unknown signal from its possibly noise-corrupted measurements. Penalized-likelihood-based estimation and Bayesian estimation are two powerful statistical paradigms for solving such problems. They ...
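To make the connection between the two paradigms concrete, the following minimal sketch (a generic illustration under standard Gaussian assumptions, not this abstract's own method; the matrix A, the variances sigma2 and tau2, and all data are invented for the example) shows that L2-penalized maximum likelihood for a linear inverse problem coincides with the Bayesian MAP estimate under a Gaussian prior.

```python
import numpy as np

# Linear inverse problem y = A x + noise, with Gaussian noise N(0, sigma2 I)
# and, on the Bayesian side, a Gaussian prior x ~ N(0, tau2 I).
rng = np.random.default_rng(0)
n, p = 50, 20
A = rng.standard_normal((n, p))
x_true = rng.standard_normal(p)
sigma2, tau2 = 0.5, 1.0
y = A @ x_true + np.sqrt(sigma2) * rng.standard_normal(n)

# Penalized maximum likelihood: argmin_x ||y - A x||^2 + lam ||x||^2,
# with the penalty weight implied by the prior, lam = sigma2 / tau2.
lam = sigma2 / tau2
x_pen = np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ y)

# Bayesian MAP (= posterior mean, since the posterior is Gaussian).
x_map = np.linalg.solve(A.T @ A / sigma2 + np.eye(p) / tau2, A.T @ y / sigma2)
assert np.allclose(x_pen, x_map)  # the two estimates coincide
```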
In certain cases of astronomical data analysis, the meaningful physical quantity to extract is the ratio R between two data sets. Examples include the lensing ratio, the interloper rate in spectroscopic redshift samples, and the decay rate of gravitational ...
Higher-order asymptotics provide accurate approximations for use in parametric statistical modelling. In this thesis, we investigate using higher-order approximations in two specific settings, with a particular emphasis on the tangent exponential model. ...
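For orientation, one standard higher-order tool (stated here as background; the thesis's exact constructions may differ) is the modified likelihood root r*, whose Gaussian tail probability approximates p-values to third order:

```latex
% Signed likelihood root r and Barndorff-Nielsen's modification r^*;
% q(\psi) is a model-dependent correction term, which in the tangent
% exponential model is built from sample-space derivatives of the
% log-likelihood \ell.
r(\psi) = \operatorname{sign}(\hat\psi - \psi)\,
          \sqrt{2\,\{\ell(\hat\psi) - \ell(\psi)\}},
\qquad
r^*(\psi) = r(\psi) + \frac{1}{r(\psi)}\,
            \log\!\left\{\frac{q(\psi)}{r(\psi)}\right\}.
```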
This thesis presents work at the interface of statistics and climate science. We first provide methodology for use by climate scientists when performing fast event attribution using extreme value theory, and then describe two interdisciplinary projects in c ...
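As a rough illustration of the extreme-value ingredients of event attribution (a generic sketch, not this thesis's methodology; the factual/counterfactual samples and all numbers are synthetic), one fits a generalised extreme value (GEV) distribution to block maxima under factual and counterfactual climates and compares the exceedance probabilities of the observed event:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual maxima (e.g. temperatures in deg C) for a factual
# climate and a cooler counterfactual one; both are invented here.
rng = np.random.default_rng(1)
factual = genextreme.rvs(c=-0.1, loc=31.0, scale=1.5, size=60, random_state=rng)
counterfactual = genextreme.rvs(c=-0.1, loc=30.0, scale=1.5, size=60, random_state=rng)
threshold = 34.0  # magnitude of the observed event

# Fit a GEV to each sample and compare exceedance probabilities; their
# ratio is the "probability ratio" reported in attribution studies.
p1 = genextreme.sf(threshold, *genextreme.fit(factual))
p0 = genextreme.sf(threshold, *genextreme.fit(counterfactual))
print("probability ratio PR = p1 / p0 =", p1 / p0)
```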
Outliers in discrete choice response data may result from misclassification and misreporting of the response variable and from choice behaviour that is inconsistent with modelling assumptions (e.g. random utility maximisation). In the presence of outliers, ...
Selection bias may arise when data have been chosen in a way that subsequent analysis does not account for. Such bias can affect climate event attribution studies that are performed rapidly after a devastating "trigger event", whose occurrence correspon ...
Universal inference enables the construction of confidence intervals and tests without regularity conditions by splitting the data into two parts and appealing to Markov's inequality. Previous investigations have shown that the cost of this generality is a ...
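A minimal sketch of the split likelihood-ratio test underlying universal inference, for the toy problem of testing a normal mean (the function name and example are illustrative, not from this abstract):

```python
import numpy as np
from scipy.stats import norm

def split_lrt(x, mu0=0.0, alpha=0.05, seed=None):
    """Universal-inference split LRT for H0: mu = mu0, data ~ N(mu, 1)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    d0, d1 = x[idx[: len(x) // 2]], x[idx[len(x) // 2 :]]
    mu_hat = d1.mean()  # unrestricted MLE computed on one half, D1
    # Split likelihood ratio evaluated on the other half, D0.
    log_t = norm.logpdf(d0, mu_hat).sum() - norm.logpdf(d0, mu0).sum()
    # E_H0[T] <= 1, so Markov's inequality gives P_H0(T >= 1/alpha) <= alpha:
    # rejecting when T >= 1/alpha is valid without regularity conditions.
    return log_t >= -np.log(alpha)

x = np.random.default_rng(2).normal(0.4, 1.0, size=200)
print("reject H0 at level 0.05:", split_lrt(x, seed=0))
```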
We study the problem of learning unknown parameters of stochastic dynamical models from data. Often, these models are high-dimensional and contain several scales and complex structures. One is then interested in obtaining a reduced, coarse-grained descript ...
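As a minimal illustration of the single-scale building block (the abstract is truncated before the thesis's actual approach, and for genuinely multiscale data such naive estimators are known to require care, e.g. subsampling), the sketch below estimates the drift of an Ornstein-Uhlenbeck model from a discretely observed path:

```python
import numpy as np

# Ornstein-Uhlenbeck model dX_t = -theta X_t dt + sigma dW_t; all
# parameter values here are invented for the example.
rng = np.random.default_rng(3)
theta_true, sigma, dt, n = 2.0, 0.5, 1e-3, 100_000
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):  # Euler-Maruyama simulation of the path
    x[i + 1] = x[i] - theta_true * x[i] * dt \
               + sigma * np.sqrt(dt) * rng.standard_normal()

# Least-squares / quasi-likelihood drift estimator from the increments:
# argmin_theta sum_i (dx_i + theta x_i dt)^2
#   => theta_hat = -sum(x dx) / (dt sum x^2).
dx = np.diff(x)
theta_hat = -np.sum(x[:-1] * dx) / (dt * np.sum(x[:-1] ** 2))
print("theta_hat =", theta_hat)  # close to theta_true = 2.0
```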
A log-concave likelihood is as important to proper statistical inference as a convex cost function is to variational optimization. Quantization is often disregarded when writing likelihood models, ignoring the limitations of the physical detectors ...
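To illustrate the point about quantization (a hedged sketch under a Gaussian-noise assumption; the function and all parameters are invented for the example): the likelihood of a quantized reading integrates the noise density over the quantization bin rather than treating the reading as exact, and the resulting log-likelihood remains log-concave in the underlying signal.

```python
import numpy as np
from scipy.stats import norm

def quantized_loglik(x, y, sigma, delta):
    """log P(Y = y | x) when the detector rounds x + N(0, sigma^2) noise
    to a grid of spacing delta. The bin probability is a difference of
    Gaussian CDFs, which is log-concave in x (Prekopa's theorem)."""
    upper = norm.cdf((y + delta / 2 - x) / sigma)
    lower = norm.cdf((y - delta / 2 - x) / sigma)
    return np.log(upper - lower)

# Log-likelihood of the reading y = 1.0 as the true signal x varies.
xs = np.linspace(-3, 3, 7)
print(quantized_loglik(xs, y=1.0, sigma=0.8, delta=0.5))
```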