Statistical Inference in Positron Emission Tomography
Related publications (40)
Covariance operators are fundamental in functional data analysis, providing the canonical means to analyse functional variation via the celebrated Karhunen-Loève expansion. These operators may themselves be subject to variation, for instance in contexts wh ...
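The Karhunen-Loève idea can be illustrated on discretized curves. The sketch below is illustrative only (plain NumPy and a hand-built sinusoidal process, not this paper's setting): after discretization, the eigendecomposition of the empirical covariance matrix plays the role of the covariance operator's spectral decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretize [0, 1] and simulate n curves from a smooth random process:
# X(t) = sum_k xi_k * sqrt(2) * sin(k*pi*t), with xi_k ~ N(0, 1/k^2).
t = np.linspace(0, 1, 100)
n, K = 200, 5
basis = np.sqrt(2) * np.sin(np.pi * np.outer(np.arange(1, K + 1), t))  # (K, 100)
scores = rng.normal(size=(n, K)) / np.arange(1, K + 1)                 # decaying variances
X = scores @ basis                                                     # (n, 100)

# The empirical covariance operator (a matrix after discretization) and
# its eigendecomposition give the Karhunen-Loève basis and variances.
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / n
evals, evecs = np.linalg.eigh(C)
evals, evecs = evals[::-1], evecs[:, ::-1]  # sort decreasing

# Since the curves were built from K basis functions, a handful of
# components should explain essentially all functional variation.
explained = evals[:K].sum() / evals.sum()
```

A production treatment would also weight the discretization by quadrature rules so that matrix eigenvalues approximate operator eigenvalues, but the qualitative picture is the same.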
Humans are comparison machines: comparing and choosing an item among a set of alternatives (such as objects or concepts) is arguably one of the most natural ways for us to express our preferences and opinions. In many applications, the analysis of data con ...
Wasserstein distances are metrics on probability distributions inspired by the problem of optimal mass transportation. Roughly speaking, they measure the minimal effort required to reconfigure the probability mass of one distribution in order to recover th ...
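For one-dimensional empirical distributions this "minimal effort" can be computed directly; a small sanity check using SciPy's `wasserstein_distance` (SciPy is an assumption here, not code from the publication):

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Two empirical distributions: the second is the first shifted by 1.
u = np.array([0.0, 1.0, 2.0])
v = u + 1.0

# For a pure translation, the 1-Wasserstein distance equals the shift:
# every unit of probability mass must be moved a distance of exactly 1.
print(wasserstein_distance(u, v))  # ≈ 1.0
```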
The increasing interest in using statistical extreme value theory to analyse environmental data is mainly driven by the large impact extreme events can have. A difficulty with spatial data is that most existing inference methods for asymptotically justifie ...
Purpose: The aims of this multicentre retrospective study of locally advanced head and neck cancer (LAHNC) treated with definitive radiotherapy were to (1) identify positron emission tomography (PET) ¹⁸F-fluorodeoxyglucose (¹⁸F-FDG) parameters correlated w ...
We consider the inference problem for parameters in stochastic differential equation models from discrete time observations (e.g. experimental or simulation data). Specifically, we study the case where one does not have access to observations of the model ...
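As a toy version of this inference problem (not the method studied in the publication), one can simulate an Ornstein-Uhlenbeck process with the Euler-Maruyama scheme and recover its drift and diffusion parameters from the discrete observations alone; the estimators below are standard least-squares and quadratic-variation sketches.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate dX = -theta*X dt + sigma dW with Euler-Maruyama.
theta, sigma, dt, n = 2.0, 0.5, 0.01, 200_000
x = np.empty(n)
x[0] = 0.0
noise = rng.normal(scale=np.sqrt(dt), size=n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * noise[i]

# Least-squares estimate of the drift: regress increments on the state.
dx = np.diff(x)
theta_hat = -np.sum(x[:-1] * dx) / (np.sum(x[:-1] ** 2) * dt)

# Quadratic-variation estimate of the diffusion coefficient.
sigma_hat = np.sqrt(np.sum(dx ** 2) / (n * dt))

print(theta_hat, sigma_hat)  # close to the true values 2.0 and 0.5
```

The abstract's harder setting, where the model state itself is not directly observed, requires filtering or likelihood approximations rather than these direct estimators.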
Most current risk assessment for complex extreme events relies on catalogues of similar events, either historical or generated artificially. In the latter, no existing methods produce completely new events with mathematically justified extrapolation above ...
Extreme events are responsible for huge material damage and are costly in terms of their human and economic impacts. They strike all facets of modern society, such as physical infrastructure and insurance companies through environmental hazards, banking an ...
Natural populations present abundant genetic variability. Different processes, such as mutation and natural selection, are at play in generating this variability. Population genetics is a topic that emerged in the late 1940s, thanks mainly to the biologists Fi ...
Focused beamformers have been extensively used in phased-array signal processing, leading to simple and efficient imaging procedures, with high sensitivity and resolution. The beamshape acts as a spatial filter, scanning the intensity of the incoming signa ...
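The spatial-filter view of a focused beamformer can be sketched with a narrowband delay-and-sum scan over a simulated uniform linear array (illustrative parameters, not the setup of the publication): steering the array over candidate angles and recording output power yields a peak at the source direction.

```python
import numpy as np

# Narrowband delay-and-sum beamformer for a uniform linear array (ULA).
# Element spacing is half a wavelength; angles are measured from broadside.
n_elems = 16
d = 0.5  # spacing in wavelengths

def steering(angle_rad):
    # Phase progression across the array for a plane wave from angle_rad.
    k = np.arange(n_elems)
    return np.exp(2j * np.pi * d * k * np.sin(angle_rad))

# Simulated snapshots: one source at +20 degrees plus white noise.
rng = np.random.default_rng(0)
true_angle = np.deg2rad(20.0)
snapshots = 200
signal = rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)
X = np.outer(steering(true_angle), signal)
X += 0.1 * (rng.normal(size=X.shape) + 1j * rng.normal(size=X.shape))

# Scan: the beamshape acts as a spatial filter, so output power peaks
# when the steering direction matches the source direction.
grid = np.deg2rad(np.linspace(-90, 90, 361))
power = [np.mean(np.abs(steering(a).conj() @ X) ** 2) for a in grid]
estimate = np.rad2deg(grid[int(np.argmax(power))])
print(estimate)  # near 20 degrees
```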