Quantifying hydrological modeling errors through a mixture of normal distributions
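As a minimal, hypothetical sketch of the titular technique (the paper's actual formulation is not reproduced here), residuals between simulated and observed streamflow can be described by a mixture of normal distributions. The synthetic residuals, the two-component choice, and the scikit-learn API below are all illustrative assumptions.

```python
# Sketch: fit a two-component normal mixture to hydrological model residuals.
# Residuals, component count, and parameters are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical residuals: a narrow "good fit" mode plus a wider error mode.
residuals = np.concatenate([
    rng.normal(0.0, 0.2, 800),   # well-modelled days
    rng.normal(0.5, 1.0, 200),   # e.g. flood peaks the model misses
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(residuals)
for w, mu, var in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    print(f"weight={w:.2f}  mean={mu:+.2f}  std={np.sqrt(var):.2f}")
```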
Displacement data modelling is of great importance for the safety control of concrete dams. Commonly used artificial intelligence methods model the displacement data at each monitoring point individually, i.e., the data correlations between the monit ...
Two characteristics that make convex decomposition algorithms attractive are simplicity of operations and generation of parallelizable structures. In principle, these schemes require that all coordinates update at the same time, i.e., they are synchronous ...
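The synchronous structure described above can be illustrated with a generic Jacobi-style coordinate update, in which every coordinate is refreshed from the same previous iterate and all updates could run in parallel. This is a textbook sketch, not the decomposition scheme studied in the paper.

```python
# Sketch of a synchronous (Jacobi-style) coordinate update for the
# quadratic f(x) = 0.5 x^T A x - b^T x; generic illustration only.
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])   # assumed SPD and diagonally dominant
b = np.array([1.0, 2.0, 3.0])

x = np.zeros(3)
for _ in range(50):
    # Every coordinate is updated from the SAME previous iterate x,
    # so all updates happen "at the same time" (synchronously).
    x = (b - (A @ x - np.diag(A) * x)) / np.diag(A)

print("x   =", x)
print("A x =", A @ x, " (should approach b)")
```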
The efficiency of stochastic particle schemes for large scale simulations relies on the ability to preserve a uniform distribution of particles in the whole physical domain. While simple particle split and merge algorithms have been considered previously, ...
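A toy version of such a split/merge step might look as follows; the 1-D setting and the weight thresholds are assumptions for illustration, not the algorithm proposed in the paper.

```python
# Toy split/merge for weighted particles (assumed minimal version of the
# "simple split and merge algorithms" mentioned above, not the paper's scheme).
import numpy as np

rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 1.0, 10)          # 1-D particle positions
w = rng.uniform(0.1, 2.0, 10)            # particle weights
w_max, w_min = 1.5, 0.3                  # illustrative thresholds

# Split: replace each heavy particle by two half-weight copies, slightly offset.
heavy = w > w_max
pos = np.concatenate([pos[~heavy], pos[heavy] - 1e-3, pos[heavy] + 1e-3])
w = np.concatenate([w[~heavy], w[heavy] / 2, w[heavy] / 2])

# Merge: combine the two lightest particles into one at their
# weight-averaged position (conserves total weight and mean position).
while (w < w_min).sum() >= 2:
    i, j = np.argsort(w)[:2]
    new_pos = (w[i] * pos[i] + w[j] * pos[j]) / (w[i] + w[j])
    new_w = w[i] + w[j]
    keep = np.ones(len(w), dtype=bool)
    keep[[i, j]] = False
    pos = np.append(pos[keep], new_pos)
    w = np.append(w[keep], new_w)

print(f"{len(w)} particles, total weight {w.sum():.3f}")
```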
Nuclear thermal-hydraulics (TH) system codes use several parametrized physical or empirical models to describe complex two-phase flow phenomena. The reliability of their predictions is therefore primarily affected by the uncertainty associated with the param ...
A persistent obstacle for constructing kinetic models of metabolism is uncertainty in the kinetic properties of enzymes. Currently, available methods for building kinetic models can cope indirectly with uncertainties by integrating data from different biol ...
Extreme value analysis is concerned with the modelling of extreme events such as floods and heatwaves, which can have large impacts. Statistical modelling can be useful to better assess risks even if, due to scarcity of measurements, there is inherently ver ...
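As a minimal illustration of extreme value analysis (not drawn from the paper), one can fit a generalized extreme value distribution to annual maxima and read off a return level with scipy; the synthetic data and GEV parameters below are assumptions.

```python
# Sketch: fit a GEV to synthetic annual maxima, estimate a 100-year level.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
# Hypothetical annual-maximum river discharges (m^3/s).
annual_maxima = genextreme.rvs(c=-0.1, loc=100, scale=20, size=60,
                               random_state=rng)

c, loc, scale = genextreme.fit(annual_maxima)
rl_100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(f"shape={c:.2f} loc={loc:.1f} scale={scale:.1f}  "
      f"100-yr level ≈ {rl_100:.1f}")
```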
A novel approach is presented for constructing polynomial chaos representations of scalar quantities of interest (QoI) that extends previously developed methods for adaptation in Homogeneous Chaos spaces. In this work, we develop a Bayesian formulation of ...
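A bare-bones 1-D Hermite chaos expansion shows what a polynomial chaos representation of a QoI looks like; the QoI, expansion order, and quadrature rule below are illustrative assumptions, and the paper's Bayesian adaptation is not reproduced.

```python
# Sketch: 1-D Hermite chaos for a QoI g(xi), xi ~ N(0, 1).
# Coefficients c_k = E[g(xi) He_k(xi)] / k!  via Gauss-Hermite quadrature.
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

g = np.exp                                   # hypothetical QoI, g(xi) = e^xi
order = 6
x, wq = hermegauss(32)                       # weight function exp(-x^2 / 2)
wq = wq / np.sqrt(2 * np.pi)                 # normalise to the N(0,1) density

coeffs = np.array([
    np.sum(wq * g(x) * hermeval(x, [0] * k + [1])) / math.factorial(k)
    for k in range(order + 1)
])

xi = 0.5
print("PCE :", hermeval(xi, coeffs))         # chaos expansion at xi
print("true:", g(xi))
```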
Rigorous back analysis plays a major role in providing information to engineers for better decision-making. Most research on this topic has focused on optimization techniques. Comparative studies of data interpretation methodologies have seldom been report ...
We introduce an online outlier detection algorithm to detect outliers in a sequentially observed data stream. For this purpose, we use a two-stage filtering and hedging approach. In the first stage, we construct a multimodal probability density function to ...
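The first stage can be sketched, under assumptions, as a sliding-window kernel density estimate that flags low-density points; the window size and threshold below are hypothetical, and the hedging stage is omitted entirely.

```python
# Assumed simplification of the first stage only: score each incoming
# point against a multimodal density estimated from a sliding window.
from collections import deque
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
window = deque(maxlen=200)          # sliding window of recent samples
tau = 0.02                          # illustrative density threshold

def observe(x):
    """Return True if x looks like an outlier under the windowed KDE."""
    is_outlier = False
    if len(window) > 20:            # wait until the density is estimable
        kde = gaussian_kde(np.asarray(window))
        is_outlier = kde(x)[0] < tau
    window.append(x)
    return is_outlier

# Bimodal nominal stream with an injected outlier at the end.
stream = np.concatenate([rng.normal(-2, 0.3, 150), rng.normal(2, 0.3, 150)])
rng.shuffle(stream)
for x in list(stream) + [10.0]:     # 10.0 is the planted outlier
    if observe(x):
        print(f"outlier flagged: {x:.2f}")
```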
A well-known first-order method for sampling from log-concave probability distributions is the Unadjusted Langevin Algorithm (ULA). This work proposes a new annealing step-size schedule for ULA, which makes it possible to prove new convergence guarantees for sampling ...
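A generic ULA sketch with a decaying step size conveys the basic iteration x_{k+1} = x_k - gamma_k * grad U(x_k) + sqrt(2 gamma_k) * xi_k; the 1/sqrt(k) schedule used here is an assumption for illustration, not the annealing schedule proposed in the work.

```python
# Sketch: ULA with a decaying step size, targeting N(0, 1),
# i.e. U(x) = x^2 / 2 and grad U(x) = x (log-concave target).
import numpy as np

rng = np.random.default_rng(4)
grad_U = lambda x: x                # gradient of the potential
x, samples = 5.0, []                # deliberately bad starting point

for k in range(20000):
    gamma = 0.5 / np.sqrt(k + 1)    # assumed decaying step-size schedule
    x = x - gamma * grad_U(x) + np.sqrt(2 * gamma) * rng.standard_normal()
    samples.append(x)

tail = np.array(samples[10000:])    # discard burn-in
print(f"mean ≈ {tail.mean():.2f}  var ≈ {tail.var():.2f}  (target: 0, 1)")
```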