Publication

A Methodology for Global Sensitivity Analysis for Time-Dependent Code Output Applied to Reflood Simulation in TRACE

Abstract

Properly understanding the impact of model parameters and their interactions on model predictions is required for an appropriate model assessment. For simulations of reflooding following a loss-of-coolant accident (LOCA), this requirement is justified because the important parameters affecting the predictions often cannot be measured and should be considered uncertain. Moreover, different process representations in the model may have a strong influence at different times during the transient. These issues complicate the task of model assessment. In this work, a global sensitivity analysis (GSA) methodology tailored to transient code output is developed, based on the Morris and Sobol' methods. The Morris method is used to screen out non-influential parameters, which allows larger samples to be generated for the Sobol' method with fewer code runs. The Sobol' method yields global sensitivity indices that quantify the contribution of input variations to the output variation, taking interactions among the inputs into account. Functional data analysis (FDA) techniques are then used to post-process the output, deriving new quantities of interest (QoIs) that describe the overall functional variation. The methodology was successfully applied to a reflood simulation model with 26 input parameters in the thermal-hydraulics (TH) system code TRACE. Complementing the conventional QoIs (such as the maximum temperature and the time of quenching), the FDA-derived quantities gave deeper insight into particular modes of functional variation and attributed them to variations of the model parameters. The temperature transient was divided into two phases: the ramp phase and the descent phase. During the ramp phase, the model was additive with respect to the parameter variations, with parameters related to dispersed flow film boiling (DFFB) explaining most of the functional output variation. In contrast, the variation during the descent phase could only be explained through parameter interactions, indicating the non-identifiability of the model for that phase of the transient with respect to the temperature response.
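As an illustration only: the two-stage workflow described in the abstract can be sketched with the open-source SALib package. The stand-in model run_trace_model, the parameter names, the bounds, and the screening threshold below are hypothetical placeholders, not the paper's 26 TRACE parameters or its actual setup.

import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris as morris_analyze
from SALib.sample import saltelli
from SALib.analyze import sobol

def run_trace_model(x):
    # Hypothetical stand-in for one TRACE reflood run: returns a scalar
    # QoI (e.g., maximum clad temperature) for one parameter vector.
    return float(np.sum(x ** 2))

problem = {
    "num_vars": 4,                      # the paper considers 26 parameters
    "names": ["p1", "p2", "p3", "p4"],  # illustrative names
    "bounds": [[0.5, 1.5]] * 4,         # illustrative uncertainty ranges
}

# Stage 1: Morris screening. The elementary-effects statistic mu* gives a
# cheap influence ranking; parameters with small mu* are screened out.
X_m = morris_sample.sample(problem, N=50, num_levels=4)
Y_m = np.array([run_trace_model(x) for x in X_m])
res_m = morris_analyze.analyze(problem, X_m, Y_m, num_levels=4)
keep = [n for n, m in zip(problem["names"], res_m["mu_star"]) if m > 0.1]

# Stage 2: Sobol' indices on the reduced parameter set (in a real study the
# screened-out parameters are held at nominal values). S1 measures main
# effects; a total-order index ST well above S1 signals that interactions
# dominate, as reported for the descent phase of the reflood transient.
reduced = {"num_vars": len(keep), "names": keep,
           "bounds": [[0.5, 1.5]] * len(keep)}
X_s = saltelli.sample(reduced, 1024)
Y_s = np.array([run_trace_model(x) for x in X_s])
res_s = sobol.analyze(reduced, Y_s)
print(dict(zip(keep, res_s["S1"])))
print(dict(zip(keep, res_s["ST"])))

In the paper's functional setting, an FDA step would sit between the code runs and the Sobol' analysis: each simulated temperature transient is projected onto its principal components, and the component scores then serve as the scalar QoIs in place of Y_s above.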

Related concepts
Total variation denoising
In signal processing, particularly image processing, total variation denoising, also known as total variation regularization or total variation filtering, is a noise removal process (filter). It is based on the principle that signals with excessive and possibly spurious detail have high total variation, that is, the integral of the absolute gradient of the signal is high. According to this principle, reducing the total variation of the signal, subject to it being a close match to the original signal, removes unwanted detail whilst preserving important details such as edges.
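For reference, a common discrete one-dimensional formulation (in the spirit of Rudin, Osher, and Fatemi) balances fidelity to the noisy signal y against the total variation of the estimate x through a regularization weight lambda >= 0:

\min_{x} \; \frac{1}{2} \sum_{i=1}^{n} (y_i - x_i)^2 + \lambda \sum_{i=1}^{n-1} \lvert x_{i+1} - x_i \rvert

Larger values of lambda remove more detail; smaller values keep x closer to y.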
Total variation
In mathematics, the total variation identifies several slightly different concepts, related to the (local or global) structure of the codomain of a function or a measure. For a real-valued continuous function f, defined on an interval [a, b] ⊂ R, its total variation on the interval of definition is a measure of the one-dimensional arclength of the curve with parametric equation x ↦ f(x), for x ∈ [a, b]. Functions whose total variation is finite are called functions of bounded variation.
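Concretely, for a real-valued f on [a, b] the total variation is the supremum over all partitions a = x_0 < x_1 < ... < x_n = b:

V_a^b(f) = \sup_{P} \sum_{i=0}^{n-1} \lvert f(x_{i+1}) - f(x_i) \rvert

so f has bounded variation exactly when this supremum is finite.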
Simple random sample
In statistics, a simple random sample (or SRS) is a subset of individuals (a sample) chosen from a larger set (a population) in which each individual is chosen randomly, all with the same probability. It is a process of selecting a sample in a random way. In SRS, each subset of k individuals has the same probability of being chosen for the sample as any other subset of k individuals. A simple random sample is an unbiased sampling technique. Simple random sampling is a basic type of sampling and can be a component of other more complex sampling methods.
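A minimal sketch of drawing an SRS without replacement, using only the Python standard library; the population of 100 labeled units is a hypothetical example:

import random

population = list(range(1, 101))       # hypothetical population of 100 units
srs = random.sample(population, k=10)  # draw 10 units without replacement
print(sorted(srs))

Because random.sample selects without replacement and uniformly at random, each of the C(100, 10) possible subsets has the same probability of being drawn, which is exactly the SRS property stated above.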
Related publications

Quantifying the Unknown: Data-Driven Approaches and Applications in Energy Systems

Paul Scharnhorst

In light of the challenges posed by climate change and the goals of the Paris Agreement, electricity generation is shifting to a more renewable and decentralized pattern, while the operation of systems like buildings is increasingly electrified. This calls ...
EPFL, 2024

Traversing Time Dependent Light Fields for Daylight Glare Evaluation

Stephen William Wasilewski

To understand how daylight gives shape and life to architectural spaces, whether existing or imagined, requires quantifying its dynamism and energy. Maintaining these details presents a challenge to simulation and analysis methods that flatten data into di ...
EPFL, 2023

Meta-Learners for Estimation of Causal Effects: Finite Sample Cross-Fit Performance

Gabriel Okasa

Estimation of causal effects using machine learning methods has become an active research field in econometrics. In this paper, we study the finite sample performance of meta-learners for estimation of heterogeneous treatment effects under the usage of sam ...
2022
Related MOOCs
Digital Signal Processing I
Basic signal processing concepts, Fourier analysis, and filters. This module can be used as a starting point or a basic refresher in elementary DSP.
Digital Signal Processing II
Adaptive signal processing, A/D and D/A. This module provides the basic tools for adaptive filtering and a solid mathematical framework for sampling and quantization.
Digital Signal Processing III
Advanced topics: this module covers real-time audio processing (with examples on a hardware board), image processing and communication system design.