Information collected through sensor measurements has the potential to improve knowledge of complex-system behavior, leading to better decisions related to system management. Particularly when digital twins are used, the quality of sensor data determines how much sensors improve decision-making. The choice of the monitoring system, including sensor types and their configuration, is typically made using engineering judgement alone. Because sensor devices are usually inexpensive, large sensor networks are commonly deployed. As sensors often monitor at high frequencies over long periods, very large data sets are collected. However, model predictions of system behavior are often influenced by only a few parameters. Informative measurements are thus difficult to extract, as they are often hidden amid redundant and otherwise irrelevant data when key parameter values are updated. This study presents a methodology for selecting informative measurements within large data sets for a given model-updating task. By selecting the smallest measurement set that maximizes the information gain, data sets can be significantly refined, leading to increased data-interpretation efficiency. Results of an excavation case study show that refined measurement sets, although much smaller than the full data set, yield higher information gains than the unrefined data set for the same probability of identification, while significantly reducing the computational time of model updating. This methodology thus supports engineers in filtering data substantially to improve model-updating performance.
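The abstract does not give implementation detail, so the following is a minimal illustrative Python sketch of one way such a refinement could work: greedy forward selection of measurement locations, where each candidate is scored by the expected reduction in Shannon entropy over a discrete set of candidate model instances (a falsification-style criterion with a ±3σ threshold), and selection stops when no further gain is expected. The function names, synthetic predictions, noise level, and falsification threshold are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

np.random.seed(0)


def expected_information_gain(predictions, noise_std, selected, candidate):
    """Expected entropy reduction (bits) from adding `candidate` to `selected`.

    `predictions` is an (n_models, n_locations) array of model-instance
    predictions. Each model instance is taken in turn as the "true" one,
    a noisy measurement is simulated at the chosen locations, and instances
    whose predictions fall outside +/- 3 sigma of the simulated values are
    falsified. The entropy of the surviving (uniform) candidate-model set
    measures the remaining parameter uncertainty. (Illustrative criterion,
    not the paper's exact formulation.)
    """
    locations = selected + [candidate]
    n_models = predictions.shape[0]
    prior_entropy = np.log2(n_models)  # uniform prior over model instances
    gains = []
    for true_model in range(n_models):
        simulated = (predictions[true_model, locations]
                     + np.random.normal(0.0, noise_std, size=len(locations)))
        residuals = np.abs(predictions[:, locations] - simulated)
        surviving = np.all(residuals <= 3.0 * noise_std, axis=1)
        n_surviving = max(int(surviving.sum()), 1)
        gains.append(prior_entropy - np.log2(n_surviving))
    return float(np.mean(gains))


def greedy_measurement_selection(predictions, noise_std, max_sensors):
    """Greedily pick measurement locations with the largest expected gain."""
    selected = []
    remaining = list(range(predictions.shape[1]))
    for _ in range(max_sensors):
        if not remaining:
            break
        scores = {loc: expected_information_gain(predictions, noise_std,
                                                 selected, loc)
                  for loc in remaining}
        best = max(scores, key=scores.get)
        if scores[best] <= 0.0:  # no further expected gain: stop early
            break
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    # Synthetic example: 50 candidate model instances, 20 possible locations.
    model_predictions = np.random.normal(size=(50, 20))
    chosen = greedy_measurement_selection(model_predictions,
                                          noise_std=0.5, max_sensors=5)
    print("Selected measurement locations:", chosen)
```

In this sketch the refined set is typically much smaller than the full set of candidate locations, which mirrors the abstract's claim that a small, carefully chosen measurement subset can match or exceed the information gain of the complete data set at lower computational cost.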
Ian Smith, Numa Joy Bertola, Sai Ganesh Sarvotham Pai