Publication

MATHICSE Technical Report: Non-intrusive double-greedy parametric model reduction by interpolation of frequency-domain rational surrogates

Fabio Nobile, Davide Pradovera
2020
Report or working paper
Abstract

We propose a model order reduction approach for non-intrusive surrogate modeling of parametric dynamical systems. The reduced model over the whole parameter space is built by combining frequency-domain surrogates constructed at a few selected parameter values. This, in particular, requires matching the respective poles by solving an optimization problem. We detail how to treat unbalanced cases, where the surrogates to be combined have different numbers of poles. If the frequency surrogates are constructed by minimal rational interpolation, both frequency and parameters can be sampled in a greedy fashion, by employing a fully non-intrusive "look-ahead" strategy. We explain how the proposed technique can be applied even in a high-dimensional setting, by employing locally refined sparse grids to mitigate the curse of dimensionality. Numerical examples showcase the effectiveness of the method and highlight some of its limitations in dealing with unbalanced matching, as well as with a large number of parameters.
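
As a purely illustrative aside (not the authors' algorithm), the pole-matching step mentioned in the abstract can be viewed as a linear assignment problem between the pole sets of two frequency surrogates built at nearby parameter values. The sketch below pairs poles by minimizing the total pairwise distance with SciPy's Hungarian-algorithm solver; the function name match_poles and the toy pole values are hypothetical.

# Illustrative sketch: match the poles of two frequency-domain rational
# surrogates by solving a linear assignment problem on their pairwise
# distances. This is a stand-in for the optimization problem mentioned
# in the abstract, not the method described in the report.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_poles(poles_a, poles_b):
    # cost[i, j] = distance between the i-th pole of surrogate A and
    # the j-th pole of surrogate B
    cost = np.abs(poles_a[:, None] - poles_b[None, :])
    rows, cols = linear_sum_assignment(cost)
    # Note: for unbalanced cases (different numbers of poles), this
    # solver simply leaves the excess poles unmatched; the report
    # discusses dedicated strategies for that situation.
    return rows, cols, cost[rows, cols].sum()

# Toy usage: poles of two nearby surrogates, one slightly perturbed.
poles_a = np.array([1.0 + 2.0j, -0.5 + 3.0j, 0.2 + 5.0j])
poles_b = np.array([0.25 + 4.9j, 1.1 + 2.1j, -0.45 + 3.2j])
rows, cols, total = match_poles(poles_a, poles_b)
print(list(zip(rows, cols)), total)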

Related concepts (32)
Optimization problem
In mathematics, computer science and economics, an optimization problem is the problem of finding the best solution from all feasible solutions. Optimization problems can be divided into two categories, depending on whether the variables are continuous or discrete: an optimization problem with discrete variables is known as a discrete optimization problem, in which an object such as an integer, permutation or graph must be found from a countable set.
Convex optimization
Convex optimization is a subfield of mathematical optimization that studies the problem of minimizing convex functions over convex sets (or, equivalently, maximizing concave functions over convex sets). Many classes of convex optimization problems admit polynomial-time algorithms, whereas mathematical optimization is in general NP-hard.
Combinatorial optimization
Combinatorial optimization is a subfield of mathematical optimization that consists of finding an optimal object from a finite set of objects, where the set of feasible solutions is discrete or can be reduced to a discrete set. Typical combinatorial optimization problems are the travelling salesman problem ("TSP"), the minimum spanning tree problem ("MST"), and the knapsack problem. In many such problems, exhaustive search is not tractable, so one must instead resort to specialized algorithms that quickly rule out large parts of the search space, or to approximation algorithms; a minimal example of one such problem is sketched below.
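
As a small worked illustration of a combinatorial optimization problem (not tied to the publication above), the snippet below solves the 0/1 knapsack problem with the classic dynamic program; the item values, weights, and capacity are made up for the example.

# Illustrative 0/1 knapsack: choose items maximizing total value
# subject to a weight capacity, via the standard dynamic program.
def knapsack(values, weights, capacity):
    # best[c] = maximum total value achievable with weight budget c
    best = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # iterate capacities downward so each item is used at most once
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

# Example: three items with values 6, 10, 12 and weights 1, 2, 3.
print(knapsack(values=[6, 10, 12], weights=[1, 2, 3], capacity=5))  # -> 22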
Related publications (58)

Relaxing the Additivity Constraints in Decentralized No-Regret High-Dimensional Bayesian Optimization

Patrick Thiran

Bayesian Optimization (BO) is typically used to optimize an unknown function f that is noisy and costly to evaluate, by exploiting an acquisition function that must be maximized at each optimization step. Even if provably asymptotically optimal BO algorith ...
2024

Augmented Lagrangian Methods for Provable and Scalable Machine Learning

Mehmet Fatih Sahin

Non-convex constrained optimization problems have become a powerful framework for modeling a wide range of machine learning problems, with applications in k-means clustering, large-scale semidefinite programs (SDPs), and various other tasks. As the perfor ...
EPFL, 2023

Development of an optimization tool for the electricity production of the Enguri power plant in Georgia

Louise Hui Lin Vernet

The ongoing global warming situation has bolstered interest in developing and reinforcing green energy. One of the most promising fields is hydropower (Ahmad and Hossain, 2020; Yazdi and Moridi, 2018). Many existing reservoirs have untapped potential t ...
2023
Related MOOCs (17)
Introduction to optimization on smooth manifolds: first order methods
Learn to optimize on smooth, nonlinear spaces: Join us to build your foundations (starting at "what is a manifold?") and confidently implement your first algorithm (Riemannian gradient descent).
Optimization: principles and algorithms - Linear optimization
Introduction to linear optimization, duality and the simplex algorithm.