SFU MOCAD Seminar: Stefania Fresca
Topic
Handling geometric variability and multi-scale optimization in surrogate models
Speakers
Stefania Fresca
Details
Solving differential problems using full order models (FOMs), such as the finite element method, can result in prohibitive computational costs, particularly in real-time simulations and multi-query routines. Surrogate modeling aims to replace FOMs with models of much lower complexity that are still able to express the physical features of the system under investigation.
In many applications, the available data are inherently multi-resolution, either due to geometric variability, where solutions are defined on parametrized domains, or due to the need to capture phenomena across different spatial scales. Motivated by this observation, two complementary approaches to surrogate modeling for parametrized PDEs are introduced and analyzed.
First, Continuous Geometry-Aware DL-ROMs (CGA-DL-ROMs) are introduced. The space-continuous formulation of the proposed architecture makes it possible to handle multi-resolution datasets, which commonly arise in the presence of geometrical parametrizations. Furthermore, CGA-DL-ROMs are endowed with a strong inductive bias that explicitly accounts for geometric parameters, allowing the distinct impact of geometric variability on the solution manifold to be captured. This geometrical awareness leads to improved compression properties and enhanced overall performance of the surrogate model.
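The idea of a space-continuous, geometry-aware decoder can be illustrated with a minimal sketch (not the actual CGA-DL-ROM architecture): a coordinate-based network that takes a spatial point together with the geometric parameters as input. Because it is queried at raw coordinates rather than on a fixed grid, the same model evaluates on meshes of any resolution; all names and sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

class ContinuousDecoder:
    """Toy coordinate-based decoder: maps (spatial point x, geometric
    parameters mu) to a solution value. Taking raw coordinates as input
    is what allows evaluation on meshes of arbitrary resolution."""

    def __init__(self, dim_x, dim_mu, width=32):
        # one hidden layer is enough to illustrate the structure
        self.W1 = rng.standard_normal((dim_x + dim_mu, width)) * 0.1
        self.b1 = np.zeros(width)
        self.W2 = rng.standard_normal((width, 1)) * 0.1
        self.b2 = np.zeros(1)

    def __call__(self, x, mu):
        # concatenating the geometric parameters mu to every spatial
        # point is a simple form of the geometric inductive bias
        mu_rep = np.broadcast_to(mu, (x.shape[0], mu.shape[0]))
        z = np.concatenate([x, mu_rep], axis=1)
        h = np.tanh(z @ self.W1 + self.b1)
        return h @ self.W2 + self.b2

# The same trained weights can be queried on a coarse or a fine mesh:
decoder = ContinuousDecoder(dim_x=2, dim_mu=3)
mu = np.array([0.5, 0.2, 0.1])          # geometric parameters
coarse = decoder(rng.standard_normal((10, 2)), mu)    # 10-point mesh
fine = decoder(rng.standard_normal((100, 2)), mu)     # 100-point mesh
```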
Second, a Multi-Level Monte Carlo (MLMC) training strategy for operator learning is proposed, exploiting hierarchies of function discretizations at different resolutions. The approach combines inexpensive gradient estimates obtained from coarse-resolution data with corrective contributions from a limited number of fine-resolution samples, thereby reducing the overall training cost while preserving accuracy. The MLMC training framework is architecture-agnostic and applicable to any architecture capable of handling multi-resolution data. Numerical experiments highlight a Pareto trade-off between accuracy and computational cost, governed by the distribution of samples across resolution levels.
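The gradient combination described above can be sketched as a two-level Monte Carlo estimator (a simplification of the general multi-level scheme; the function names and the `restrict` operator are assumptions for illustration): many cheap coarse-resolution gradient samples plus a correction term averaged over a few paired fine-resolution samples.

```python
import numpy as np

def mlmc_gradient(grad_fn, coarse_batch, fine_batch, restrict):
    """Two-level Monte Carlo gradient estimate.

    grad_fn(sample, level) -> gradient vector computed at the given
        resolution level (0 = coarse, 1 = fine).
    restrict(sample) -> the coarse-resolution version of a fine sample,
        so the correction is evaluated on paired discretizations.
    """
    # Level 0: cheap estimate averaged over many coarse samples
    g0 = np.mean([grad_fn(s, level=0) for s in coarse_batch], axis=0)
    # Level 1 correction: few fine samples, each paired with its
    # coarse restriction so only the resolution difference remains
    corr = np.mean(
        [grad_fn(s, level=1) - grad_fn(restrict(s), level=0)
         for s in fine_batch],
        axis=0,
    )
    return g0 + corr

# Toy check with a synthetic gradient whose value depends on resolution:
def toy_grad(s, level):
    return s * (1.0 + 0.1 * level)   # fine level shifts the gradient

estimate = mlmc_gradient(
    toy_grad,
    coarse_batch=[np.array([1.0]), np.array([3.0])],  # mean grad = 2.0
    fine_batch=[np.array([2.0])],                      # correction = 0.2
    restrict=lambda s: s,
)
```

Shifting samples between the coarse and fine batches moves the estimator along the accuracy/cost Pareto front mentioned in the abstract: more fine samples reduce the variance of the correction at higher cost.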