About this Abstract

Meeting: MS&T25: Materials Science & Technology
Symposium: Advances in Multiphysics Modeling and Multi-Modal Imaging of Functional Materials
Presentation Title: Operator Learning Neural Scaling and Distributed Applications
Author(s): Zecheng Zhang, Wenjing Liao, Hayden Schaeffer, Hao Liu, Guang Lin
On-Site Speaker (Planned): Zecheng Zhang
Abstract Scope:
In this talk, we will explore mathematical and scientific machine learning, with a particular focus on operator learning, a framework for approximating mappings between function spaces with broad applications to PDE-related problems. We will begin with the mathematical foundations of operator approximation, which inform the design of neural network architectures and provide a basis for analyzing the performance of trained models on test samples. Specifically, we will introduce the neural scaling law, which characterizes how approximation error converges with network size and how generalization error converges with training dataset size. Building on these theoretical insights, we will present a distributed learning algorithm that addresses a key computational challenge: efficiently handling heterogeneous problems whose input functions exhibit vastly different properties. Such multiscale problems typically demand substantial computational resources to capture fine-scale details, but our distributed approach enables efficient training with improved accuracy.
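To make the operator learning setting concrete, the sketch below shows a minimal DeepONet-style model in PyTorch trained on a toy antiderivative operator. It is an illustrative example only, not the presenters' implementation: the sensor count, layer widths, toy operator, and training setup are all assumptions chosen for brevity.

```python
# Minimal sketch of operator learning (DeepONet style): approximate a mapping
# G from an input function u to an output function G(u), evaluated at query y.
# All architecture and data choices here are illustrative assumptions.
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    """Branch net encodes u sampled at fixed sensors; trunk net encodes the
    query coordinate y; their inner product approximates G(u)(y)."""
    def __init__(self, n_sensors=100, width=64, p=32):
        super().__init__()
        self.branch = nn.Sequential(nn.Linear(n_sensors, width), nn.Tanh(),
                                    nn.Linear(width, p))
        self.trunk = nn.Sequential(nn.Linear(1, width), nn.Tanh(),
                                   nn.Linear(width, p))

    def forward(self, u_sensors, y):
        # u_sensors: (batch, n_sensors) function values; y: (batch, 1) queries
        return (self.branch(u_sensors) * self.trunk(y)).sum(dim=-1, keepdim=True)

# Toy data: the antiderivative operator G(u)(y) = int_0^y u(t) dt applied to
# random sine inputs, standing in for a more realistic PDE solution operator.
torch.manual_seed(0)
n_sensors, n_train = 100, 2000
x = torch.linspace(0, 1, n_sensors)
freq = 2 * torch.pi * (0.5 + torch.rand(n_train, 1))  # avoid near-zero freq
u = torch.sin(freq * x)                        # (n_train, n_sensors) inputs
y = torch.rand(n_train, 1)                     # query points in [0, 1)
Gu = (1 - torch.cos(freq * y)) / freq          # exact antiderivative at y

model = DeepONet(n_sensors)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(u, y), Gu)
    loss.backward()
    opt.step()
```

In this toy setting, the quantities the scaling law refers to are explicit: network size is controlled by the widths and the latent dimension p, and the training dataset size by n_train, so one can empirically track how test error changes as either is varied.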