Article Source
- Learning operators using deep neural networks for multiphysics, multiscale, & multifidelity problems
- e-Seminar on Scientific Machine Learning
- Speaker: Prof. Lu Lu (University of Pennsylvania)
Abstract
It is widely known that neural networks (NNs) are universal approximators of continuous functions. However, a lesser-known but powerful result is that an NN can accurately approximate any nonlinear continuous operator. This universal approximation theorem for operators is suggestive of the structure and potential of deep neural networks (DNNs) in learning continuous operators or complex systems from streams of scattered data. In this talk, I will present the deep operator network (DeepONet) for learning various explicit operators, such as integrals and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations. I will also present several extensions of DeepONet, such as DeepM&Mnet for multiphysics problems, DeepONet with proper orthogonal decomposition (POD-DeepONet), MIONet for multiple-input operators, and multifidelity DeepONet. More generally, DeepONet can learn multiscale operators spanning many scales and can be trained simultaneously on diverse sources of data. I will demonstrate the effectiveness of DeepONet and its extensions on diverse multiphysics and multiscale problems, such as nanoscale heat transport, bubble growth dynamics, high-speed boundary layers, electroconvection, and hypersonics.
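To make the DeepONet idea concrete: the network approximates an operator G by combining a "branch" net, which encodes the input function u sampled at m fixed sensor locations, with a "trunk" net, which encodes the query point y, via a dot product of their p-dimensional outputs. The sketch below is a minimal, untrained forward pass in NumPy; the layer sizes, initialization, and helper names (`mlp_params`, `deeponet`) are illustrative assumptions, not the speaker's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_params(sizes, rng):
    # Random weights for a small MLP (illustrative initialization only).
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    # tanh hidden layers, linear output layer.
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

m = 50   # number of sensors sampling the input function u
p = 20   # latent dimension shared by branch and trunk nets
branch = mlp_params([m, 64, p], rng)  # encodes u(x_1), ..., u(x_m)
trunk = mlp_params([1, 64, p], rng)   # encodes the query location y
b0 = 0.0                              # output bias

def deeponet(u_sensors, y):
    # G(u)(y) ~ sum_k branch_k(u) * trunk_k(y) + bias
    bk = mlp(branch, u_sensors)        # shape (p,)
    tk = mlp(trunk, np.atleast_2d(y))  # shape (1, p)
    return float(tk @ bk + b0)

# Evaluate the (untrained) operator on u(x) = sin(pi x) at y = 0.5.
xs = np.linspace(0, 1, m)
out = deeponet(np.sin(np.pi * xs), np.array([0.5]))
```

In training, the branch and trunk weights would be fit jointly on pairs of input functions and operator outputs; extensions like POD-DeepONet replace the trunk basis with precomputed POD modes.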