Stop Thinking, Just Do!

Sungsoo Kim's Blog

Approximating functions, functionals and operators


13 January 2025


Approximating functions, functionals and operators with neural networks

Abstract

George Karniadakis: Approximating functions, functionals and operators with neural networks for diverse applications

We will review physics-informed neural networks and summarize available extensions for applications in computational mechanics and beyond. We will also introduce new NNs that learn functionals and nonlinear operators from functions and their corresponding responses, for system identification. The universal approximation theorem for operators suggests the potential of NNs to learn any continuous operator or complex system from scattered data. We first generalize the theorem to deep neural networks, and subsequently apply it to design a new composite NN with small generalization error, the deep operator network (DeepONet), consisting of one NN that encodes the discrete input function space (branch net) and another that encodes the domain of the output functions (trunk net). We demonstrate that DeepONet can learn various explicit operators, e.g., integrals, Laplace transforms, and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations. More generally, DeepONet can learn multiscale operators spanning many scales, trained simultaneously on diverse sources of data.
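The branch/trunk split described above is concrete enough to sketch in code. Below is a minimal, illustrative PyTorch sketch, not the authors' reference implementation: the layer widths, the number of sensor points `m`, and the antiderivative training example are assumptions made here for demonstration. The branch net encodes an input function u sampled at m fixed sensors, the trunk net encodes a query point y in the output domain, and a dot product of the two encodings approximates G(u)(y).

```python
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    """Minimal DeepONet sketch: the branch net encodes the input function u
    (sampled at m fixed sensor points), the trunk net encodes a query point y
    in the output domain, and a dot product merges the two encodings."""

    def __init__(self, m: int = 100, p: int = 40, width: int = 64):
        super().__init__()
        # Branch net: discretized input function u(x_1), ..., u(x_m) -> p features
        self.branch = nn.Sequential(
            nn.Linear(m, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        # Trunk net: query location y -> p features
        self.trunk = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, p), nn.Tanh(),
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # u: (batch, m) sensor values; y: (batch, 1) query points
        b = self.branch(u)   # (batch, p)
        t = self.trunk(y)    # (batch, p)
        # G(u)(y) ~ sum_k b_k(u) * t_k(y) + bias
        return (b * t).sum(dim=-1, keepdim=True) + self.bias

# Toy usage: learn the antiderivative operator G(u)(y) = integral_0^y u(x) dx,
# with input functions drawn here as random sine waves (an assumed example).
if __name__ == "__main__":
    m, n = 100, 512
    x = torch.linspace(0.0, 1.0, m)
    freq = torch.rand(n, 1) * 5.0 + 1.0
    u = torch.sin(freq * x)                       # (n, m) sampled input functions
    y = torch.rand(n, 1)                          # random query points in [0, 1]
    target = (1.0 - torch.cos(freq * y)) / freq   # exact antiderivative of sin(f x)
    model = DeepONet(m=m)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(2000):
        opt.zero_grad()
        loss = ((model(u, y) - target) ** 2).mean()
        loss.backward()
        opt.step()
```

The dot-product merge is what lets one trained network be evaluated at arbitrary output locations y, independent of the fixed sensor grid used for the input function.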

Bio

George Karniadakis is from Crete. He received his S.M. and Ph.D. from the Massachusetts Institute of Technology (1984 and 1987, respectively). He was appointed Lecturer in the Department of Mechanical Engineering at MIT and subsequently joined the Center for Turbulence Research at Stanford/NASA Ames. He then joined Princeton University as Assistant Professor in the Department of Mechanical and Aerospace Engineering and as Associated Faculty in the Program in Applied and Computational Mathematics. He was a Visiting Professor in the Aeronautics Department at Caltech in 1993 and joined Brown University as Associate Professor of Applied Mathematics in the Center for Fluid Mechanics in 1994. After becoming a full professor in 1996, he continued to serve as a Visiting Professor and Senior Lecturer of Ocean/Mechanical Engineering at MIT. He is an AAAS Fellow (2018-), a Fellow of the Society for Industrial and Applied Mathematics (SIAM, 2010-), a Fellow of the American Physical Society (APS, 2004-), a Fellow of the American Society of Mechanical Engineers (ASME, 2003-), and an Associate Fellow of the American Institute of Aeronautics and Astronautics (AIAA, 2006-). He received the SIAM/ACM Prize in Computational Science and Engineering (2021), the Alexander von Humboldt Award (2017), the SIAM Ralph E. Kleinman Prize (2015), the J. Tinsley Oden Medal (2013), and the Computational Fluid Dynamics Award (2007) from the US Association for Computational Mechanics. His h-index is 115 and he has been cited over 61,000 times.

Physics-informed neural networks (PINNs)

Abstract

Speaker: Fergus Shone, PhD Researcher, University of Leeds

Do you have sparse, low-quality data, but a good understanding of the physical system you are modelling? Then physics-informed neural networks (PINNs) might be the machine learning tool for you!
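To make that pitch concrete, here is a minimal, illustrative PINN sketch, an assumption of this post rather than the speaker's model: a small network is trained to satisfy the ODE du/dx = -u with u(0) = 1 (exact solution exp(-x)) by penalizing the physics residual at random collocation points alongside the boundary condition, so the physics substitutes for dense training data.

```python
import torch
import torch.nn as nn

# Minimal PINN sketch (illustrative, not the speaker's model): learn u(x)
# satisfying du/dx = -u with u(0) = 1 on [0, 2]. The loss combines the ODE
# residual, enforced at random collocation points, with the boundary condition.
net = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(3000):
    opt.zero_grad()
    # Collocation points in [0, 2] where the ODE residual is penalized
    x = 2.0 * torch.rand(128, 1)
    x.requires_grad_(True)
    u = net(x)
    # du/dx via automatic differentiation
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    physics_loss = ((du + u) ** 2).mean()                     # residual of du/dx + u = 0
    boundary_loss = (net(torch.zeros(1, 1)) - 1.0).pow(2).mean()
    loss = physics_loss + boundary_loss
    loss.backward()
    opt.step()
```

No solution data appears in the loss at all here; in practice the same residual term is simply added to a data-fitting loss over whatever sparse measurements are available.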

Bio

Fergus is a PhD researcher working with the Centre for Computational Imaging and Simulation Technologies in Biomedicine (CISTIB) and the Leeds Institute of Cardiovascular and Metabolic Medicine (LICAMM) at the University of Leeds. His research interests lie in the fields of physics-informed machine learning and super-resolution of in vivo cardiac flow data.

Physics-informed Neural Networks: A new paradigm for learning physical laws

TIFR CAM Conference on PDE and Numerical Analysis (PDENA22)

  • Title: Physics-informed Neural Networks: A new paradigm for learning physical laws

  • Speaker: Ameya Jagtap (Brown University)

  • Date: April 30, 2022

  • https://vasu.tifrbng.res.in/pdena22
