Approximation of RKHS Functionals by Neural Networks
Zhou, Tian-Yi, Suh, Namjoon, Cheng, Guang, Huo, Xiaoming
This paper studies the approximation of smooth functionals defined over a reproducing kernel Hilbert space (RKHS) by tanh neural networks. A functional maps an infinite-dimensional space of functions to R. In recent years, neural networks have been widely employed in operator learning tasks, and we are interested in their capability to approximate nonlinear functionals, a special type of operator.

Neural networks have been known as universal approximators since [Cybenko, 1989]: they can approximate any continuous function mapping a finite-dimensional input space to a finite-dimensional output space to arbitrary accuracy. Today, many interesting tasks entail learning operators, i.e., mappings between an infinite-dimensional input Banach space and a (possibly) infinite-dimensional output space. A prototypical example in scientific computing is mapping the initial datum to the solution (or its time series) of a nonlinear time-dependent partial differential equation (PDE). A priori, it is unclear whether neural networks can be successfully employed to learn such operators from data, given that their universality only pertains to finite-dimensional functions. One of the first successful uses of neural networks in the context of operator learning was provided by [Chen and Chen, 1995].
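The setting described above can be sketched numerically: represent each input function from the RKHS by its values at finitely many grid points, then fit a one-hidden-layer tanh network to the functional's outputs. The Gaussian kernel, the target functional F(f) = ∫ f(x)² dx, and all architecture and training parameters below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian kernel generating the RKHS (an illustrative choice).
def k(x, y, gamma=10.0):
    return np.exp(-gamma * (x - y) ** 2)

grid = np.linspace(0.0, 1.0, 20)   # m = 20 discretization points (network input)
fine = np.linspace(0.0, 1.0, 200)  # finer grid for integrating the target
dx = fine[1] - fine[0]

def sample_example():
    """Random RKHS function f(x) = sum_j c_j k(x_j, x); target F(f) = ∫ f^2."""
    centers = rng.uniform(0.0, 1.0, size=5)
    coefs = rng.normal(0.0, 1.0, size=5)
    f_grid = (coefs * k(grid[:, None], centers[None, :])).sum(axis=1)
    f_fine = (coefs * k(fine[:, None], centers[None, :])).sum(axis=1)
    return f_grid, np.sum(f_fine ** 2) * dx  # Riemann-sum approximation of F(f)

data = [sample_example() for _ in range(2000)]
X = np.stack([f for f, _ in data])
y = np.array([t for _, t in data])

# One-hidden-layer tanh network, trained by full-batch gradient descent on MSE.
m, h = X.shape[1], 64
W1 = rng.normal(0.0, 1.0 / np.sqrt(m), (m, h)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 1.0 / np.sqrt(h), h);      b2 = 0.0

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, pred0 = forward(X)
loss_init = np.mean((pred0 - y) ** 2)

lr = 0.01
for _ in range(2000):
    H, pred = forward(X)
    g = 2.0 * (pred - y) / len(y)          # d(MSE)/d(pred)
    gW2, gb2 = H.T @ g, g.sum()
    gH = np.outer(g, W2) * (1.0 - H ** 2)  # backpropagate through tanh
    gW1, gb1 = X.T @ gH, gH.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X)
loss_final = np.mean((pred - y) ** 2)
```

The discretization step is what reduces the infinite-dimensional input to a finite vector a standard network can consume; the approximation-theoretic question the paper addresses is how the achievable accuracy depends on the smoothness of the functional and the number of such degrees of freedom.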
Mar-18-2024