Solving stochastic partial differential equations using neural networks in the Wiener chaos expansion

Neufeld, Ariel, Schmocker, Philipp

arXiv.org Machine Learning

In this paper, we solve stochastic partial differential equations (SPDEs) numerically by using (possibly random) neural networks in the truncated Wiener chaos expansion of their corresponding solution. Moreover, we provide some approximation rates for learning the solution of SPDEs with additive and/or multiplicative noise. Finally, we apply our results in numerical examples to approximate the solution of three SPDEs: the stochastic heat equation, the Heath-Jarrow-Morton equation, and the Zakai equation.
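The Wiener chaos expansion underlying this approach can be illustrated with a minimal, self-contained sketch (an illustration of the expansion itself, not the paper's algorithm): the stochastic exponential exp(W_1 - 1/2) of a standard Brownian motion has the chaos coefficients 1/n!, so the truncated Hermite series sum_n He_n(W_1)/n! (with He_n the probabilists' Hermite polynomials) recovers it to high accuracy.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermeval  # evaluates sums of He_n

def chaos_truncation(x, n_terms):
    # Truncated Wiener chaos expansion of exp(W_1 - 1/2) at W_1 = x:
    # the n-th chaos coefficient is 1/n!.
    coeffs = [1.0 / factorial(n) for n in range(n_terms)]
    return hermeval(x, coeffs)

x = np.array([-2.0, 0.0, 2.0])       # sample values of W_1
exact = np.exp(x - 0.5)              # exact stochastic exponential
approx = chaos_truncation(x, 20)     # 20-term truncation
print(np.max(np.abs(approx - exact)))
```

In the paper's setting, the (random) chaos coefficients of the SPDE solution are what the neural networks learn; here they are known in closed form, which makes the truncation error directly checkable.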


Universal approximation results for neural networks with non-polynomial activation function over non-compact domains

Neufeld, Ariel, Schmocker, Philipp

arXiv.org Machine Learning

Assuming that the activation function is non-polynomial, we derive universal approximation results for neural networks within function spaces over non-compact subsets of a Euclidean space, e.g., weighted spaces and L^p-spaces. Furthermore, we provide some dimension-independent rates for approximating a function with sufficiently regular and integrable Fourier transform by neural networks with non-polynomial activation function.

Inspired by the functionality of human brains, (artificial) neural networks were introduced in the seminal work of McCulloch and Pitts (see [32]). Fundamentally, a neural network consists of nodes arranged in hierarchical layers, where the connections between adjacent layers transmit data through the network and the nodes transform this information. In mathematical terms, a neural network can therefore be described as a concatenation of affine and non-affine functions. Nowadays, neural networks are successfully applied in the fields of image classification (see e.g.
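The description of a network as alternating affine and non-affine functions, together with the random networks mentioned in the first abstract, can be sketched as follows. All choices below (single hidden layer, tanh as the non-polynomial activation, random hidden weights with only the linear readout trained by least squares, sin as the target) are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden layer of a random neural network: a random affine map
# x -> W x + b followed by a non-polynomial activation (tanh).
n_features = 200
W = rng.normal(size=(n_features, 1))
b = rng.uniform(-3.0, 3.0, size=n_features)

def features(x):
    # Concatenation of an affine and a non-affine function.
    return np.tanh(x[:, None] * W[:, 0] + b)

# Only the final affine (readout) layer is fitted, via least squares.
x = np.linspace(-3.0, 3.0, 400)
target = np.sin(x)
coef, *_ = np.linalg.lstsq(features(x), target, rcond=None)

print(np.max(np.abs(features(x) @ coef - target)))
```

Fixing the hidden weights at random and solving only for the readout reduces training to a linear least-squares problem, which is one reason random networks are attractive for the approximation results described above.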