Mean-Field Limits for Two-Layer Neural Networks Trained with Consensus-Based Optimization

William De Deyn, Michael Herty, Giovanni Samaey

arXiv.org Artificial Intelligence 

Artificial intelligence has witnessed remarkable progress over the past decades, both in its capabilities and in its range of applications. Today, neural networks are used in a wide variety of fields. One classical application is function approximation, which is supported by the universal approximation theorem [34]. In computer vision, convolutional neural networks form the backbone of most modern architectures [39, 38], while the framework of neural ordinary differential equations has contributed significantly to optimal control problems [17, 10]. In natural language processing and speech recognition, recurrent neural networks and their long short-term memory (LSTM) variants have yielded significant performance improvements [33, 51]. More recently, diffusion models have proven to be powerful generative models, with applications ranging from image denoising to video generation [56]. Neural networks have even found their way into scientific computing; the most notable example is physics-informed neural networks, which can solve both forward and inverse problems governed by partial differential equations [50]. A neural network can be viewed, in general, as a function parametrized by a set of weights and biases, which we collectively refer to as parameters.
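Since the paper concerns two-layer networks, a minimal sketch of this parametrized-function view may help; the notation below (N hidden neurons, activation function \sigma, parameter vector \theta) is ours, and the 1/N mean-field scaling is an assumption suggested by the title rather than taken from this excerpt:

\[
f(x;\theta) \;=\; \frac{1}{N}\sum_{k=1}^{N} a_k\,\sigma\!\big(w_k^\top x + b_k\big),
\qquad
\theta = \{(a_k, w_k, b_k)\}_{k=1}^{N}.
\]

Here the weights w_k and biases b_k parametrize the hidden layer and the a_k are output weights; in a mean-field limit, one studies the regime N \to \infty, in which the empirical measure of the parameters (a_k, w_k, b_k) is replaced by a probability distribution.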
