A Variational Physics-Informed Neural Network Framework Using Petrov-Galerkin Method for Solving Singularly Perturbed Boundary Value Problems

Kumar, Vijay, Singh, Gautam

arXiv.org Artificial Intelligence

This work proposes a Variational Physics-Informed Neural Network (VPINN) framework that integrates the Petrov-Galerkin formulation with deep neural networks (DNNs) for solving one-dimensional singularly perturbed boundary value problems (BVPs) and parabolic partial differential equations (PDEs) involving one or two small parameters. The method adopts a nonlinear approximation in which the trial space is defined by neural network functions, while the test space is constructed from hat functions. The weak formulation is constructed using localized test functions, with interface penalty terms introduced to enhance numerical stability and accurately capture boundary layers. Dirichlet boundary conditions are imposed via hard constraints, and source terms are computed using automatic differentiation. Numerical experiments on benchmark problems demonstrate the effectiveness of the proposed method, showing significantly improved accuracy in both the $L_2$ and maximum norms compared to the standard VPINN approach for one-dimensional singularly perturbed differential equations (SPDEs).
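The Petrov-Galerkin residuals described above (neural-network trial space tested against hat functions) can be sketched as follows. This is a minimal illustration only: a plain callable stands in for the hard-constrained network, the model problem is the simplest case $-\varepsilon u'' = f$ after integration by parts, and the interface penalty terms and training loop are omitted. The helper names (`hat`, `petrov_galerkin_residuals`) are ours, not the paper's.

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal rule (written out to avoid NumPy-version differences)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

def hat(i, x, nodes):
    """i-th piecewise-linear hat function on the mesh `nodes`."""
    ref = (np.arange(len(nodes)) == i).astype(float)
    return np.interp(x, nodes, ref)

def hat_grad(i, x, nodes):
    """Derivative of the i-th hat function: piecewise +-1/h on a uniform mesh."""
    h = nodes[1] - nodes[0]
    g = np.zeros_like(x)
    g[(x > nodes[i - 1]) & (x <= nodes[i])] = 1.0 / h
    g[(x > nodes[i]) & (x < nodes[i + 1])] = -1.0 / h
    return g

def petrov_galerkin_residuals(u_prime, f, eps, n_elems=8, n_quad=801):
    """r_i = eps * int u' v_i' dx - int f v_i dx for each interior hat v_i.
    In a VPINN, u would be the network and training would drive the r_i to zero."""
    nodes = np.linspace(0.0, 1.0, n_elems + 1)
    x = np.linspace(0.0, 1.0, n_quad)
    return np.array([
        eps * trapz(u_prime(x) * hat_grad(i, x, nodes), x)
        - trapz(f(x) * hat(i, x, nodes), x)
        for i in range(1, n_elems)
    ])

# Sanity check with the manufactured solution u = sin(pi x) of -eps u'' = f:
eps = 1e-2
u_prime = lambda x: np.pi * np.cos(np.pi * x)
f = lambda x: eps * np.pi ** 2 * np.sin(np.pi * x)
r = petrov_galerkin_residuals(u_prime, f, eps)
print(np.max(np.abs(r)))  # small: the exact solution nearly satisfies the weak form
```

The exact solution zeroes every variational residual up to quadrature error, which is the property the trained network is asked to reproduce.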


An efficient wavelet-based physics-informed neural networks for singularly perturbed problems

Pandey, Himanshu, Singh, Anshima, Behera, Ratikanta

arXiv.org Artificial Intelligence

Physics-informed neural networks (PINNs) are a class of deep learning models that use the governing differential equations as physics constraints, which lets them address complex problems, including ones with limited data availability. However, solutions of differential equations with oscillations, singular perturbations, or shock-like structures are challenging for PINNs. Considering these challenges, we designed an efficient wavelet-based PINNs (W-PINNs) model to solve singularly perturbed differential equations. Here, we represent the solution in wavelet space using a family of smooth, compactly supported wavelets. This framework represents the solution of a differential equation with significantly fewer degrees of freedom while still capturing, identifying, and analyzing the local structure of complex physical phenomena. The architecture allows the training process to search for a solution within wavelet space, making the process faster and more accurate. The proposed model does not rely on automatic differentiation for the derivatives involved in the differential equations and does not require any prior information about the behavior of the solution, such as the location of abrupt features. Thus, through a strategic fusion of wavelets with PINNs, W-PINNs excel at capturing localized nonlinear information, making them well suited for problems with abrupt behavior in certain regions, such as singularly perturbed problems. The efficiency and accuracy of the proposed model are demonstrated on various test problems: highly singularly perturbed nonlinear differential equations, the FitzHugh-Nagumo (FHN) model, and predator-prey interaction models. The proposed model compares favorably with traditional PINNs and with recently developed wavelet-based PINNs that use wavelets as activation functions for solving nonlinear differential equations.
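The economy of a multi-scale, compactly supported basis for layer-type solutions can be illustrated as below: a boundary-layer profile is fitted by least squares in a small dictionary of smooth, compactly supported bumps. The bump family, the chosen scales, and the least-squares fit are stand-ins for illustration, not the W-PINNs' actual wavelet family or training procedure.

```python
import numpy as np

def bump(x, center, scale):
    """Smooth, compactly supported bump on (center - scale, center + scale)."""
    t = (x - center) / scale
    out = np.zeros_like(x)
    m = np.abs(t) < 1.0
    out[m] = np.exp(-1.0 / (1.0 - t[m] ** 2))
    return out

def multiscale_fit(x, y, scales):
    """Least-squares fit of y in a dictionary of translated/dilated bumps."""
    cols = [bump(x, c, s) for s in scales
            for c in np.arange(0.0, 1.0 + 1e-9, s / 2)]
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

# Boundary-layer profile typical of singularly perturbed problems
eps = 0.05
x = np.linspace(0.0, 1.0, 400)
y = x - (np.exp((x - 1.0) / eps) - np.exp(-1.0 / eps)) / (1.0 - np.exp(-1.0 / eps))

coarse = multiscale_fit(x, y, scales=(1.0, 0.5))
fine = multiscale_fit(x, y, scales=(1.0, 0.5, 0.125, 0.03125))
err = lambda z: float(np.linalg.norm(y - z))
print(err(coarse), err(fine))  # adding fine scales sharply reduces the error
```

Because the fine dictionary contains the coarse one, its residual can only shrink; the point is that a handful of extra fine-scale functions near the layer buy most of the improvement.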


Singularly Perturbed Layered Control of Deformable Bodies

Molu, Lekan

arXiv.org Artificial Intelligence

Variable-curvature modeling tools provide an accurate means of controlling deformable bodies and structures with infinite degrees of freedom. However, their forward and inverse Newton-Euler dynamics carry high computational costs. Assuming piecewise constant strains across discretized Cosserat rods imposed on the soft material, a composite two-time-scale singularly perturbed nonlinear backstepping control scheme is introduced here to alleviate the long computation times of the recursive Newton-Euler dynamics for soft structures. Our contribution is three-pronged: (i) we decompose the system's Newton-Euler dynamics into two coupled sub-dynamics by introducing a perturbation parameter; (ii) we prescribe a set of stabilizing controllers for regulating each subsystem's dynamics; and (iii) we study the interconnected singularly perturbed system and analyze its stability.
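The two-time-scale decomposition in (i) can be illustrated on a generic linear fast/slow system (not the paper's Cosserat-rod dynamics, which we do not reproduce here): setting the perturbation parameter to zero collapses the fast subsystem to its quasi-steady state, leaving a cheap reduced model whose slow state tracks the full stiff simulation to within O(eps).

```python
def simulate_full(eps, dt=1e-4, T=2.0, x0=1.0, z0=0.0):
    """Explicit-Euler integration of the coupled system
       x' = -2x + z   (slow subsystem),
       eps * z' = x - z   (fast subsystem)."""
    x, z = x0, z0
    for _ in range(int(T / dt)):
        x, z = x + dt * (-2 * x + z), z + dt * (x - z) / eps
    return x

def simulate_reduced(dt=1e-4, T=2.0, x0=1.0):
    """Setting eps = 0 gives the quasi-steady state z = x, so x' = -x:
    the fast dynamics are eliminated and the time step no longer
    needs to resolve the 1/eps rate."""
    x = x0
    for _ in range(int(T / dt)):
        x += dt * (-x)
    return x

x_full = simulate_full(eps=1e-2)
x_slow = simulate_reduced()
print(abs(x_full - x_slow))  # O(eps): the reduced model tracks the slow state
```

The stabilizing controllers in (ii) would then be designed separately for the reduced slow model and the boundary-layer fast model, with (iii) certifying the interconnection.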


Less Emphasis on Difficult Layer Regions: Curriculum Learning for Singularly Perturbed Convection-Diffusion-Reaction Problems

Wang, Yufeng, Xu, Cong, Yang, Min, Zhang, Jin

arXiv.org Artificial Intelligence

Although Physics-Informed Neural Networks (PINNs) have been successfully applied across a wide variety of science and engineering fields, they can fail to accurately predict the underlying solution even in only slightly challenging convection-diffusion-reaction problems. In this paper, we investigate the reason for this failure from a domain-distribution perspective and identify that learning multi-scale fields simultaneously prevents the network from advancing its training, leaving it stuck in poor local minima. We show that the widespread practice of sampling more collocation points in high-loss layer regions hardly helps the optimization and may even worsen the results. These findings motivate a novel curriculum learning method that encourages neural networks to prioritize learning on easier non-layer regions while downplaying learning on harder layer regions. The proposed method helps PINNs automatically adjust their learning emphasis and thereby facilitates the optimization procedure. Numerical results on typical benchmark equations show that the proposed curriculum learning approach mitigates the failure modes of PINNs and produces accurate results for very sharp boundary and interior layers. Our work reveals that, for equations whose solutions have large differences in scale, paying less attention to high-loss regions can be an effective strategy for learning them accurately.
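One plausible way to realize "less emphasis on difficult layer regions" is a temperature-annealed weighting of collocation points by their pointwise residual; the schedule below is an illustrative guess, not the paper's exact rule, and the residual profile is synthetic.

```python
import numpy as np

def curriculum_weights(residuals, tau):
    """Down-weight points with large pointwise residual (layer regions).
    tau acts as an annealing temperature: small tau concentrates training
    on easy (low-residual) points, large tau approaches uniform weights.
    Weights are normalized to mean 1 so the overall loss scale is preserved."""
    w = np.exp(-residuals / tau)
    return w / w.mean()

# Synthetic residual profile: large near a boundary layer at x = 1
x = np.linspace(0.0, 1.0, 100)
res = 0.1 + 5.0 * np.exp((x - 1.0) / 0.05)

early = curriculum_weights(res, tau=0.5)    # early training: layer nearly ignored
late = curriculum_weights(res, tau=100.0)   # late training: nearly uniform
print(early[-1], late[-1])
```

Annealing tau upward over training lets the network first fit the smooth non-layer field, then gradually take on the layer, which is the qualitative behavior the abstract describes.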


Semi-analytic PINN methods for singularly perturbed boundary value problems

Gie, Gung-Min, Hong, Youngjoon, Jung, Chang-Yeol

arXiv.org Artificial Intelligence

We propose a new semi-analytic physics-informed neural network (PINN) to solve singularly perturbed boundary value problems. The PINN is a scientific machine learning framework that offers a promising route to numerical solutions of partial differential equations. PINNs have shown impressive performance in solving various differential equations, including time-dependent and multi-dimensional equations posed on complex domain geometries. However, for stiff differential equations, neural networks generally fail to capture the sharp transitions of solutions because of spectral bias. To resolve this issue, we develop semi-analytic PINN methods enriched with so-called corrector functions obtained from boundary layer analysis. Our enriched PINNs accurately predict numerical solutions to singular perturbation problems. Numerical experiments cover various types of singularly perturbed linear and nonlinear differential equations.
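The effect of corrector enrichment can be seen on a classic model problem. Below, a low-order polynomial stands in for the smooth neural network part (an assumption for illustration, not the paper's architecture); adding the analytically derived corrector exp(-x/eps) to the approximation space collapses the boundary-layer error.

```python
import numpy as np

# Model problem: eps*u'' + u' = 1 on (0,1), u(0) = u(1) = 0.  Boundary-layer
# analysis gives the corrector exp(-x/eps) for the layer at x = 0, and the
# exact solution lies in span{1, x, exp(-x/eps)}.
eps = 1e-2
x = np.linspace(0.0, 1.0, 500)
u = x - (1.0 - np.exp(-x / eps)) / (1.0 - np.exp(-1.0 / eps))

def lsq_fit(columns, y):
    """Best approximation of y in the span of the given basis columns."""
    A = np.stack(columns, axis=1)
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ c

poly = [x ** k for k in range(6)]                   # smooth stand-in for the NN
plain = lsq_fit(poly, u)                            # no enrichment
enriched = lsq_fit(poly + [np.exp(-x / eps)], u)    # + analytic corrector
plain_err = float(np.max(np.abs(u - plain)))
enriched_err = float(np.max(np.abs(u - enriched)))
print(plain_err, enriched_err)  # enrichment collapses the layer error
```

The smooth basis alone cannot resolve the O(eps)-wide layer no matter how it is trained, which is exactly the spectral-bias failure the corrector is designed to remove.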


The CEO Of Nvidia Thinks That The Nintendo Switch Is 'Really Delightful'

Forbes - Tech

In a new interview, Nvidia CEO Jen-Hsun Huang talks about the Nintendo Switch and says the console is "really delightful". Over at VentureBeat, they managed to talk with Nvidia's CEO about AI and a range of other subjects. The interview is worth a read, but one part in particular stood out when it came to the Nintendo Switch: "Nintendo Switch is a game console. That entire experience is going to be very Nintendo."


Quadratic-Type Lyapunov Functions for Competitive Neural Networks with Different Time-Scales

Meyer-Bäse, Anke

Neural Information Processing Systems

The dynamics of complex neural networks modelling the self-organization process in cortical maps must include the aspects of long- and short-term memory. The behaviour of the network is thus characterized by an equation of neural activity as the fast phenomenon and an equation of synaptic modification as the slow part of the neural system. We present a quadratic-type Lyapunov function for the flow of a competitive neural system with fast and slow dynamic variables. We also show the consequences of the stability analysis for the neural net parameters. 1 INTRODUCTION This paper investigates a special class of laterally inhibited neural networks. In particular, we examine the dynamics of a restricted class of laterally inhibited neural networks from a rigorous analytic standpoint.
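A quadratic-type Lyapunov function for a fast/slow system can be sketched on a linearized toy model (the matrix below is illustrative, not the paper's competitive network): solve the Lyapunov equation A^T P + P A = -I for P and check that V(s) = s^T P s decreases along trajectories.

```python
import numpy as np

# Linearized fast/slow dynamics: activity (fast, rate 1/eps) and synaptic
# weight (slow).  Illustrative matrix; chosen to be Hurwitz.
eps = 0.05
A = np.array([[-1.0 / eps, 1.0 / eps],
              [0.2, -1.0]])

# Quadratic Lyapunov function V(s) = s^T P s from A^T P + P A = -I,
# solved via the Kronecker/vec identity (row-major flattening).
K = np.kron(A.T, np.eye(2)) + np.kron(np.eye(2), A.T)
P = np.linalg.solve(K, -np.eye(2).flatten()).reshape(2, 2)
P = (P + P.T) / 2.0  # symmetrize against round-off

# Integrate the flow (Euler, dt well below the fast time scale eps)
# and record V along the trajectory.
s = np.array([1.0, -0.5])
dt = 1e-4
vals = [float(s @ P @ s)]
for _ in range(20000):
    s = s + dt * (A @ s)
    vals.append(float(s @ P @ s))

print(np.all(np.diff(vals) <= 0.0), np.all(np.linalg.eigvalsh(P) > 0.0))
```

Positive definiteness of P plus monotone decay of V is the certificate of stability; the paper's contribution is constructing such a function for the nonlinear competitive system and reading off parameter conditions from it.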

