BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation
However, existing work either applies predefined filter weights or learns them without necessary constraints, which may lead to oversimplified or ill-posed filters. To overcome these issues, we propose BernNet, a novel graph neural network with theoretical support that provides a simple but effective scheme for designing and learning arbitrary graph spectral filters. In particular, for any filter over the normalized Laplacian spectrum of a graph, BernNet estimates it by an order-K Bernstein polynomial approximation and designs its spectral property by setting the coefficients of the Bernstein basis. Moreover, we can learn the coefficients (and the corresponding filter weights) from observed graphs and their associated signals, yielding a BernNet specialized for the data. Our experiments demonstrate that BernNet can learn arbitrary spectral filters, including complicated band-rejection and comb filters, and that it achieves superior performance in real-world graph modeling tasks. Code is available at https://github.com/ivam-he/BernNet.
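As a concrete illustration of the scheme the abstract describes, here is a minimal sketch, not the authors' implementation: BernNet learns the Bernstein coefficients by gradient descent, whereas this toy simply samples a hand-designed target response h at the Bernstein nodes 2k/K and evaluates the resulting order-K filter over the normalized-Laplacian spectrum [0, 2].

```python
import numpy as np
from math import comb

def bernstein_filter(lam, theta):
    """Evaluate an order-K Bernstein polynomial filter at eigenvalues
    lam of the normalized Laplacian (lam in [0, 2]); theta holds the
    K+1 non-negative coefficients of the Bernstein basis."""
    K = len(theta) - 1
    x = lam / 2.0  # map the spectrum [0, 2] onto [0, 1]
    return sum(theta[k] * comb(K, k) * (1 - x) ** (K - k) * x ** k
               for k in range(K + 1))

# Design (rather than learn) a low-pass response h(lam) = 1 - lam/2 by
# sampling it at the Bernstein nodes 2k/K, as in classic Bernstein
# approximation of a target function.
K = 10
h = lambda lam: 1 - lam / 2
theta = np.array([h(2 * k / K) for k in range(K + 1)])

lam = np.linspace(0.0, 2.0, 5)
response = bernstein_filter(lam, theta)
# For this linear h the approximation is exact, since Bernstein
# operators reproduce linear functions.
```

Swapping h for a band-rejection or comb response and letting theta be trainable parameters recovers the learning setup the abstract sketches.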
POLAR: A Polynomial Arithmetic Framework for Verifying Neural-Network Controlled Systems
Huang, Chao, Fan, Jiameng, Wang, Zhilu, Wang, Yixuan, Zhou, Weichao, Li, Jiajun, Chen, Xin, Li, Wenchao, Zhu, Qi
We present POLAR, a polynomial arithmetic-based framework for efficient bounded-time reachability analysis of neural-network controlled systems (NNCSs). Existing approaches that leverage standard Taylor Model (TM) arithmetic to approximate the neural-network controller cannot deal with non-differentiable activation functions and suffer from rapid explosion of the remainder when propagating the TMs. POLAR overcomes these shortcomings by integrating TM arithmetic with the Bernstein Bézier form and a symbolic remainder. The former enables TM propagation across non-differentiable activation functions and local refinement of TMs, and the latter reduces error accumulation in the TM remainder for linear mappings in the network. Experimental results show that POLAR significantly outperforms the current state-of-the-art tools in both efficiency and tightness of the reachable-set overapproximation. The source code can be found at https://github.com/ChaoHuang2018/POLAR_Tool.
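The range-enclosure property that makes the Bernstein Bézier form useful for reachability can be shown with a small sketch (a toy illustration under our own assumptions, not POLAR's implementation): converting a univariate polynomial from the power basis to Bernstein coefficients on [0, 1], whose minimum and maximum are guaranteed to enclose the polynomial's range on that interval.

```python
import numpy as np
from math import comb

def to_bernstein_coeffs(c):
    """Convert power-basis coefficients c (p(x) = sum c[i] * x**i) on
    [0, 1] into Bezier/Bernstein coefficients b.  By the Bernstein
    enclosure property, [min(b), max(b)] contains the range of p on
    [0, 1], which yields cheap sound bounds."""
    n = len(c) - 1
    b = np.zeros(n + 1)
    for k in range(n + 1):
        b[k] = sum(comb(k, i) / comb(n, i) * c[i] for i in range(k + 1))
    return b

# Example: p(x) = x^2 - x has true range [-0.25, 0] on [0, 1].
b = to_bernstein_coeffs([0.0, -1.0, 1.0])
enclosure = (b.min(), b.max())  # contains [-0.25, 0]
```

The enclosure is conservative (here [-0.5, 0] versus the true [-0.25, 0]) but tightens under degree elevation or subdivision, which is the kind of local refinement the abstract alludes to.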
Robust normalizing flows using Bernstein-type polynomials
Ramasinghe, Sameera, Fernando, Kasun, Khan, Salman, Barnes, Nick
We propose a framework to construct normalizing flows (NFs) based on increasing triangular maps and Bernstein-type polynomials. Compared to the existing (universal) NF frameworks, our method provides compelling advantages like theoretical upper bounds for the approximation error, robustness, higher interpretability, suitability for compactly supported densities, and the ability to employ higher-degree polynomials without training instability. Moreover, we provide a constructive universality proof, which gives analytic expressions of the approximations for known transformations.
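A minimal sketch of the core idea (a hypothetical toy, not the authors' code): a Bernstein polynomial whose Bézier coefficients are strictly increasing is itself strictly increasing on [0, 1], so it can serve as a 1-D increasing triangular map of the kind the abstract builds flows from.

```python
import numpy as np
from math import comb

def bernstein_map(x, b):
    """Map on [0, 1] given by a Bernstein-type polynomial with Bezier
    coefficients b.  If b is strictly increasing, the map is strictly
    increasing (its derivative is a positive combination of the
    differences b[k+1] - b[k]), hence invertible."""
    n = len(b) - 1
    return sum(b[k] * comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))

# Increasing coefficients -> a monotone bijection [0, 1] -> [b[0], b[-1]].
b = [0.0, 0.1, 0.5, 1.0]
xs = np.linspace(0.0, 1.0, 11)
ys = bernstein_map(xs, b)
```

In an actual flow the coefficients would be trainable (with monotonicity enforced, e.g. via cumulative sums of positive increments) and the inverse computed numerically, e.g. by bisection.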