Activation thresholds and expressiveness of polynomial neural networks
Bella Finkel, Jose Israel Rodriguez, Chenxi Wu, Thomas Yahl
arXiv.org Artificial Intelligence
Polynomial neural networks are important in applications and in theoretical machine learning. The function spaces and the dimensions of neurovarieties have been studied for deep linear networks, and new developments have appeared in the polynomial neural network setting. In particular, results on the choice of activation degree and the dimension of the neurovariety have improved our understanding of the optimization process for these networks and of the ability of shallow and deep networks to replicate target functions [21, 27]. These theoretical results have practical implications: for appropriate datasets, polynomial activation functions can reduce model complexity and computational cost by introducing higher-order interactions between inputs, making it possible to model nonlinear phenomena more efficiently. Moreover, polynomial neural networks have been found to perform well in practice in high-impact fields such as healthcare and finance.
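To make the objects concrete, here is a minimal sketch (not from the paper) of a shallow polynomial neural network with monomial activation of degree r: each hidden neuron computes a linear form of the input and raises it to the r-th power, so the network output is a homogeneous degree-r polynomial in the inputs. The dimensions and weights below are illustrative assumptions, chosen only to show the higher-order input interactions the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

def poly_network(x, W, c, r):
    """Shallow polynomial network f(x) = sum_i c[i] * (W[i] @ x)**r.

    Each hidden neuron applies the monomial activation t -> t**r,
    so f is a homogeneous polynomial of degree r in the inputs.
    """
    return c @ (W @ x) ** r

# Illustrative sizes (an assumption): 3 inputs, 4 hidden neurons, degree 2.
d, m, r = 3, 4, 2
W = rng.standard_normal((m, d))  # input-to-hidden weights
c = rng.standard_normal(m)       # hidden-to-output weights

x = rng.standard_normal(d)
y = poly_network(x, W, c, r)

# Homogeneity check: scaling the input by t scales the output by t**r,
# since every term of f has total degree exactly r.
assert np.isclose(poly_network(2 * x, W, c, r), (2 ** r) * y)
```

The set of all such polynomials f, as the weights (W, c) vary, is (the Zariski closure of) the neurovariety whose dimension the paper studies; the choice of activation degree r directly changes which polynomials the architecture can represent.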
Aug-8-2024