Activation thresholds and expressiveness of polynomial neural networks

Bella Finkel, Jose Israel Rodriguez, Chenxi Wu, Thomas Yahl

arXiv.org Artificial Intelligence 

Polynomial neural networks are important in both applications and theoretical machine learning. The function spaces and dimensions of neurovarieties for deep linear networks have been studied, and new developments have appeared in the polynomial neural network setting. In particular, results relating the choice of activation degree to the dimension of the neurovariety have improved our understanding of the optimization landscape of these networks and of the ability of shallow and deep architectures to replicate target functions [21, 27]. These theoretical results carry practical implications: for appropriate datasets, polynomial activation functions can reduce model complexity and computational cost by introducing higher-order interactions between inputs, making it possible to model nonlinear phenomena more efficiently. Moreover, polynomial neural networks have been found to perform well in practice in high-impact fields such as healthcare and finance.
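To make the setting concrete, the following is a minimal sketch (not from the paper) of a shallow polynomial neural network: each hidden unit applies the power activation t ↦ t^r coordinate-wise, so every output coordinate is a homogeneous degree-r polynomial in the inputs. The weight names and dimensions here are illustrative assumptions.

```python
import numpy as np

def poly_network(x, W1, W2, degree=2):
    """Shallow polynomial network: hidden layer W1 @ x passed through
    the coordinate-wise power activation t -> t**degree, then W2."""
    hidden = (W1 @ x) ** degree
    return W2 @ hidden

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # 3 inputs -> 4 hidden units
W2 = rng.normal(size=(2, 4))   # 4 hidden units -> 2 outputs
x = rng.normal(size=3)

y = poly_network(x, W1, W2, degree=2)
# Each output coordinate is a homogeneous degree-2 polynomial in x,
# so scaling the input by c scales the output by c**2:
np.testing.assert_allclose(poly_network(2 * x, W1, W2, degree=2), 4 * y)
```

The degree-r homogeneity check at the end is exactly the higher-order interaction structure the abstract refers to; the neurovariety is the closure of the set of polynomial maps realized by such networks as the weights vary.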
