Improving Performance in Neural Networks by Dendrites-Activated Connections
Carlo Metta, Marco Fantozzi, Andrea Papini, Gianluca Amato, Matteo Bergamaschi, Silvia Giulia Galfrè, Alessandro Marchetti, Michelangelo Vegliò, Maurizio Parton, Francesco Morandin
arXiv.org Artificial Intelligence
Computational units in artificial neural networks compute a linear combination of their inputs and then apply a nonlinear filter, often a ReLU shifted by a bias; if the inputs themselves come from other units, they have already been filtered with their own biases. Within a layer, multiple units share the same inputs, yet each input was filtered with a single bias, so output values depend on shared input biases rather than on biases optimal for each individual unit. To mitigate this issue, we introduce DAC, a new computational unit based on preactivation and multiple biases, in which input signals undergo independent nonlinear filtering before the linear combination. We provide a Keras implementation and report its computational efficiency. We test DAC convolutions in ResNet architectures on CIFAR-10, CIFAR-100, Imagenette, and Imagewoof, achieving performance improvements of up to 1.73%. We exhibit examples where DAC is more efficient than its standard counterpart as a function approximator, and we prove a universal representation theorem.
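To illustrate the difference described in the abstract, here is a minimal NumPy sketch (not the authors' Keras implementation): a standard unit applies one shared bias after the linear combination, whereas a DAC-style unit shifts each input by its own bias and applies the nonlinearity before combining. The function names and the per-input bias vector `b_per_input` are illustrative assumptions, not the paper's API.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def standard_unit(x, w, b):
    # Standard unit: linear combination first, then one shared bias
    # and a single nonlinear filter on the result.
    return relu(w @ x + b)

def dac_unit(x, w, b_per_input):
    # DAC-style unit (sketch): each input gets its own bias and is
    # filtered independently *before* the linear combination.
    return w @ relu(x + b_per_input)

x = np.array([1.0, -2.0, 0.5])
w = np.array([0.2, -0.3, 0.5])
y_std = standard_unit(x, w, b=0.1)                              # -> 1.15
y_dac = dac_unit(x, w, b_per_input=np.array([0.0, 2.5, -0.2]))  # -> 0.2
```

The point of the sketch is that in `dac_unit` each downstream unit can, in principle, learn its own bias for every input, instead of inheriting the single bias the input was filtered with upstream.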
Feb-12-2023