TSSR: A Truncated and Signed Square Root Activation Function for Neural Networks
Activation functions are essential components of neural networks. In this paper, we introduce a new activation function called the Truncated and Signed Square Root (TSSR) function. This function is distinctive because it is odd, nonlinear, monotonic, and differentiable. Its gradient is continuous and always positive. Thanks to these properties, it has the potential to improve the numerical stability of neural networks. Several experiments confirm that the proposed TSSR outperforms other state-of-the-art activation functions. The proposed function has significant implications for the development of neural network models and can be applied to a wide range of applications in fields such as computer vision, natural language processing, and speech recognition.
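The abstract does not give the TSSR formula itself, but the stated properties (odd, monotonic, differentiable, with a continuous and always-positive gradient) constrain what such a function could look like. Below is a hypothetical sketch, not the paper's verified definition: it behaves as the identity near zero and grows as a signed square root beyond a threshold of 1, with the offset `2*sqrt(|x|) - 1` chosen so both the value and the slope match at the threshold, keeping the gradient continuous.

```python
import numpy as np

def tssr(x):
    """Hypothetical sketch of a truncated-and-signed-square-root activation.

    Identity on |x| <= 1; signed square-root growth beyond. The offset
    2*sqrt(|x|) - 1 makes value (1) and slope (1) agree at |x| = 1, so the
    gradient is continuous and positive everywhere, matching the abstract's
    stated properties. The paper's exact definition may differ.
    """
    x = np.asarray(x, dtype=float)
    return np.where(
        np.abs(x) <= 1.0,
        x,                                          # identity region
        np.sign(x) * (2.0 * np.sqrt(np.abs(x)) - 1.0),  # square-root region
    )
```

Under this sketch the function is odd (`tssr(-x) == -tssr(x)`) and strictly increasing, with gradient 1 on the identity region and `1/sqrt(|x|)` beyond it, so the gradient never vanishes or changes sign.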
arXiv.org Artificial Intelligence
Aug-9-2023