Deep Learning Best Practices: Activation Functions & Weight Initialization Methods -- Part 1
One of the reasons deep learning has become so popular in the past decade is the development of better learning algorithms, which lead to faster convergence and better-performing neural networks. Along with better learning algorithms, the introduction of better activation functions and better weight initialization methods helps us build better neural networks.

Note: This article assumes that the reader has a basic understanding of neural networks, weights, biases, and backpropagation.

In this article, we discuss some of the commonly used activation functions and weight initialization methods for training a deep neural network. To be more specific, we will be covering the following.
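As a preview of the kinds of functions the article covers, here is a minimal sketch (my own illustration, not code from the article) of three widely used activations and one common weight initialization scheme, Xavier/Glorot uniform initialization, which scales the sampling range by the layer's fan-in and fan-out:

```python
import math
import random

def sigmoid(x):
    """Squashes input to (0, 1); historically popular but saturates for large |x|."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Zero-centered sibling of sigmoid, output in (-1, 1)."""
    return math.tanh(x)

def relu(x):
    """Rectified Linear Unit: cheap to compute and non-saturating for x > 0."""
    return max(0.0, x)

def xavier_uniform(fan_in, fan_out):
    """Sample a (fan_in x fan_out) weight matrix uniformly from
    [-limit, limit], where limit = sqrt(6 / (fan_in + fan_out))."""
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [[random.uniform(-limit, limit) for _ in range(fan_out)]
            for _ in range(fan_in)]
```

The exact activations and initializers discussed later may differ; this sketch only illustrates the general shape of the ideas.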
May-22-2019, 02:49:36 GMT