Deep Learning from first principles in Python, R and Octave – Part 6
"Today you are You, that is truer than true. There is no one alive who is Youer than You." "Explanations exist; they have existed for all time; there is always a well-known solution to every human problem -- neat, plausible, and wrong." In this 6th instalment of'Deep Learning from first principles in Python, R and Octave-Part6', I look at a couple of different initialization techniques used in Deep Learning, L2 regularization and the'dropout' method. Specifically, I implement "He initialization" & "Xavier Initialization". The implementation was in vectorized Python, R and Octave 3. Part 3 -In part 3, I derive the equations and also implement a L-Layer Deep Learning network with either the relu, tanh or sigmoid activation function in Python, R and Octave.
Apr-18-2018, 19:21:29 GMT