Deep Learning from first principles in Python, R and Octave – Part 5
In this fifth part of Deep Learning from first principles in Python, R and Octave, I solve the MNIST dataset of handwritten digits (shown below) from the basics. To do this, I construct an L-layer, vectorized Deep Learning implementation from scratch in Python, R and Octave and use it to classify the MNIST dataset. The MNIST training set contains 60,000 handwritten digits from 0-9, with a test set of 10,000 digits. MNIST is a popular dataset for running Deep Learning tests, and has rightfully been termed the 'drosophila' of Deep Learning by none other than the venerable Prof. Geoffrey Hinton. So far, the 'Deep Learning from first principles in Python, R and Octave' series has included Part 1, where I implemented logistic regression as a simple Neural Network, and Part 2, which implemented the most elementary neural network, with one hidden layer containing any number of activation units and a sigmoid activation at the output layer.
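To give a flavour of what an L-layer, vectorized implementation involves, here is a minimal NumPy sketch of parameter initialization and a forward pass. The layer sizes, synthetic input and helper names (`init_params`, `forward`) are illustrative placeholders, not the actual code from this post, which classifies the real MNIST data:

```python
import numpy as np

np.random.seed(0)

def init_params(layer_dims):
    # For each layer l, W[l] has shape (n_l, n_{l-1}); He-style scaling
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = (np.random.randn(layer_dims[l], layer_dims[l - 1])
                                * np.sqrt(2.0 / layer_dims[l - 1]))
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

def relu(Z):
    return np.maximum(0, Z)

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def forward(X, params):
    # Vectorized forward pass over a whole batch: ReLU in the hidden
    # layers, sigmoid at the output layer
    L = len(params) // 2
    A = X
    for l in range(1, L):
        A = relu(params["W" + str(l)] @ A + params["b" + str(l)])
    return sigmoid(params["W" + str(L)] @ A + params["b" + str(L)])

# Toy batch: 5 flattened 28x28 "images" (784 features), stacked as columns
X = np.random.rand(784, 5)
params = init_params([784, 20, 7, 1])
print(forward(X, params).shape)  # (1, 5)
```

Because every example in the batch is a column of `X`, a single matrix multiply per layer processes the whole batch at once, which is what makes the implementation vectorized.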
Mar-24-2018, 17:37:22 GMT