Deep Big Simple Neural Nets Excel on Handwritten Digit Recognition

Dan Claudiu Ciresan, Ueli Meier, Luca Maria Gambardella, Juergen Schmidhuber

arXiv.org Artificial Intelligence 

Good old on-line back-propagation for plain multi-layer perceptrons yields a very low 0.35% error rate on the famous MNIST handwritten digits benchmark. All we need to achieve this best result so far are many hidden layers, many neurons per layer, numerous deformed training images, and graphics cards to greatly speed up learning.
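The following is a minimal sketch of the core idea described in the abstract: a plain deep multi-layer perceptron trained with on-line (per-sample) back-propagation. The layer sizes, tanh hidden units, learning rate, and random stand-in data are illustrative assumptions, not the authors' exact settings, and the deformed-image pipeline and GPU acceleration are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical architecture: 784 MNIST-sized inputs, wide tanh hidden layers, 10 outputs.
layer_sizes = [784, 1000, 500, 10]
weights = [rng.uniform(-0.05, 0.05, (m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    """Return the activations of every layer: tanh hidden units, softmax output."""
    acts = [x]
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = acts[-1] @ W + b
        if i < len(weights) - 1:
            acts.append(np.tanh(z))
        else:
            e = np.exp(z - z.max())          # numerically stable softmax
            acts.append(e / e.sum())
    return acts

def backprop_step(x, target, lr):
    """One on-line gradient step on a single (image, one-hot label) pair."""
    acts = forward(x)
    delta = acts[-1] - target                # softmax + cross-entropy gradient
    for i in reversed(range(len(weights))):
        grad_W = np.outer(acts[i], delta)
        if i > 0:
            # Propagate the error through the old weights and the tanh derivative.
            delta_prev = (weights[i] @ delta) * (1.0 - acts[i] ** 2)
        weights[i] -= lr * grad_W
        biases[i] -= lr * delta
        if i > 0:
            delta = delta_prev

# Toy usage: random vectors stand in for (deformed) MNIST training images.
for step in range(100):
    x = rng.random(784)
    y = np.zeros(10)
    y[rng.integers(10)] = 1.0
    backprop_step(x, y, lr=1e-3)
```

In practice the paper's result also depends on continually generating deformed variants of the training digits and on running the training loop on graphics cards; the sketch above only shows the plain back-propagation core.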
