Machine Learning: Regularization Techniques

#artificialintelligence 

A sufficiently complex neural network can achieve 100% accuracy on the data it was trained with, yet show significant error on any new data. When this happens, the network is likely overfitting the training data: its predictions are tied too strongly to features it learned during training that don't necessarily generalize to unseen examples. One way to temper overfitting is a process called regularization, which generally works by penalizing a neural network for complexity.
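As a minimal sketch of the idea, the snippet below adds an L2 (weight decay) penalty to an ordinary training loss in PyTorch. The model architecture, the `l2_lambda` value, and the helper function are illustrative assumptions, not taken from a specific implementation; the key point is that the total loss grows with the size of the weights, so the optimizer is nudged toward simpler solutions.

```python
import torch
import torch.nn as nn

# A small network that could easily overfit a limited training set.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))

criterion = nn.MSELoss()
l2_lambda = 1e-3  # regularization strength (hypothetical value)

def regularized_loss(predictions, targets):
    # Base loss measures how well the network fits the training data.
    data_loss = criterion(predictions, targets)
    # The L2 penalty sums the squared weights, so larger (more complex)
    # parameter values make the total loss worse.
    l2_penalty = sum((p ** 2).sum() for p in model.parameters())
    return data_loss + l2_lambda * l2_penalty
```

Many optimizers expose the same behavior directly (for example, the `weight_decay` argument in `torch.optim.SGD`), which is equivalent to adding this penalty by hand.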
