LocalDrop: A Hybrid Regularization for Deep Neural Networks

Ziqing Lu, Chang Xu, Bo Du, Takashi Ishida, Lefei Zhang, Masashi Sugiyama

arXiv.org Artificial Intelligence 

Abstract--In neural networks, developing regularization algorithms to settle overfitting is one of the major study areas. We propose a new approach for the regularization of neural networks by the local Rademacher complexity called LocalDrop. A new regularization function for both fully-connected networks (FCNs) and convolutional neural networks (CNNs), including drop rates and weight matrices, has been developed based on the proposed upper bound of the local Rademacher complexity by strict mathematical deduction. The analyses of dropout in FCNs and DropBlock in CNNs with keep rate matrices in different layers are also included in the complexity analyses. With the new regularization function, we establish a two-stage procedure to obtain the optimal keep rate matrix and weight matrix to realize the whole training model. Extensive experiments have been conducted to demonstrate the effectiveness of LocalDrop in different models by comparing it with several algorithms and the effects of different hyperparameters on the final performances.

Neural networks have lately shown impressive performance in sophisticated real-world situations, including image classification [1], object recognition [2] and image captioning [3]. Low, middle and high level features are integrated into deep neural networks, which are usually trained in an end-to-end manner.
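To make the role of per-layer keep rates concrete, the following is a minimal sketch of standard (inverted) dropout applied to FCN activations with a different keep rate per layer. This illustrates only the generic dropout mechanism referenced in the abstract, not the LocalDrop regularizer itself; the function name and the example keep rates are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, keep_rate):
    """Inverted dropout: retain each unit with probability `keep_rate`.

    Dividing by `keep_rate` rescales the surviving units so the expected
    activation is unchanged at training time (no rescaling needed at test).
    """
    mask = rng.random(x.shape) < keep_rate
    return x * mask / keep_rate

# Hypothetical per-layer keep rates, one entry per hidden layer.
keep_rates = [0.9, 0.8]

x = np.ones((4, 8))  # a toy batch of activations
for p in keep_rates:
    x = dropout_forward(x, p)
```

After both layers, each entry of `x` is either 0 (dropped at some layer) or 1/(0.9 * 0.8) (survived both), so the mean of `x` stays close to 1 in expectation.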
