

747d3443e319a22747fbb873e8b2f9f2-Supplemental.pdf

Neural Information Processing Systems

It can be derived that the posterior process f|O is also a GP; we denote its mean function and kernel function as µ_n and κ_n, respectively. To reduce time consumption and take advantage of parallelization, we train several different networks at a time. When selecting the first BSSC, Equation 2 can be used directly. Therefore, we use the expected value of the EI function (EEI, [4]) instead. ResNet18/50 consists of 6 stages, as illustrated in Figure 1.
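The excerpt above refers to the GP posterior (µ_n, κ_n) and to the expected-improvement (EI) acquisition used in Bayesian optimization. Below is a minimal sketch of both, assuming a zero-mean GP with an RBF kernel; the kernel choice and all function names are illustrative assumptions, not taken from the paper (the EEI variant cited as [4] averages EI over remaining uncertainty and is not reproduced here):

```python
import numpy as np
from math import erf, sqrt, pi, exp

def rbf(A, B, length_scale=1.0):
    """Squared-exponential kernel matrix between row-vector sets A and B."""
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Posterior mean mu_n and covariance kappa_n of a zero-mean GP at test
    points Xs, conditioned on observations O = (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    cov = rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)
    return mu, cov

def expected_improvement(mu, sigma, best):
    """EI for maximization: E[max(f - best, 0)] with f ~ N(mu, sigma^2)."""
    z = (mu - best) / max(sigma, 1e-12)
    cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))
    pdf = exp(-0.5 * z * z) / sqrt(2.0 * pi)
    return (mu - best) * cdf + sigma * pdf
```

At an observed training point the posterior mean reverts to the observation (up to the noise term), and EI is always nonnegative, which makes it a well-behaved acquisition function to maximize.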



AutoBSS: An Efficient Algorithm for Block Stacking Style Search

Neural Information Processing Systems

Neural network architecture design has mostly focused on new convolutional operators or the special topological structure of network blocks, while little attention has been paid to the configuration in which blocks are stacked, called the Block Stacking Style (BSS). Recent studies show that BSS can also have a non-negligible impact on networks, so we design an efficient algorithm to search it automatically. The proposed method, AutoBSS, is a novel AutoML algorithm based on Bayesian optimization that iteratively refines and clusters the Block Stacking Style Code (BSSC); it can find an optimal BSS in a few trials without biased evaluation.
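The clustering step in the abstract can be illustrated with a toy example: encode each candidate BSS as a vector of per-stage block counts (e.g. ResNet50's stacking of [3, 4, 6, 3]) and cluster the candidates so that only one representative per cluster needs a costly training trial. The encoding, the use of plain k-means, and all names below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical BSSC encoding: number of blocks stacked in each of 4 stages.
candidates = rng.integers(1, 9, size=(40, 4)).astype(float)

def kmeans(X, k=4, iters=25):
    """Plain k-means; returns per-point cluster labels and the centers."""
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

labels, centers = kmeans(candidates)

# One representative BSSC per cluster: the candidate closest to its center.
reps = []
for j in range(len(centers)):
    members = np.where(labels == j)[0]
    if members.size:
        d = ((candidates[members] - centers[j]) ** 2).sum(1)
        reps.append(candidates[members[np.argmin(d)]])
```

Evaluating only the representatives in `reps` instead of all 40 candidates keeps the trial count small, in the spirit of the abstract's claim of finding a good BSS in a few trials.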




Review for NeurIPS paper: AutoBSS: An Efficient Algorithm for Block Stacking Style Search

Neural Information Processing Systems

This paper initially received mixed recommendations: three positive and one negative. The reviewers agree that the paper addresses an important problem in neural network architecture design. The experiments are comprehensive and the results are good. However, one reviewer had concerns about the experimental justification and gave a weak reject. This concern was addressed by the additional experiments in the authors' response.



AutoBSS: An Efficient Algorithm for Block Stacking Style Search

Zhang, Yikang, Zhang, Jian, Zhong, Zhao

arXiv.org Artificial Intelligence

Neural network architecture design has mostly focused on new convolutional operators or the special topological structure of network blocks, while little attention has been paid to the configuration in which blocks are stacked, called the Block Stacking Style (BSS). Recent studies show that BSS can also have a non-negligible impact on networks, so we design an efficient algorithm to search it automatically. The proposed method, AutoBSS, is a novel AutoML algorithm based on Bayesian optimization that iteratively refines and clusters the Block Stacking Style Code (BSSC); it can find an optimal BSS in a few trials without biased evaluation. On the ImageNet classification task, ResNet50/MobileNetV2/EfficientNet-B0 with our searched BSS achieve 79.29%/74.5%/77.79%, outperforming the original baselines by a large margin. More importantly, experimental results on model compression, object detection and instance segmentation show the strong generalizability of the proposed AutoBSS, and further verify the non-negligible impact of BSS on neural networks.