Reviews: Meta Architecture Search
Neural Information Processing Systems
The authors propose Bayesian Meta Architecture Search (BASE), a method for meta-learning neural network architectures and their weights across tasks. The paper frames this as a Bayesian inference problem and employs Gumbel-Softmax, the reparametrization trick, and optimization embedding, a variational inference method, to optimize a distribution over neural network architectures and their weights across different tasks.

Originality: Meta-learning neural network architectures is a very natural next step for NAS research, and it has not been done so far (at least I am not aware of any such work). It is not only natural but also important, as it makes NAS more scalable and of greater practical relevance. The Bayesian view, however, is not really novel, but rather an obvious extension of [1]. In general, the related work section is very short and does not provide a proper summary of the current state of the art in this field of research.

Quality: BASE is well motivated and derived.
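For context, the Gumbel-Softmax relaxation the summary refers to can be sketched as follows. This is a generic illustration of sampling a relaxed categorical choice (e.g. over candidate operations in a NAS search space), not the authors' implementation; the function name and example logits are hypothetical.

```python
import numpy as np

def gumbel_softmax_sample(logits, temperature=1.0, rng=None):
    """Draw a differentiable (relaxed) sample from a categorical
    distribution, e.g. over candidate operations in a NAS cell."""
    rng = rng or np.random.default_rng(0)
    # Gumbel(0, 1) noise via inverse transform sampling
    u = rng.uniform(low=1e-10, high=1.0, size=len(logits))
    g = -np.log(-np.log(u))
    # Softmax over perturbed logits; as temperature -> 0 the sample
    # approaches a one-hot vector, while staying differentiable in logits
    z = (logits + g) / temperature
    e = np.exp(z - z.max())
    return e / e.sum()

# Relaxed choice among three hypothetical candidate operations
probs = gumbel_softmax_sample(np.array([2.0, 0.5, -1.0]), temperature=0.5)
print(probs)  # a probability vector summing to 1
```

Because the noise is injected additively and the softmax is smooth, gradients flow through the architecture logits, which is what allows a distribution over discrete architectures to be optimized with standard reparametrization-based variational inference.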