Review for NeurIPS paper: Meta-Learning with Adaptive Hyperparameters

Neural Information Processing Systems 

Summary and Contributions: Updated review: After reading the rebuttal, I believe this is a robust empirical paper worth publishing, for the following reasons:
- The results are very strong and surprising, challenging emerging hypotheses about generalisation even on image datasets, which are very well studied.
- The paper convincingly shows that betting on meta-learned adaptation does indeed generalise better than packaging non-adaptive priors via pre-training. This was not at all clear before this paper; indeed, two reviewers (myself included) needed some convincing to believe it, and that such confusion arose in the first place, including over the title, speaks to how counterintuitive the result is.
- The priors needed for state-of-the-art generalisation on these well-studied datasets can be "packed" into the learning rule itself via meta-learning.