Reviews: Scalable Hyperparameter Transfer Learning
Neural Information Processing Systems
This paper proposes a novel Bayesian optimization approach that performs transfer learning across tasks while remaining scalable.

Originality: This is very original work. Bayesian optimization can work with any probabilistic regression algorithm, so the use of Bayesian linear regression to improve scalability is well known, as are its limitations (e.g., it does not extrapolate well). The main novelty lies in the extension to multi-task learning, which allows the method to benefit from prior evaluations on previous tasks. When such evaluations are available, this can provide a significant advantage.
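To make the scalability point concrete, here is a minimal sketch of Bayesian linear regression used as a surrogate model, the kind of setup the review refers to. This is an illustrative example, not the paper's actual method: the feature map, prior precision `alpha`, and noise precision `beta` below are assumed placeholders.

```python
import numpy as np

def blr_posterior(Phi, y, alpha=1.0, beta=25.0):
    """Posterior mean and covariance over weights for Bayesian linear
    regression. Phi: (n, d) feature matrix; y: (n,) targets.
    alpha: prior precision on weights; beta: observation noise precision."""
    d = Phi.shape[1]
    S_inv = alpha * np.eye(d) + beta * Phi.T @ Phi
    S = np.linalg.inv(S_inv)          # posterior covariance
    m = beta * S @ Phi.T @ y          # posterior mean
    return m, S

def blr_predict(Phi_new, m, S, beta=25.0):
    """Predictive mean and variance at new feature rows; cost is linear
    in the number of observations, unlike a full Gaussian process."""
    mu = Phi_new @ m
    var = 1.0 / beta + np.einsum("ij,jk,ik->i", Phi_new, S, Phi_new)
    return mu, var

# Toy usage: fit a quadratic objective from noisy evaluations.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=20)
y = x**2 + 0.02 * rng.normal(size=20)
feats = lambda x: np.stack([np.ones_like(x), x, x**2], axis=-1)
m, S = blr_posterior(feats(x), y)
mu, var = blr_predict(feats(np.array([0.0, 0.5])), m, S)
```

The predictive variance is what an acquisition function (e.g., expected improvement) would consume; because inference reduces to a d-by-d linear solve, the surrogate scales to many more evaluations than a full Gaussian process.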
Oct-7-2024, 05:37:15 GMT