Weight Agnostic Neural Networks
In this work we introduced a method to search for simple neural network architectures with strong inductive biases for performing a given task. Since the networks are optimized to perform well using a single weight parameter over a range of values, this single parameter can easily be tuned to increase performance. Individual weight values can then be further tuned as offsets from the best shared weight. The ability to quickly fine-tune weights is useful in few-shot learning and may find uses in continual lifelong learning, where agents continually acquire, fine-tune, and transfer skills throughout their lifespan. Early works connected the evolution of weight-tolerant networks to the Baldwin effect.
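The two-stage tuning described above can be sketched in a few lines. The following is a hypothetical toy illustration, not the paper's actual search code: the topology, masks, task, and the accept-if-better search are all stand-in assumptions. It shows (1) sweeping a single shared weight over a range of values on a fixed sparse topology, then (2) fine-tuning individual weights as offsets from the best shared value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed, hand-picked sparse topology (a toy stand-in for an evolved
# architecture): 2 inputs -> 3 tanh hidden units -> 1 output.
mask1 = np.array([[1., 0., 1.],
                  [0., 1., 1.]])      # which input->hidden links exist
mask2 = np.array([[1.], [1.], [0.]])  # which hidden->output links exist

def forward(X, w1, w2):
    return np.tanh(X @ w1) @ w2

def loss(w1, w2, X, y):
    return float(np.mean((forward(X, w1, w2) - y) ** 2))

# Toy regression task standing in for a real benchmark: y = x0 - x1.
X = rng.normal(size=(64, 2))
y = X[:, :1] - X[:, 1:2]

# Stage 1: every connection shares ONE weight value; sweep that single
# parameter over a range and keep the best-performing value.
candidates = np.linspace(-2.0, 2.0, 41)
best_w = min(candidates, key=lambda w: loss(mask1 * w, mask2 * w, X, y))
shared_loss = loss(mask1 * best_w, mask2 * best_w, X, y)

# Stage 2: fine-tune individual weights as offsets from the shared weight,
# here with a simple accept-if-better random search (any optimizer works).
d1, d2 = np.zeros_like(mask1), np.zeros_like(mask2)
tuned_loss = shared_loss
for _ in range(500):
    p1 = d1 + 0.05 * rng.normal(size=d1.shape) * mask1
    p2 = d2 + 0.05 * rng.normal(size=d2.shape) * mask2
    l = loss(mask1 * best_w + p1, mask2 * best_w + p2, X, y)
    if l < tuned_loss:
        d1, d2, tuned_loss = p1, p2, l

print(f"shared-weight loss: {shared_loss:.4f}  fine-tuned loss: {tuned_loss:.4f}")
```

Because the offsets start at zero and changes are only accepted when the loss improves, the fine-tuned network is never worse than the best single-weight network, mirroring the "offsets from the best shared weight" idea.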