Weight Agnostic Neural Networks

In this work we introduced a method to search for simple neural network architectures with strong inductive biases for performing a given task. Since the networks are optimized to perform well using a single shared weight parameter over a range of values, this single parameter can easily be tuned to increase performance. Individual weight values can then be further tuned as offsets from the best shared weight. The ability to quickly fine-tune weights is useful in few-shot learning and may find uses in continual lifelong learning, where agents continually acquire, fine-tune, and transfer skills throughout their lifespan. Early works connected the evolution of weight-tolerant networks to the Baldwin effect.
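The two-stage tuning described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy regression task, the fixed four-connection topology, and the search hyperparameters are all assumptions chosen for brevity.

```python
import numpy as np

# Hypothetical toy setup: a fixed network whose every connection initially
# uses the same shared weight. Task: fit y = x on a few sample points.
rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 20)
Y = X  # identity target

def forward(x, weights):
    # Fixed topology: 1 input -> 2 tanh hidden units -> 1 linear output.
    # weights = [in->h1, in->h2, h1->out, h2->out]
    h = np.tanh(np.outer(x, weights[:2]))  # hidden activations, shape (n, 2)
    return h @ weights[2:]                 # output, shape (n,)

def loss(weights):
    return float(np.mean((forward(X, weights) - Y) ** 2))

# Stage 1: sweep a single shared weight over a range of values and keep
# the value that performs best when used for every connection.
candidates = np.linspace(-2.0, 2.0, 81)
shared = min(candidates, key=lambda w: loss(np.full(4, w)))

# Stage 2: fine-tune individual weights as offsets from the best shared
# weight, here via simple random-perturbation hill climbing.
offsets = np.zeros(4)
for _ in range(200):
    trial = offsets + rng.normal(0.0, 0.1, size=4)
    if loss(shared + trial) < loss(shared + offsets):
        offsets = trial

# Fine-tuning can only accept improvements, so it never does worse
# than the shared-weight baseline.
assert loss(shared + offsets) <= loss(np.full(4, shared))
```

Because the offsets start at zero, stage 2 begins exactly at the shared-weight solution and monotonically improves on it, which is what makes the shared weight a useful starting point for quick fine-tuning.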
