Are Weights Really Important to Neural Networks?
Architecture and weights are two essential components of an artificial neural network. Architecture is akin to the innate human brain: it defines the network's initial configuration, including hyperparameters, layers, and node connections (or wiring). Weights, meanwhile, are the relative strengths of the connections between nodes after model training, which can be likened to a human brain that has learned, for example, how to multiply numbers or speak French.

As in the age-old "nature versus nurture" debate, AI researchers want to know whether architecture or weights plays the larger role in a neural network's performance. In a blow to the "nurture" side, Google researchers have now demonstrated that a neural network that has not learned weights through training can still achieve satisfactory results on machine learning tasks.

Google Brain researchers Adam Gaier and David Ha said their idea was inspired by precocial behaviors that have evolved in nature, explaining in a blog post: "In biology, precocial species are those whose young already possess certain abilities from the moment of birth. There is evidence to show that lizard and snake hatchlings already possess behaviors to escape from predators. Shortly after hatching, ducks are able to swim and eat on their own, and turkeys can visually recognize predators."
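The core trick in Gaier and Ha's work is to evaluate an architecture without training any individual weights: every connection in the network shares one single weight value, and the architecture is scored across a range of such values. The sketch below illustrates that idea on a made-up toy network; the topology, inputs, and weight samples are illustrative assumptions, not the networks from the paper.

```python
import numpy as np

def forward(x, shared_w):
    """Evaluate a tiny fixed-topology network in which every connection
    uses the same single shared weight value (no per-connection training)."""
    h = np.tanh(x * shared_w)            # input -> hidden, all weights = shared_w
    out = np.tanh(np.sum(h) * shared_w)  # hidden -> output, same shared weight
    return float(out)

# Weight-agnostic evaluation: instead of learning weights, score the
# architecture over a range of shared weight values and average the result,
# so performance reflects the wiring rather than any tuned parameters.
weight_samples = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
x = np.array([0.5, -0.25, 1.0, 0.0])     # toy input
scores = [forward(x, w) for w in weight_samples]
avg_score = float(np.mean(scores))
```

An architecture that scores well on a task across many shared-weight values encodes the solution in its wiring, which is the sense in which it "already possesses certain abilities" before any training.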
Jun-14-2019, 15:57:41 GMT