Artificial neural networks learn better when they spend time not learning at all: Periods off-line during training mitigated 'catastrophic forgetting' in computing systems
"The brain is very busy when we sleep, repeating what we have learned during the day," said Maxim Bazhenov, PhD, professor of medicine and a sleep researcher at University of California San Diego School of Medicine. "Sleep helps reorganize memories and presents them in the most efficient way."

In previously published work, Bazhenov and colleagues reported how sleep builds rational memory, the ability to remember arbitrary or indirect associations between objects, people or events, and protects against forgetting old memories.

Artificial neural networks leverage the architecture of the human brain to improve numerous technologies and systems, from basic science and medicine to finance and social media. In some ways, they have achieved superhuman performance, such as computational speed, but they fail in one key aspect: when artificial neural networks learn sequentially, new information overwrites previous information, a phenomenon called catastrophic forgetting.
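The sequential-overwriting effect described above can be demonstrated with a toy model. The sketch below is purely illustrative and is not the network or method used in the study: it trains a minimal logistic-regression "network" on one task, then on a second, conflicting task, and checks that performance on the first task collapses. All names and parameters here are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, X, y, lr=0.5, epochs=200):
    # Plain gradient descent on a logistic loss -- a stand-in for a
    # larger network; hyperparameters are illustrative, not from the study.
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(np.mean(((X @ w) > 0) == y))

# Task A: label = (feature 0 is positive).
# Task B: the reverse labeling -- an extreme case of two tasks whose
# optimal weights conflict.
X = rng.normal(size=(200, 2))
yA = (X[:, 0] > 0).astype(float)
yB = (X[:, 0] < 0).astype(float)

w = np.zeros(2)
w = train(w, X, yA)
acc_A_before = accuracy(w, X, yA)  # high: task A has been learned

w = train(w, X, yB)                # sequential training on task B...
acc_A_after = accuracy(w, X, yA)   # ...overwrites task A: catastrophic forgetting
```

With no interleaved replay or off-line consolidation phase, the second round of training freely repurposes the same weights, so accuracy on task A drops sharply; replay-style remedies work precisely by revisiting old examples (or sleep-like reactivations of them) between new updates.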
Nov-19-2022, 00:43:29 GMT