New computational algorithms make it possible to build neural networks with many input nodes and many layers; the term "deep learning" distinguishes work on these large, many-layered networks from earlier work on artificial neural nets.
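To make the "many layers" idea concrete, here is a minimal sketch of a feedforward network with several stacked layers. The layer sizes, random weights, and ReLU nonlinearity are illustrative choices, not details from the article.

```python
import numpy as np

# Illustrative only: a tiny "deep" network is just several stacked layers
# of weights, each followed by a nonlinearity.
rng = np.random.default_rng(0)
layer_sizes = [8, 16, 16, 4]  # many inputs, multiple hidden layers, outputs
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x, weights):
    """Pass an input vector through every layer in turn."""
    for w in weights:
        x = np.maximum(0.0, x @ w)  # ReLU nonlinearity between layers
    return x

out = forward(rng.standard_normal(8), weights)
print(out.shape)  # (4,)
```

Training such a network means adjusting every weight matrix by gradient descent; what changed recently is that hardware and algorithms now make this feasible for networks with millions of inputs and dozens of layers.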
Earlier this year, Lisa Halliday published her first novel, Asymmetry. The book, which has been rightly raved about, consists of two seemingly unrelated novellas and a coda that elegantly and gently ties them together. The second part of the novel is the first-person account of an Iraqi American who has been detained in customs at Heathrow airport. It's excellent, pulling this reader in even as she groused that the first part was over. That first part of the novel is a kind of high-water mark of literary delectability: the story of a witty and lovely May-December romance, begun over Mister Softee, and including baseball games, sex, the evaporation of sex, and many a Searle coat, all shot through with the thrill of reading great literary gossip.
Coping with harsh conditions, rather than social challenges, was chiefly responsible for boosting the size of our brains, a new study has found. The research found that 'ecological' challenges like finding food and lighting fires boosted the capacity of our ancestors to think ahead. The finding may settle a decades-long debate on the origins of human intelligence and our social relationships, scientists said. According to the new research, the human brain got so big because life was tough on the African savannah around two million years ago. The human brain has tripled in size compared to that of our ancestor Australopithecus afarensis, which roamed the Earth more than 3 million years ago.
"I have fairly strong opinions about the utility of trying to divide social and ecological challenges into mutually exclusive categories," she said. "After reading the paper several times, I am still convinced that the model and results are more consistent with a 'socio-ecological' brain. Their argument that they find support for an ecological and not social brain just isn't supported by their results."
A team of researchers in Japan has managed to recreate a tiny portion of a human brain, piece by piece, with more precision than ever before. The team connected networks of neurons, the pathways along which information travels through our brains, with remarkable accuracy. It could be the first step toward the creation of brains in the lab that mirror our own, although millions of connected neurons are needed to perform even basic tasks. Researchers from the University of Tokyo examined how neurons behave and found that they could be trained to join with one another using a 'synthetic neuron-adhesive material', their term for the microscopic plates they developed.
Institute Professor Ann Graybiel, a professor in the Department of Brain and Cognitive Sciences and member of MIT's McGovern Institute for Brain Research, is being recognized by the Gruber Foundation for her work on the structure, organization, and function of the once-mysterious basal ganglia. She was awarded the prize alongside Okihide Hikosaka of the National Institutes of Health's National Eye Institute and Wolfram Schultz of the University of Cambridge in the U.K. The basal ganglia have long been known to play a role in movement, and the work of Graybiel and others helped to extend their roles to cognition and emotion. Dysfunction in the basal ganglia has been linked to a host of disorders, including Parkinson's disease, Huntington's disease, obsessive-compulsive disorder and attention-deficit hyperactivity disorder, and to depression and anxiety disorders. Graybiel's research focuses on the circuits thought to underlie these disorders, and on how these circuits act to help us form habits in everyday life. "We are delighted that Ann has been honored with the Gruber Neuroscience Prize," says Robert Desimone, director of the McGovern Institute.
The ancient human relative Homo naledi might have had a pint-sized brain, but when it came to hosting complex features, that tiny package had almost everything we see in the brains of modern-day humans. The theory comes from a new study of skull fragments of the ancient species. The fragments bear brain impressions, hinting that despite being small, the brains of our ancient relatives had a structure and shape very similar to our own, which is roughly three times larger. Only a third the size of human brains, the brains of Homo naledi nonetheless had some surprisingly human-like features.
Intel said this week that a system based on its Loihi chip, planned for 2019, will include the equivalent of 100 billion synapses, roughly the brain complexity of a common mouse. Last September, Intel introduced the world to Loihi, a chip designed for what Intel calls probabilistic computing, which the company sees as an important step on the road to artificial intelligence. Unlike a Core chip, which uses a sequential pipeline of instructions, Loihi is designed to mimic the way the brain works. The version of the Loihi chip that Intel introduced last year included 130,000 silicon "neurons" connected by 130 million "synapses," the junctions that connect neurons within the brain.
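Chips like Loihi compute with spiking neurons rather than a sequential instruction pipeline. The article doesn't describe Intel's neuron model, so as a hedged illustration, here is the generic leaky integrate-and-fire model often used to explain spiking hardware; the leak, threshold, and input values are arbitrary example numbers, not Loihi parameters.

```python
def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """One timestep of a leaky integrate-and-fire neuron.

    The membrane potential v decays (leak), integrates its input,
    and emits a spike (1) when it crosses the threshold, then resets.
    """
    v = leak * v + input_current
    if v >= threshold:
        return 0.0, 1  # reset potential, emit spike
    return v, 0

# Drive the neuron with a constant input and collect its spike train.
v, spikes = 0.0, []
for _ in range(20):
    v, s = lif_step(v, input_current=0.3)
    spikes.append(s)
print(sum(spikes))  # neuron fires periodically: 5 spikes in 20 steps
```

The key contrast with a conventional pipeline is that state (the membrane potential) lives at each neuron and communication happens only via sparse spike events, which is what makes the architecture brain-like and power-efficient.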
If you have to walk a different route to the shops, it's normally not too much of a stretch to consult your 'inner satnav' and chart a new course. That's because the human brain has a range of built-in mechanisms that help you find your way. But the underlying brain computation that goes into even simple navigation, such as planning the most direct route between points A and B, remains pretty murky. A team from Google DeepMind and University College London in the United Kingdom has trained a form of artificial intelligence to traverse a virtual environment from one point to another. The computer program, described in the journal Nature today, developed "neurons" similar to "grid cells", the brain cells found in mammals that bestow navigation skills.
The Boston Celtics routed the Cavaliers in Game 1 of the Eastern Conference Finals, but there was at least one brief moment of hope for Cleveland. A Jeff Green jumper at the end of the third quarter cut the Celtics' 28-point lead in half, and the Cavs found themselves within punching distance … until Boston rattled off a 7-0 run to open the fourth and put the game away for good. LeBron remembers that burst all too well, as evidenced by the above video from the postgame press conference. Feel free to check his work, but he's pretty on the money. While LeBron's computer brain is certainly impressive, he does make one error.
As of spring 2018, the fastest computer is the Sunway TaihuLight in Wuxi, China. It has 10,649,600 processing cores, clustered in groups of 260, delivering an overall performance of 125.44 petaFLOPS (quadrillions of floating-point operations per second) and requiring some 20 MW of power. In the US, the National Strategic Computing Initiative aims at developing the first exascale computer (8 times faster than the Sunway TaihuLight), and the race is on against China, South Korea, and Europe. We might see the winner this year (next month the Top500 list will be revised; it happens twice a year). These supercomputers are used today to study the Earth's climate and earthquakes, simulate weapons effects, design new drugs, and simulate the folding of proteins.
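The "8 times faster" figure follows directly from the units: one exaFLOPS is 1,000 petaFLOPS, so the quick arithmetic below checks it against TaihuLight's 125.44 petaFLOPS.

```python
# Sanity-check the "8 times faster" claim from the numbers given.
taihulight_pflops = 125.44   # Sunway TaihuLight sustained performance
exascale_pflops = 1000.0     # 1 exaFLOPS = 1000 petaFLOPS = 10**18 FLOPS

ratio = exascale_pflops / taihulight_pflops
print(round(ratio, 2))  # 7.97, i.e. roughly 8x
```

So a machine that just crosses the exascale threshold would indeed be about eight times faster than TaihuLight.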