A Computer Binge-Watched TV And Learned To Predict What Happens Next

NPR Technology

Examples of the machine's forecasts of actions before they begin (starting at left) and the results (at right).

You watch hundreds of hours of television, they call you a lazy slob. A computer does it, and it's a technological success story. That is the case for a new algorithm from MIT's Computer Science and Artificial Intelligence Laboratory.


MIT Creates AI Able to See Two Seconds Into the Future

#artificialintelligence

When we see two people meet, we can often predict what happens next: a handshake, a hug, or maybe even a kiss. Our ability to anticipate actions is thanks to intuitions born out of a lifetime of experiences. Machines, on the other hand, have trouble making use of complex knowledge like that. Computer systems that predict actions would open up new possibilities ranging from robots that can better navigate human environments, to emergency response systems that predict falls, to Google Glass-style headsets that feed you suggestions for what to do in different situations. This week researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have made an important new breakthrough in predictive vision, developing an algorithm that can anticipate interactions more accurately than ever before.


Teaching machines to predict the future

#artificialintelligence

When we see two people meet, we can often predict what happens next: a handshake, a hug, or maybe even a kiss. Our ability to anticipate actions is thanks to intuitions born out of a lifetime of experiences. Machines, on the other hand, have trouble making use of complex knowledge like that. Computer systems that predict actions would open up new possibilities ranging from robots that can better navigate human environments, to emergency response systems that predict falls, to Google Glass-style headsets that feed you suggestions for what to do in different situations. This week researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have made an important new breakthrough in predictive vision, developing an algorithm that can anticipate interactions more accurately than ever before.


'Big Bang Theory,' 'The Office' help couch-potato robots predict the future: MIT

#artificialintelligence

Remember the Jetsons' robot maid, Rosie? Massachusetts Institute of Technology researchers think her future real-life incarnations can learn a thing or two from Steve Carell and other sitcom stars. MIT says a computer that binge-watched YouTube videos and TV shows such as The Office, Big Bang Theory and Desperate Housewives learned how to predict whether the actors were about to hug, kiss, shake hands or slap high fives -- advances that eventually could help the next generation of artificial intelligence function less clumsily. "It could help a robot move more fluidly through your living space," lead researcher Carl Vondrick told The Associated Press in an interview. "The robot won't want to start pouring milk if it thinks you're about to pull the glass away."
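
To make the prediction task concrete, here is a minimal sketch of how one might frame it: a model looks at video frames captured before an interaction and scores which of the four greetings mentioned above (hug, kiss, handshake, high five) is about to happen. This is an illustrative assumption, not the CSAIL team's actual model; the `GreetingAnticipator` class, the ResNet-18 backbone, and the plain supervised setup are stand-ins chosen for clarity.

```python
# Minimal, illustrative sketch of action anticipation from a single video
# frame: encode the frame with a CNN backbone, then classify which of four
# greetings is about to occur. NOT the CSAIL method; everything here is a
# simplified assumption for illustration.
import torch
import torch.nn as nn
import torchvision

GREETINGS = ["hug", "kiss", "handshake", "high five"]  # classes named in the article

class GreetingAnticipator(nn.Module):
    def __init__(self, num_classes: int = len(GREETINGS)):
        super().__init__()
        # ResNet-18 as a frame encoder (weights=None keeps the sketch
        # self-contained; in practice one would start from pretrained
        # weights and fine-tune on frames paired with the action that follows).
        backbone = torchvision.models.resnet18(weights=None)
        backbone.fc = nn.Identity()        # expose the 512-d feature vector
        self.backbone = backbone
        self.head = nn.Linear(512, num_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, 3, 224, 224) RGB frames taken *before* the action starts
        features = self.backbone(frames)   # (batch, 512)
        return self.head(features)         # unnormalized scores per greeting

if __name__ == "__main__":
    model = GreetingAnticipator()
    frame = torch.randn(1, 3, 224, 224)    # stand-in for a preprocessed TV frame
    with torch.no_grad():
        scores = model(frame)
    print("Predicted upcoming action:", GREETINGS[scores.argmax(dim=1).item()])
```

Treat this purely as a framing of the prediction problem the articles describe; the researchers' actual system, trained by "binge-watching" unlabeled TV and YouTube footage, is considerably more involved than a single-frame classifier.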

