You're sitting in the bleachers as the roar of powerful engines rises in the distance; seconds later, a pack of Ferrari race cars speeds past and you crane your neck, trying to see what position your favourite driver is in. That's the typical experience of car racing fans today, but if a partnership between Intel Corp. and Ferrari Motor Sports is a success, it could be much different tomorrow. At its CES 2018 booth this year, Intel showcased a new system that pairs artificial intelligence with a fleet of video-shooting drones. Not only could it change the fan experience for auto racing, it could also give Ferrari drivers more insight into their performance. Intel's CEO announced the three-year partnership on stage during his keynote.
If you follow the AI world, you've probably heard about AlphaGo. The ancient Chinese game of Go was once thought impossible for machines to play: it has more board positions (roughly 10^170) than there are atoms in the universe. The top grandmasters regularly trounced the best computer Go programs even with absurd (10- or 15-stone!) handicaps, justifying their decisions in terms of abstract strategic concepts – joseki, fuseki, sente, tenuki, balance – that they believed computers would never be able to learn. And researchers spent three painstaking years trying to prove this belief wrong: collecting Go data from expert databases, tuning deep neural network architectures, and developing hybrid strategies honed against people as well as machines.
Recently, Google DeepMind's program AlphaGo Zero achieved superhuman level without any human help – entirely by self-play! Here is the Nature paper explaining the technical details (also available as a PDF: Mastering the Game of Go without Human Knowledge). One of the main reasons for its success was a novel form of reinforcement learning in which AlphaGo learned by playing against itself. The system starts with a neural network that knows nothing about Go. It plays millions of games against itself, tuning the neural network to predict the next move and the eventual winner of each game. The updated neural network is then combined with the Monte Carlo tree search algorithm to create a new, stronger version of AlphaGo Zero, and the process repeats.
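To make the self-play loop concrete, here is a heavily simplified sketch of the idea on a toy game (take 1 or 2 stones from a pile; taking the last stone wins). A tabular move-preference dictionary stands in for AlphaGo Zero's deep network, and plain reinforcement of the winner's moves stands in for the full MCTS-guided policy improvement; every name and number here is illustrative, not part of DeepMind's actual system.

```python
import random
from collections import defaultdict

PILE = 7  # toy game: players alternate removing 1 or 2 stones; taking the last stone wins

def legal_moves(pile):
    return [m for m in (1, 2) if m <= pile]

def play_game(policy):
    """Self-play one game; return the winner (0 or 1) and each player's (state, move) history."""
    pile, player = PILE, 0
    history = {0: [], 1: []}
    while True:
        moves = legal_moves(pile)
        weights = [policy[(pile, m)] for m in moves]
        move = random.choices(moves, weights=weights)[0]  # sample from current policy
        history[player].append((pile, move))
        pile -= move
        if pile == 0:
            return player, history  # this player took the last stone and wins
        player = 1 - player

def train(iterations=5000, lr=0.5):
    """The AlphaGo Zero idea in miniature: start from a uniform policy, play games
    against yourself, and push move preferences toward the moves the winner chose.
    (The real system instead trains a deep network on MCTS-improved move targets
    and game outcomes.)"""
    random.seed(0)
    policy = defaultdict(lambda: 1.0)  # (pile, move) -> unnormalized preference
    for _ in range(iterations):
        winner, history = play_game(policy)
        for player in (0, 1):
            outcome = 1.0 if player == winner else -1.0
            for state, move in history[player]:
                # reinforce winning moves, dampen losing ones (floor keeps exploration alive)
                policy[(state, move)] = max(0.05, policy[(state, move)] + lr * outcome)
    return policy

policy = train()
```

In this toy game the losing positions are the multiples of 3, so from a 7-stone pile the winning move is to take 1 stone; after enough self-play games the learned preference for that move should dominate, with no game knowledge supplied up front – which is the point of the AlphaGo Zero result.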
Despite the rule "never start with a disclaimer", let me do exactly that: there is no universally accepted definition of intelligence, nor an always-applicable test for it. In the field of artificial intelligence (AI), however, we are confronted with the task of defining an AI's level of performance. Because AI runs on computers, we could use the typical parameters, such as availability, response time, throughput, floating-point operations per second, instructions per second, and so on. Although such parameters are important for an AI computer too, they are not an adequate framework for judging an AI's intelligence: a slow computer with smart algorithms can seem "more intelligent" than a fast computer with circuitous ones.
Nintendo has sold a lot of Switches in the past year thanks to the console's unique ability to play games both on a TV and on the go, but also thanks to The Legend of Zelda: Breath of the Wild and Super Mario Odyssey. Though they came from 30-year-old franchises, both games helped millions fall in love with them all over again. In 2018, Nintendo is setting its sights in a direction it hasn't aimed at before: the do-it-yourself crowd. Nintendo Labo is a series of experiences for the Switch that let you (or your kids) build cardboard objects and play games with them. Robots, fishing poles, pianos... there's a lot to build and try here.
At KDnuggets, we try to keep our finger on the pulse of main events and developments in industry, academia, and technology. We also do our best to look forward to key trends on the horizon. To close out 2017, we recently asked some of the leading experts in Big Data, Data Science, Artificial Intelligence, and Machine Learning for their opinion on the most important developments of 2017 and the key trends they expect in 2018. This post considers what happened in Machine Learning & Artificial Intelligence this year, and what may be on the horizon for 2018. "What were the main machine learning & artificial intelligence related developments in 2017, and what key trends do you see in 2018?"
The Boston public broadcaster WGBH has partnered with PBS for another short series in its long-running Nova family of programs. Nova Wonder will follow three researchers exploring big scientific mysteries. Each episode tackles a different complex question: Do animals have a secret language? Which AI technologies could surpass human abilities? How ethical is it to grow life in a lab?
Duncan Jones' next movie won't be coming to theaters -- it's going straight to streaming. The Moon and Warcraft director has revealed that his long-in-the-making sci-fi film noir, Mute, will premiere on Netflix February 23rd. The movie is set in a future Berlin, where a mute bartender (played by Alexander Skarsgård) has to trust a pair of American surgeons (led by Paul Rudd) as he searches for a woman who has disappeared. There's no trailer yet, but in many ways the release itself is the hook -- Netflix is giving Jones a chance the movie might not have gotten through conventional formats. As Jones noted, Mute is his "Don Quixote."
What happens when you take a perfectly good neural network and, figuratively, stick a screwdriver in its brain? You get melancholy glitch-art music videos that turn talking heads into digital puppets. A machine learning developer named Jeff Zito made a series of music videos using a deep learning network based on Face2Face. The underlying technique was originally developed to generate stunningly realistic image transfers -- like controlling a digital Obama in real time with your own facial movements -- but this project takes it in a different direction. Sometimes the best AI isn't good enough.