In March of last year, Google's (Mountain View, California) artificial intelligence (AI) computer program AlphaGo beat the best Go player in the world, 18-time champion Lee Se-dol, in a tournament, winning 4 of 5 games. At first glance this news would seem of little interest to a pathologist, or to anyone else for that matter. After all, many will remember that IBM's (Armonk, New York) computer program Deep Blue beat Garry Kasparov, at the time the greatest chess player in the world, and that was 19 years ago. The rules of the several-thousand-year-old game of Go are extremely simple: the board consists of 19 horizontal and 19 vertical black lines.
Until recently, artificial intelligence (AI) was primarily limited to computer chess players and Jeopardy! contestants. In the last few years, however, the pace of innovation in AI has skyrocketed, driven by tipping points in algorithms, processing power (GPUs), and ever-increasing volumes of data. While the set of use cases for AI is nearly limitless, the Internet of Things is a particularly interesting breeding ground for new AI-driven solutions and experiences, from self-driving cars to intelligent homes to mHealth. In this talk at Bosch ConnectedWorld Chicago, MongoDB's Dev Ittycheria discusses how the massive increase in sensor-driven data will drive the next wave of innovation in AI.
Facebook has teamed up with researchers at the Massachusetts Institute of Technology (MIT) to develop an artificial intelligence (AI) assistant for the world's most popular video game, Minecraft. This isn't an AI that will help you automatically build worlds, but rather one capable of multitasking and helping users with everyday tasks outside a gaming environment. Minecraft was chosen by the researchers because it's currently the most popular game in the world, with more than 90 million people playing it every month, and because it offers "infinite variety" within simple, predictable rules. Plus, there's a big opportunity for the AI assistant to learn inside Minecraft and help human players acquire more knowledge outside the game. "The opportunities for an AI to learn are huge; Facebook is setting itself the task of designing the AI to self-improve, and the researchers think the Minecraft environment is a perfect one to develop this kind of learning," said the report.
Anyone who's experimented with a cloud gaming service knows that a wired Ethernet connection is all but required. At AT&T's Spark conference in San Francisco on Monday, I had a chance to try out Nvidia's GeForce Now service for PCs running over AT&T's 5G service, playing the newly released Shadow of the Tomb Raider on a generic Lenovo ThinkPad. The traditional way to run a PC game is locally: the game runs off a hard drive or SSD in your PC, and the CPU and GPU render it as fast as they can. The downside, of course, is that you have to buy all of that hardware yourself. A cloud service like GeForce Now instead streams the game over the network; the trade-off is that the 3D rendering takes place on a remote server, a cheaper solution than buying a high-end graphics card, at least in the short term.
When IBM's Deep Blue computer won its first game of chess against world champion Garry Kasparov in 1996, the public got a real taste of how powerful computers had become in competing with human intelligence. Since then, not only has computing power grown exponentially, but the cost of processing power has fallen dramatically. These trends, combined with advances in artificial intelligence algorithms, have enabled the development of systems that can, in some instances, perform tasks better than human beings. Video surveillance is one of these tasks, and there is certainly a large market opportunity: despite the massive growth in surveillance and in the storage of video data, there has been little increase in the ability to analyze that video. According to IHS, 127 million surveillance cameras and 400,000 body-worn cameras will ship in 2017, in addition to the estimated 300 million cameras already deployed, and approximately 2.5 exabytes (2.5 billion gigabytes) of data will be created every day.