If you are looking for an answer to the question "What is Artificial Intelligence?" and you have only a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Earlier this month, a self-driving shuttle in Las Vegas patiently waited as a delivery truck backed up, then backed up some more, then backed right into it. Inconveniently for the roboshuttle's developer, Navya, this happened within hours of the shuttle's inauguration ceremony. The real problem is that the shuttle can't learn from the incident the way a human would: immediately, and without forgetting how to do everything else in the process. The U.S. Defense Advanced Research Projects Agency (DARPA) is looking to change the way AI works through a program it calls L2M, or Lifelong Learning Machines. The agency wants systems that learn continuously, adapt to new tasks, and know what to learn and when.
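The failure mode DARPA is targeting is often called catastrophic forgetting: when a network trained on one task is then trained on a second, the new gradients overwrite what was learned first. Here's a toy sketch of the problem and of one simple mitigation, anchoring the new weights to the old solution (a crude stand-in for techniques like elastic weight consolidation; the data, model, and hyperparameters below are all invented for illustration):

```python
import numpy as np

# Toy illustration of catastrophic forgetting: a linear regressor is
# trained on task A, then on task B. Naive sequential training wipes
# out task A; pulling the weights toward the old solution during
# task-B training retains much more of it.

rng = np.random.default_rng(0)

def make_task(true_w):
    X = rng.normal(size=(200, 2))
    return X, X @ true_w

def train(X, y, w, anchor=None, lam=0.0, lr=0.1, steps=500):
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(X)
        if anchor is not None:
            grad += lam * (w - anchor)   # pull toward the old weights
        w = w - lr * grad
    return w

def loss(X, y, w):
    return float(np.mean((X @ w - y) ** 2))

Xa, ya = make_task(np.array([1.0, 0.0]))   # task A depends on feature 0
Xb, yb = make_task(np.array([0.0, 1.0]))   # task B depends on feature 1

w = train(Xa, ya, np.zeros(2))                    # learn task A
w_naive = train(Xb, yb, w)                        # naive: forgets task A
w_anchored = train(Xb, yb, w, anchor=w, lam=1.0)  # regularized update

print("task A loss, naive   :", round(loss(Xa, ya, w_naive), 3))
print("task A loss, anchored:", round(loss(Xa, ya, w_anchored), 3))
```

The anchored model trades a little task-B accuracy for far less forgetting of task A, which is the basic tension any lifelong-learning system has to manage.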
Based on conversations we've had with iRobot CEO Colin Angle, we're expecting that within the next six months or so, robot vacuums will be able to understand our homes on a much more sophisticated and useful level than ever before. Specifically, they'll be able to generate maps that persist between cleaning sessions, and these maps will allow the robots to identify and remember specific rooms and adjust their cleaning behavior accordingly. For example, if your robot vacuum knows where your kitchen is, it can respond to commands like "Go clean the kitchen," or autonomously clean there as often as it needs to. At IROS in September, we got a bit of a sneak peek into how iRobot is going to make this happen, and how much of a difference it can make to the speed and efficiency of home navigation. It's a big difference, and it can even work on your older (and affordable) Roomba that has only bump sensors on it.
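To make concrete what a persistent map buys you, consider this toy sketch (the map format, file name, and room regions are all invented for illustration; this is not iRobot's actual representation): labeled regions saved in one session let a later session resolve a room-directed command to a concrete target area.

```python
import json, os, tempfile

# Invented map format: room name -> bounding box [x0, y0, x1, y1]
# in map coordinates. Persisting it lets a later session interpret
# a command like "Go clean the kitchen" without remapping the home.

def save_map(path, rooms):
    with open(path, "w") as f:
        json.dump(rooms, f)

def load_map(path):
    with open(path) as f:
        return json.load(f)

def target_for(command, rooms):
    for name, region in rooms.items():
        if name in command.lower():
            return region
    return None   # unknown room: fall back to whole-home cleaning

# Session 1: the robot maps the home and labels two rooms.
path = os.path.join(tempfile.gettempdir(), "robot_map.json")
save_map(path, {"kitchen": [0, 0, 4, 3], "bedroom": [4, 0, 8, 3]})

# Session 2, later: the map persists, so the command resolves.
rooms = load_map(path)
print(target_for("Go clean the kitchen", rooms))   # → [0, 0, 4, 3]
```

Without persistence, every session starts from a blank map and "kitchen" means nothing; with it, the robot can both take room-level commands and schedule rooms independently.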
The deadliest animal on Earth, by far, is the mosquito. Millions of people die annually from mosquito-borne illnesses, and many of those diseases can't be cured with drugs. It's best to avoid being bitten in the first place, but this is becoming more difficult as the insects expand their range, migrating north with warming climates. For decades, government agencies and nonprofit organizations have tried to prevent the spread of mosquito-borne diseases in developing countries by spraying large areas with insecticides. But that process is expensive, especially as mosquitoes develop resistance to commonly used chemicals.
Stanford researchers have developed a machine-learning algorithm that can diagnose pneumonia from a chest X-ray better than a human radiologist can. And it learned how to do so in just about a month. The Machine Learning Group, led by Stanford adjunct professor Andrew Ng, was inspired by a data set released by the National Institutes of Health on 26 September. The data set contains 112,120 chest X-ray images labeled with 14 different possible diagnoses, along with some preliminary algorithms. The researchers asked four Stanford radiologists to annotate 420 of the images for possible indications of pneumonia.
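The article doesn't detail how the comparison was scored, but model-versus-radiologist comparisons on a binary label like pneumonia are commonly reported with the F1 score, the harmonic mean of precision and recall. A minimal, self-contained sketch (the annotation and prediction vectors below are invented, not study data):

```python
# F1 score for binary labels: harmonic mean of precision and recall.
def f1_score(y_true, y_pred):
    tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))  # true positives
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

reference = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical expert annotations
model     = [1, 0, 1, 0, 0, 1, 1, 0]   # hypothetical model predictions

print(f1_score(reference, model))   # → 0.75
```

F1 is a natural choice here because pneumonia-positive images are a minority of the data set, so plain accuracy would reward a model that rarely says "pneumonia" at all.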
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. With a title like "What's new, Atlas?" for a video like this, you know that Boston Dynamics is just messing with us now: The game played out on a real set wherein Cozmo would roll through a series of trials, exploring rooms and solving puzzles that tested his ability to move; to place, stack, and turn blocks; and to recognize faces and pets, all while testing Reddit's collective will to help him. Cozmo's quest: to gather three golden key cubes to be able to escape to Reddit's front page.
It is with much rejoicing that today we can share that one of our favorite robotics startups, Dash Robotics, is acquiring another of our favorite robotics startups, Bots Alive. Usually, we don't cover acquisitions, or when we do, it's with resigned skepticism--all too often, one company gets completely swallowed by another, and the things that made them unique and exciting simply vanish. The sense that we get from talking with Dash Robotics' CEO Nick Kohut and Bots Alive founder Brad Knox is that the amazing things that Bots Alive does fit right in with the equally amazing but totally different things that Dash Robotics does, and that together, they'll be able to come up with some totally cool (and totally affordable) robotic toys with sophisticated personalities built right in. Part of the reason that we're fans of Dash Robotics and Bots Alive is that they're both successful examples of taking robotics research and turning it directly into a compelling product. Dash Robotics turned UC Berkeley's DASH pop-up hexapod robot into a skittery and blisteringly fast toy called Kamigami that's now being sold in partnership with Mattel for US $50, while Bots Alive's software runs on your phone and gives a $20 Hexbug more brains and personality than an enthusiastic and mildly well-trained puppy.
The tech industry can't hide from the information war, particularly when its own creations are being weaponized. That was the consensus of a panel at the Techonomy17 conference in Half Moon Bay, Calif., last week. The group assembled to discuss the meaning of authority in a networked, artificially intelligent world. The panelists quickly zoomed in on the manipulation of Facebook, Google, and other sites by Russians during the U.S. presidential election. They, as well as several other speakers at the conference, painted a dark picture of our current online world for at least the immediate future; they concluded that preventing such manipulation is not going to be easy.
This week, the first meeting of the Convention on Conventional Weapons (CCW) Group of Governmental Experts on lethal autonomous weapons systems is taking place at the United Nations in Geneva. Organizations like the Campaign to Stop Killer Robots are encouraging the UN to move forward on international regulation of autonomous weapons, which is great, because talking about how these issues will shape the future of robotics and society is a very important thing. Over the weekend, however, I came across a video that struck me as a disturbing contribution to the autonomous weapons debate. The video, called "Slaughterbots" and produced with support from Elon Musk's Future of Life Institute, combines graphic violence with just enough technical plausibility to imagine a very bleak scenario: A fictional near future in which autonomous explosive-carrying microdrones are killing thousands of people around the world. We are not going to embed the video here because it contains a number of violent scenes, including a terrorist attack in a classroom (!).
Today, a Chinese manufacturer and a venture backed by the Bill & Melinda Gates Foundation will announce plans to commercialize a microscope that uses deep learning algorithms to automatically identify and count malaria parasites in a blood smear within 20 minutes. AI-powered microscopes could speed up diagnosis and standardize detection of malaria at a time when the mosquito-borne disease kills almost half a million people per year. An experimental version of the AI-powered microscope has already shown that it can detect malaria parasites well enough to meet the highest World Health Organization microscopy standard, known as competence level 1. That rating means that it performs on par with well-trained microscopists, although the researchers note that some expert microscopists can still outperform the automated system. That previous research, presented at the International Conference on Computer Vision in October, has inspired the Global Good Fund--a partnership between the company Intellectual Ventures and the Bill & Melinda Gates Foundation--and a Chinese microscope manufacturer called Motic to take the next big commercialization step.
Takahiro Nozaki and colleagues at the Faculty of Science and Technology and the Haptics Research Center at Keio University have developed a haptic avatar robot with a General Purpose Arm (GPA) that transmits sound, vision, movement, and, importantly, a highly sensitive sense of touch (force/tactile transmission) to a remotely located user in real time. "This 'real-haptics' is an integral part of the Internet of Actions (IoA) technology, having applications in manufacturing, agriculture, medicine, and nursing care," says Nozaki.