Following the March 2011 earthquake and tsunami that crippled Japan's Fukushima nuclear plant, Honda reportedly received numerous requests to send its humanoid robot ASIMO to help with the recovery. ASIMO, however, wasn't designed to work outside a lab or office environment, let alone a highly radioactive rubble-strewn zone. Now it looks like Honda is working to address the problem by developing a bigger, beefed-up version of ASIMO that can walk, crawl, and perform tasks in a disaster environment. After the Fukushima accident, many observers were surprised that Japan, a country known for its advanced robots, wasn't better prepared and had to rely on U.S. robots instead. In the months that followed, Japanese government agencies and companies got to work to develop capable disaster-response robots.
The disaster response team of the future could be made up of an army of cheery orange robots, according to Honda. The company has unveiled a working prototype of its E2-DR disaster response robot -- first revealed in an R&D paper in 2015 -- and it can do a lot. Standing 1.68 meters tall and weighing in at 85 kilograms, E2-DR can walk, step over objects, climb stairs and ladders, slink through narrow spaces, and traverse piles of debris. It can even tolerate rain for 20 straight minutes, which is more than a lot of actual people can. To keep E2-DR's size and weight to a minimum, Honda swapped out traditional cables for rigorously tested optical fibers.
Yesterday, NHK (the Japan Broadcasting Corporation) reported that Honda has decided to cancel further development of its flagship humanoid robot, Asimo. A Honda representative who spoke with AFP said, "We will still continue research into humanoid robots, but our future robots may not be named Asimo. We have obtained lots of technologies while developing Asimo, and how to utilize them is one issue." It's not like Honda is abandoning robotics completely, or even abandoning the idea of humanoid robots. Instead, it sounds like the company wants to start focusing on how to apply the technology it has to make robots that don't just promote its brand, but actually help out with things like elder care and disaster relief.
Two years ago at IROS 2015 in Germany, Honda R&D presented a paper on an experimental new humanoid robot designed for disaster response. This wasn't entirely surprising, since we'd guessed that Honda had started working on a humanoid designed to be more robust and practical than Asimo after the Fukushima disaster. But as with most large Japanese companies, Honda does an excellent job of (almost) never communicating about the projects it has under development. Pretty much the only sneak peeks we ever get come from research papers, and last week at IROS 2017 in Vancouver, we got the biggest look inside Honda's humanoid robotics research and development program that we've had in years. In a paper entitled "Development of Experimental Legged Robot for Inspection and Disaster Response in Plants," roboticists from Honda R&D showed off the latest prototype of their disaster relief robot, the E2-DR.
Artificial intelligence (AI) that reasons like a human remains elusive, but Honda hopes to make inroads. The Tokyo company's U.S.-based Research Institute today announced a collaboration with three academic institutions -- the Massachusetts Institute of Technology (MIT), the University of Pennsylvania (Penn), and the University of Washington -- to advance the field of artificial cognition. MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), in partnership with Penn's School of Engineering and Applied Science and the University of Washington's Paul G. Allen School of Computer Science & Engineering, will develop prototypes, working examples, and demonstrations of what Honda calls the "mechanisms of curiosity." Specifically, MIT CSAIL will focus its efforts on systems capable of predicting future percepts -- concepts developed as a consequence of perception -- and the effect of future actions, while Penn's engineering department and the Paul G. Allen School will develop perception models informed by biology and robots that can work safely in human environments. Grants will fund the first leg of research.