How can we create robots that can carry out important tasks in dangerous environments? Machine learning is supporting advances in the field of robotics. To find out more, we talked to Dr Rustam Stolkin, Royal Society Industry Fellow for Nuclear Robotics, Professor of Robotics at the University of Birmingham, and Director at A.R.M Robotics Ltd, about his work combining machine learning and robotics to create practical solutions to nuclear problems.
In studying the Three Mile Island nuclear accident, Yale sociologist Charles Perrow concluded that conventional engineering approaches to ensuring safety – building in more warnings and safeguards – will always fail in the face of increasing system complexity. He called the event a "normal accident." By the same logic, the Chernobyl accident in 1986, the Space Shuttle Columbia disaster in 2003, the 2008 financial crisis and the Fukushima Daiichi nuclear disaster in 2011 were all, in Perrow's sense, perfectly normal. We simply don't know when or how the next black swan will show up.
French people love to drive, but with private radar companies set to give out way more speeding tickets, they may be willing to let machines take the wheel. Luckily, the Renault-Nissan Alliance has teamed with a company called Transdev to develop a fleet of self-driving vehicles "for future public and on-demand transportation," it said in a press release. The project will kick off with autonomous field testing of Europe's most popular EV, the 250-mile-range Renault Zoe.
Across the globe, energy systems are changing, creating unprecedented challenges for the organisations tasked with ensuring the lights stay on. In the UK, large fossil-fuelled power stations are being replaced by increasing levels of widely distributed wind and solar generation. This renewable power is clean and free at the point of use, but it cannot always be relied upon. To date, National Grid has managed this intermittency by keeping polluting power stations online to make up the difference, but artificial intelligence offers an alternative approach.
"People love the wizards in 'Harry Potter' or 'Lord of the Rings,' but this is real," said Gary Bradski, a Silicon Valley artificial intelligence specialist and a co-founder of Industrial Perception Inc., a company that is building a robot able to load and unload trucks. "A new species, Robo sapiens, are emerging," he said.
Following the March 2011 earthquake and tsunami that crippled Japan's Fukushima nuclear plant, Honda reportedly received numerous requests to send its humanoid robot ASIMO to help with the recovery. ASIMO, however, wasn't designed to work outside a lab or office environment, let alone a highly radioactive rubble-strewn zone. Now it looks like Honda is working to address the problem by developing a bigger, beefed-up version of ASIMO that can walk, crawl, and perform tasks in a disaster environment.
We propose that analysis of small radioecological data sets by GLMs and/or machine learning can be made more informative by using the following techniques: (1) adding synthetic noise variables to provide benchmarks for distinguishing the performances of valuable predictors from irrelevant ones; (2) adding noise directly to the predictors and/or to the outcome to test the robustness of analysis results against random data fluctuations; (3) adding artificial effects to selected predictors to test the sensitivity of the analysis methods in detecting predictor effects; (4) running a selected machine learning method multiple times (with different random-number seeds) to test the robustness of the detected "signal"; (5) using several machine learning methods to test the "signal's" sensitivity to differences in analysis techniques. Here, we applied these approaches to simulated data, and to two published examples of small radioecological data sets: (I) counts of fungal taxa in samples of soil contaminated by the Chernobyl nuclear power plant accident (Ukraine), and (II) bacterial abundance in soil samples under a ruptured nuclear waste storage tank (USA). Specifically, our approach identified a negative effect of radioactive contamination in data set I, and suggested that in data set II stable chromium could have been a stronger limiting factor for bacterial abundance than the radionuclides 137Cs and 99Tc. This new information, which was extracted from these data sets using the proposed techniques, can potentially enhance the design of radioactive waste bioremediation.
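As a rough illustration of techniques (1) and (4) above – synthetic noise variables as benchmarks, repeated over several random seeds – the following minimal sketch scores each real predictor by how often its association with the outcome beats the strongest of several freshly generated noise columns. This is not the authors' actual GLM/machine-learning pipeline; the function name, the use of absolute Pearson correlation as the importance measure, and the simulated data are all illustrative assumptions.

```python
import numpy as np

def noise_benchmark(X, y, n_noise=5, n_seeds=10):
    """Score predictors against synthetic noise benchmarks.

    For each random seed, append n_noise standard-normal noise columns
    to X (technique 1), then record which real predictors have a larger
    absolute Pearson correlation with y than the best noise column.
    Repeating over seeds (technique 4) tests robustness of the "signal".
    Returns, per predictor, the fraction of seeds in which it beat noise.
    """
    n, p = X.shape
    wins = np.zeros(p)
    for seed in range(n_seeds):
        rng = np.random.default_rng(seed)
        noise = rng.standard_normal((n, n_noise))
        Xa = np.hstack([X, noise])
        r = np.abs([np.corrcoef(Xa[:, j], y)[0, 1] for j in range(p + n_noise)])
        wins += r[:p] > r[p:].max()  # beat the strongest noise column?
    return wins / n_seeds

# Tiny simulated example in the spirit of a small radioecological data set:
# one informative predictor (e.g. a contamination level) and one irrelevant one.
rng = np.random.default_rng(0)
n = 30
x_real = rng.standard_normal(n)                   # informative predictor
x_junk = rng.standard_normal(n)                   # irrelevant covariate
y = 2.0 * x_real + 0.5 * rng.standard_normal(n)   # outcome, e.g. taxon counts
scores = noise_benchmark(np.column_stack([x_real, x_junk]), y)
print(scores)  # informative predictor should beat noise in nearly all seeds
```

The same scaffold extends naturally to techniques (2) and (3): perturb `X` or `y` with additional noise before scoring, or inject a known artificial effect into one column and check that the method detects it.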
It's been over five years since disaster struck at the Fukushima Daiichi Nuclear Power Plant, but fear of the event's long-term effects is still present, as is the memory of the faulty response on the part of government and corporate entities. Future nuclear incidents might be prevented by avoiding dangerous energy sources altogether; however, it is impossible to prevent other, non-nuclear disasters from striking vulnerable populations. The WAREC-1 robot is designed to navigate a disaster area through unique movements.