Nuclear


David Carpenter: Purpose driven to the core

MIT News

When he first reported to MIT's Nuclear Reactor Laboratory (NRL) as an undergraduate in 2002, David Carpenter anticipated a challenging research opportunity. After 15 years at the NRL conducting research and earning degrees in nuclear science and engineering, Carpenter's appetite for scientific discovery remains sharp, as does his commitment to improving both the performance and safety of current and next-generation nuclear reactors. Among the next-generation concepts he studies are molten-salt reactors. "The design is intrinsically safe because the fuel doesn't melt, and the salt can withstand high temperatures without requiring thick, pressurized containment buildings," he says. The challenge in designing this new kind of reactor is finding suitable construction materials, since super-hot radioactive salt is highly corrosive.


Dr Rustam Stolkin and robots that learn: Nuclear robotics meets machine learning

Robohub

To find out more, we talked to Dr Rustam Stolkin, Royal Society Industry Fellow for Nuclear Robotics, Professor of Robotics at the University of Birmingham, and Director at A.R.M Robotics Ltd, about his work combining machine learning and robotics to create practical solutions to nuclear problems. "My work focuses on developing advanced robotics technologies for nuclear decommissioning: demolishing legacy nuclear sites and safely packaging, storing and monitoring any radiological or contaminated materials," he explains. These sites present an enormous diversity of scenes, materials, and objects that nuclear decommissioning robots must interact with in complex ways, such as by grasping and cutting objects. Using robots also reduces "secondary nuclear waste": for every container filled with actual primary nuclear waste, more than ten containers are filled with contaminated plastic suits, respirators, rubber gloves, and other "secondary waste" generated by human entries.


The Risk and Reward of Artificial Intelligence

#artificialintelligence

In studying the Three Mile Island nuclear accident, Yale sociologist Charles Perrow concluded that conventional engineering approaches to ensuring safety – building in more warnings and safeguards – will always fail in the face of increasing system complexity. He called the accident a "normal accident." By that logic, the Chernobyl accident in 1986, the Space Shuttle Columbia disaster in 2003, the 2008 financial crisis and the Fukushima Daiichi nuclear disaster in 2011 were also perfectly normal. We just don't know when or how the next black swan will show up.


Renault-Nissan developing a fleet of self-driving EVs

Engadget

The Renault-Nissan Alliance has teamed up with transport operator Transdev to develop a fleet of self-driving vehicles "for future public and on-demand transportation," it said in a press release. The project will kick off with autonomous field testing of Europe's most popular EV, the 250-mile-range Renault Zoe. The vehicles will be tested initially at Paris-Saclay, a public and private research campus and university south of Paris. With their open spaces and access to research facilities, university campuses have been popular spots for autonomous cars -- the University of Michigan even created a fake city to test them.


Artificial Intelligence is shaping the future of Energy - Open Energi

#artificialintelligence

In the UK, large fossil-fuelled power stations are being replaced by increasing levels of widely distributed wind and solar generation. We have spent the last six years working with some of the UK's leading companies to manage their flexible demand in real time and help balance electricity supply and demand UK-wide. Using artificial intelligence and machine learning, we can find creative ways to reschedule the power consumption of many assets in synchrony, helping National Grid to balance the system while minimising what that power costs energy users. Artificial intelligence can help us unlock this demand-side flexibility and build an electricity system fit for the future: one which cuts consumer bills, integrates renewable energy efficiently, and secures our energy supplies for generations to come.
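
The rescheduling Open Energi describes can be illustrated with a toy example. The sketch below is not Open Energi's actual system: it simply places each deferrable load's run into the cheapest feasible half-hour slots of a price signal, subject to the asset's flexibility window. The asset names, prices and parameters are all invented for illustration.

```python
# Hypothetical sketch of demand-side load shifting -- not Open Energi's system.
# Each asset must run for `duration` consecutive half-hour slots within its
# flexibility window; we greedily pick the cheapest feasible start for each.

from dataclasses import dataclass

@dataclass
class FlexibleAsset:
    name: str
    power_kw: float   # constant draw while running
    duration: int     # half-hour slots the asset must run for
    earliest: int     # first slot in which it may start
    latest: int       # slot by which it must have finished (exclusive)

def schedule(assets, prices):
    """Place each asset's run where its summed energy cost is lowest."""
    plan = {}
    for a in assets:
        best_start, best_cost = None, float("inf")
        for start in range(a.earliest, a.latest - a.duration + 1):
            # kWh per slot = kW * 0.5 h; cost = sum of slot prices * kWh
            cost = sum(prices[start:start + a.duration]) * a.power_kw * 0.5
            if cost < best_cost:
                best_start, best_cost = start, cost
        plan[a.name] = (best_start, best_cost)
    return plan

# Illustrative half-hourly prices (GBP/kWh) and two deferrable loads.
prices = [0.12, 0.10, 0.08, 0.07, 0.09, 0.15, 0.22, 0.18]
assets = [
    FlexibleAsset("water_heater", power_kw=3.0, duration=2, earliest=0, latest=6),
    FlexibleAsset("cold_store", power_kw=5.0, duration=3, earliest=1, latest=8),
]
for name, (start, cost) in schedule(assets, prices).items():
    print(f"{name}: start slot {start}, cost {cost:.2f} GBP")
```

A real system would additionally respond to live National Grid balancing signals rather than a fixed price vector, and would coordinate assets so they do not all pile into the same cheap slot; this greedy per-asset version ignores both.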


Modest Debut of Atlas May Foreshadow Age of 'Robo Sapiens'

AITopics Original Links

Walking on two legs, robots like Atlas have the potential to serve as department store guides, assist the elderly with daily tasks or carry out nuclear power plant rescue operations. "Two weeks ago 19 brave firefighters lost their lives," said Gill Pratt, a program manager at the Defense Advanced Research Projects Agency, the Pentagon agency that oversaw Atlas's design and financing. Even before the Fukushima disaster, Marvin Minsky, a pioneer in artificial intelligence research, castigated the nuclear power agency for being unprepared for disasters.


Are Japanese robots losing their edge to Silicon Valley?

AITopics Original Links

Sebastian Thrun, the former Stanford University professor who helped to develop Google's driverless car, questions whether there will ever be a big market for such robots, though he says: "I think it's fascinating to see how people react to it and tease out deeper truths about humanity." The US is "by far the world leader" at the moment in autonomous robots, according to Melonee Wise, chief executive of Fetch Robotics, a Silicon Valley start-up. Japan's contenders include Preferred Networks, a machine learning company which is working to apply AI to robots and autonomous driving systems and has teamed up with powerful manufacturers such as Fanuc and Panasonic. "It takes effort to link hardware, software, and network technologies," said Toru Nishikawa, Preferred Networks' chief executive.


Advantages of Synthetic Noise and Machine Learning for Analyzing Radioecological Data Sets

#artificialintelligence

We propose that analysis of small radioecological data sets by generalized linear models (GLMs) and/or machine learning can be made more informative by using the following techniques: (1) adding synthetic noise variables to provide benchmarks for distinguishing the performances of valuable predictors from irrelevant ones; (2) adding noise directly to the predictors and/or to the outcome to test the robustness of analysis results against random data fluctuations; (3) adding artificial effects to selected predictors to test the sensitivity of the analysis methods in detecting predictor effects; (4) running a selected machine learning method multiple times (with different random-number seeds) to test the robustness of the detected "signal"; (5) using several machine learning methods to test the "signal's" sensitivity to differences in analysis techniques. Here, we applied these approaches to simulated data and to two published examples of small radioecological data sets: (I) counts of fungal taxa in samples of soil contaminated by the Chernobyl nuclear power plant accident (Ukraine), and (II) bacterial abundance in soil samples under a ruptured nuclear waste storage tank (USA). Specifically, our approach identified a negative effect of radioactive contamination in data set I, and suggested that in data set II stable chromium could have been a stronger limiting factor for bacterial abundance than the radionuclides ¹³⁷Cs and ⁹⁹Tc. This new information, which was extracted from these data sets using the proposed techniques, can potentially enhance the design of radioactive waste bioremediation.
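
Technique (1), combined with the repeated-seed robustness check of technique (4), can be sketched generically as follows. This is an illustration on simulated data, assuming scikit-learn is available; it is not the authors' code, and the predictor names are invented.

```python
# Generic sketch of techniques (1) and (4): add pure-noise predictors as
# importance benchmarks, refit a machine learning model over many random
# seeds, and keep only predictors that consistently beat the best noise
# variable. Simulated data; not the authors' implementation.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 60                                    # deliberately small, as in radioecology
X_real = rng.normal(size=(n, 3))          # invented predictors
y = 2.0 * X_real[:, 0] + rng.normal(size=n)   # only the first predictor matters

X_noise = rng.normal(size=(n, 3))         # technique (1): synthetic noise variables
X = np.hstack([X_real, X_noise])
names = ["contamination", "pH", "moisture", "noise1", "noise2", "noise3"]

importances = []
for seed in range(20):                    # technique (4): repeat over seeds
    model = RandomForestRegressor(n_estimators=300, random_state=seed).fit(X, y)
    importances.append(model.feature_importances_)
mean_imp = np.mean(importances, axis=0)

noise_ceiling = mean_imp[3:].max()        # best pure-noise variable sets the bar
for name, imp in zip(names, mean_imp):
    verdict = "signal" if imp > noise_ceiling else "indistinguishable from noise"
    print(f"{name:13s} {imp:.3f}  ({verdict})")
```

On this simulated example only "contamination" should clear the noise ceiling; a predictor that cannot outperform variables known to be pure noise is not worth interpreting, which is exactly the benchmark the abstract proposes.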