A small robot is roving around a massive U.S. nuclear waste site to gather critical samples of potential air and water contamination after an emergency was declared Tuesday. The machine was deployed after a tunnel storing rail cars filled with radioactive waste partially collapsed at the Hanford Nuclear Reservation in Washington state. The mishap raised fears of a radiation leak at the nation's most contaminated nuclear site, though officials said there was no indication of a plutonium release as of 2:20 p.m. PDT. The air- and soil-sampling robot is monitoring the scene for any changes in contamination levels.
Few people ever need to deal with a stricken nuclear reactor, but that skill could turn out to be important for the evolution of smarter robots. In Pomona, California, this week, 25 of the world's most advanced humanoid robots will take part in a contest inspired by the challenge of stabilizing a nuclear reactor that's leaking dangerous radioactive material. Teams from universities across the U.S., as well as Japan, China, and Europe, are bringing robots that will try to walk across piles of rubble, climb ladders, operate power tools, and drive buggies, among other chores. Each challenge is inspired by something that might have helped stabilize the Fukushima Daiichi nuclear plant in Japan after it was damaged by an earthquake and tsunami in 2011. Considerable academic kudos will go to whichever team completes the most tasks within the allotted time.
How do you bring a bad drone down? New kinds of drones that can fly autonomously can't be stopped with traditional techniques, the US Air Force has warned. It's put out a call for ideas to yank drones right out of the sky. Millions of drones are sold worldwide each year. Most are flown for fun, but a few have been put to criminal use: carrying cameras to bedroom windows, flying into secure airspace over nuclear power stations, and smuggling contraband into prisons.
Lawrence Livermore National Laboratory (LLNL) has purchased a new brain-inspired supercomputing platform developed by International Business Machines Corp (NYSE:IBM). Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses while consuming only the energy equivalent of a tablet computer. The brain-like, neural network design of the IBM neuromorphic system is able to run complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips. LLNL will receive a 16-chip TrueNorth system representing a total of 16 million neurons and 4 billion synapses. The new system will be used to explore new computing capabilities important to the National Nuclear Security Administration (NNSA) missions in cybersecurity, stewardship of the nation's nuclear weapons stockpile and nonproliferation.
Lawrence Livermore National Laboratory (LLNL) today announced it will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power. The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips. The new system will be used to explore new computing capabilities important to the National Nuclear Security Administration's (NNSA) missions in cybersecurity, stewardship of the nation's nuclear weapons stockpile and nonproliferation. NNSA's Advanced Simulation and Computing (ASC) program will evaluate machine-learning applications, deep-learning algorithms and architectures and conduct general computing feasibility studies.
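The articles describe TrueNorth's brain-like design only at a high level. As a rough illustration of the style of computation a neuromorphic chip performs, here is a minimal sketch of a leaky integrate-and-fire spiking neuron, the kind of unit such hardware implements in silicon. All parameter values are illustrative assumptions, not IBM's actual neuron model:

```python
def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Minimal leaky integrate-and-fire neuron: the membrane potential
    decays ("leaks") each step, integrates incoming current, and emits
    a spike (1) when it crosses the threshold, then resets.
    Illustrative only; parameters do not reflect the TrueNorth chip."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current
        if potential >= threshold:
            spikes.append(1)
            potential = reset
        else:
            spikes.append(0)
    return spikes

# A constant drive accumulates until the neuron fires, then resets.
print(simulate_lif([0.4] * 8))  # → [0, 0, 1, 0, 0, 1, 0, 0]
```

Because each neuron only does work when spikes arrive, large networks of such units can run at very low power, which is the property the LLNL announcement emphasizes.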
A new low-power, "brain-inspired" supercomputing platform based on IBM chip technology will soon start exploring deep learning for the U.S. nuclear program. Lawrence Livermore National Laboratory announced on Tuesday that it has purchased the platform, based on the TrueNorth neurosynaptic chip IBM introduced in 2014. It will use the technology to evaluate machine-learning and deep-learning applications for the National Nuclear Security Administration. The computer will process data with the equivalent of 16 million neurons and 4 billion synapses and consume roughly as much energy as a tablet PC. Also included will be an accompanying ecosystem consisting of a simulator; a programming language; an integrated programming environment; a library of algorithms and applications; firmware; tools for composing neural networks for deep learning; a teaching curriculum; and cloud enablement.
Atlas, the humanoid robot created by Alphabet (GOOGL) subsidiary Boston Dynamics, can open doors, balance while walking through the snow, place objects on a shelf and pick itself up after being knocked down. The new version of Atlas is smaller and more nimble than its predecessor. It's fully mobile too -- the previous version had to be tethered to a computer. Atlas was created to perform disaster recovery in places unsafe for humans, such as damaged nuclear power plants. The robot made its debut in 2013 during a competition held by the Defense Advanced Research Projects Agency.
Real-time domains present a new and challenging environment for the application of knowledge-based problem-solving techniques. However, substantial research is still needed on many difficult problems before real-time expert systems can enhance current monitoring and control systems. In this article, we examine how the real-time problem domain differs significantly from the domains traditionally addressed by expert systems. We survey the current state of the art in applying knowledge-based systems to real-time problems and describe the key issues pertinent to a real-time domain. The survey is divided into three areas: applications, tools, and theoretic issues. From the results of the survey, we identify a set of real-time research issues that have yet to be solved and point out limitations of current tools for real-time problems. Finally, we propose a set of requirements that a real-time knowledge-based system must satisfy.
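The abstract does not spell out its proposed requirements, but one requirement any real-time knowledge-based system must satisfy is a bounded response time: the system must return its best conclusions by a deadline rather than block on exhaustive inference. A minimal sketch of that idea, using hypothetical monitoring rules (the rule names and sensor facts are invented for illustration):

```python
import time

def evaluate_rules(facts, rules, deadline_s):
    """Anytime rule evaluation: fire as many rules as possible before
    the deadline elapses, returning partial conclusions instead of
    blocking. An illustrative sketch of the bounded-response-time
    requirement, not a full knowledge-based system."""
    start = time.monotonic()
    conclusions = []
    for condition, action in rules:
        if time.monotonic() - start > deadline_s:
            break  # deadline reached: return what we have so far
        if condition(facts):
            conclusions.append(action)
    return conclusions

# Hypothetical monitoring rules over sensor readings.
rules = [
    (lambda f: f["temp_c"] > 90, "raise_temperature_alarm"),
    (lambda f: f["pressure_kpa"] < 50, "raise_pressure_alarm"),
]
print(evaluate_rules({"temp_c": 95, "pressure_kpa": 80}, rules,
                     deadline_s=0.01))  # → ['raise_temperature_alarm']
```

Checking the clock before each rule (rather than after all rules) is what turns an ordinary forward-chaining loop into one with a hard upper bound on response latency, at the cost of possibly incomplete inference.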