A small robot is here to help after a mishap at a major nuclear waste site


A small robot is roving around a massive U.S. nuclear waste site to gather critical samples of potential air and water contamination after an emergency was declared Tuesday. The machine was deployed after a tunnel storing rail cars filled with radioactive waste partially collapsed at the Hanford Nuclear Reservation in Washington state. The collapse raised fears of a radiation leak at the nation's most contaminated nuclear site, though officials said there was no indication of a release of plutonium radiation as of 2:20 p.m. PDT. The air- and soil-sampling robot is now monitoring the scene for any changes in contamination levels.

How do Systems Manage Their Adaptive Capacity to Successfully Handle Disruptions? A Resilience Engineering Perspective

AAAI Conferences

A large body of research describes the importance of adaptability for systems to be resilient in the face of disruptions. However, adaptive processes can be fallible, either because systems fail to adapt in situations requiring new ways of functioning, or because the adaptations themselves produce undesired consequences. A central question is then: how can systems better manage their capacity to adapt to perturbations, and so constitute intelligent adaptive systems? Based on studies conducted in different high-risk domains (healthcare, mission control, military operations, urban firefighting), we have identified three basic patterns of adaptive failure, or traps: (1) decompensation – when a system exhausts its capacity to adapt as disturbances and challenges cascade; (2) working at cross-purposes – when sub-systems or roles exhibit behaviors that are locally adaptive but globally maladaptive; (3) getting stuck in outdated behaviors – when a system over-relies on past successes even as its operating conditions change. Identifying such basic patterns then suggests how a work organization, as an example of a complex adaptive system, needs to behave in order to see and avoid, or recognize and escape, the corresponding failures. The paper presents how expert practitioners exhibit such resilient behaviors in high-risk situations, and how adverse events can occur when systems fail to do so. We also explore how various research efforts related to complex adaptive systems provide fruitful directions for advancing both the necessary theoretical work and the development of concrete solutions for improving systems' resilience.
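The first failure pattern, decompensation, can be made concrete with a toy model: a system holds a finite reserve of adaptive capacity, each disturbance consumes some of it, and the system recovers a little between disturbances. This is purely an illustrative sketch, not a model from the paper; the function name, the numbers, and the capacity cap are all assumptions chosen to show how a cascade of growing disturbances can outpace recovery until reserves are exhausted.

```python
# Toy illustration (not from the paper): decompensation as the exhaustion of
# adaptive capacity when disturbances cascade faster than the system recovers.
# All names and numbers here are illustrative assumptions.

def simulate(capacity, recovery_rate, disturbances, cap=10.0):
    """Track remaining adaptive capacity as each disturbance consumes some of
    it; between disturbances the system recovers a fixed amount, up to a cap."""
    history = []
    for demand in disturbances:
        capacity = min(capacity + recovery_rate, cap)  # partial recovery, capped
        capacity -= demand                             # disturbance consumes capacity
        history.append(capacity)
        if capacity <= 0:
            break  # decompensation: reserves exhausted, no capacity left to adapt
    return history

# A cascade of growing disturbances outpaces the fixed recovery rate.
cascade = [1, 2, 3, 4, 5, 6]
trace = simulate(capacity=10.0, recovery_rate=1.0, disturbances=cascade)
print(trace)  # capacity declines until it crosses zero before the cascade ends
```

In this sketch the system copes easily with early, small disturbances, but because each recovery step is fixed while demands grow, the reserve is drawn down faster than it is replenished and collapses before the cascade finishes.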