Large, expensive, compute-intensive research initiatives have historically promoted high-performance computing (HPC) in the wealthiest countries, most notably the U.S., Europe, Japan, and China. The exponential impact of the Internet and of artificial intelligence (AI) has pushed HPC to a new level, affecting economies and societies worldwide. Latin America is no exception. Nevertheless, the use of HPC in science has affected the countries of the region unevenly. Since the first edition of the TOP500 list of the world's most powerful supercomputing systems in 1993, only Mexico and Brazil (with 18 appearances each) have made the list with research-oriented supercomputers.
Eight technologies developed by MIT Lincoln Laboratory researchers, either wholly or in collaboration with researchers from other organizations, were among the winners of the 2020 R&D 100 Awards. Presented annually since 1963, these international R&D awards recognize the 100 technologies that a panel of expert judges selects as the most revolutionary of the past year. Six of the laboratory's winning technologies are software systems, a number of which take advantage of artificial intelligence techniques. The software technologies are solutions to difficulties inherent in analyzing large volumes of data and to problems in maintaining cybersecurity. Another technology is a process designed to ensure secure fabrication of integrated circuits, and the eighth winner is an optical communications technology that may enable future space missions to transmit error-free data to Earth at significantly higher rates than currently possible.
On November 28, 2019, the European Parliament declared a global climate and environmental emergency. They say all politics is local, and across the world climate change seems to be coming home to roost. In the hills around San Francisco, the bankrupt PG&E power company preemptively shut off power to homes for several days, worried that its ageing electrical equipment would act as a match to the parched trees and vegetation. In Europe, extreme flooding has immersed ancient towns in apocalyptic scenes. In Australia, it was hard to discern the iconic Sydney Opera House through the smoke from the raging bush fires.
The Department of Energy's first artificial intelligence director is currently reviewing more than 600 AI projects across its agencies to identify "critical" technologies worth advancing and replicating. Earlier this month, Cheryl Ingstad was named head of DOE's new Artificial Intelligence and Technology Office (AITO), which is intended to serve as the department's coordinating agency, prioritizing resources for AI projects. The Trump administration proposed funding AITO at $5 million in fiscal 2021 -- up from $2.5 million the previous fiscal year -- but the office will also tap into other agencies' funds. "They have program and project resources available," Ingstad told FedScoop in an interview. Energy has 17 national laboratories developing and applying AI to power generation, cybersecurity, national security, and the acceleration of scientific discovery.
This paper surveys the field of transfer learning in the problem setting of Reinforcement Learning (RL). RL has been a key solution to sequential decision-making problems. Along with the fast advance of RL in various domains, including robotics and game playing, transfer learning has arisen as an important technique to assist RL by leveraging external expertise to boost the learning process. In this survey, we review the central issues of transfer learning in the RL domain, providing a systematic categorization of its state-of-the-art techniques. We analyze their goals, methodologies, applications, and the RL frameworks under which these transfer learning techniques are applicable. We discuss the relationship between transfer learning and other relevant topics from an RL perspective and also explore potential challenges and future directions for transfer learning in RL.
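The survey's core idea, reusing knowledge from a source task to speed learning in a target task, can be illustrated with a minimal sketch that is not any specific method from the survey: tabular Q-learning on toy chain MDPs, where Q-values learned on a short chain warm-start learning on a longer one. The environments, the state-mapping heuristic, and all names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_chain(n):
    """Toy chain MDP: states 0..n-1, actions {0: left, 1: right}, reward 1 at the last state."""
    def step(s, a):
        s2 = min(n - 1, s + 1) if a == 1 else max(0, s - 1)
        return s2, float(s2 == n - 1), s2 == n - 1
    return step

def greedy(qrow):
    """Argmax with random tie-breaking, so untrained states explore instead of sticking left."""
    return int(rng.choice(np.flatnonzero(qrow == qrow.max())))

def q_learning(step, n_states, q=None, episodes=300, alpha=0.2, gamma=0.95, eps=0.1):
    """Tabular Q-learning; pass a warm-started `q` to transfer knowledge from a source task."""
    if q is None:
        q = np.zeros((n_states, 2))
    for _ in range(episodes):
        s, done, t = 0, False, 0
        while not done and t < 100:
            a = int(rng.integers(2)) if rng.random() < eps else greedy(q[s])
            s2, r, done = step(s, a)
            q[s, a] += alpha * (r + gamma * np.max(q[s2]) * (not done) - q[s, a])
            s, t = s2, t + 1
    return q

# Source task: a short chain, learned from scratch.
src_n, tgt_n = 6, 10
q_src = q_learning(make_chain(src_n), src_n)

# Target task: a longer chain. Warm-start from the source (the transfer step).
q_init = np.zeros((tgt_n, 2))
q_init[:src_n - 1] = q_src[:src_n - 1]   # reuse values for the overlapping states
q_init[src_n - 1:] = q_src[src_n - 2]    # seed new states with the nearest learned values (crude mapping)
q_tgt = q_learning(make_chain(tgt_n), tgt_n, q=q_init, episodes=100)
```

The warm start lets the target agent act sensibly from the first episode, which is the essence of policy/value reuse; real transfer methods replace the crude state mapping here with learned inter-task mappings or shared representations.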
Non-Intrusive Load Monitoring (NILM) is a field of research focused on disaggregating the constituent electrical loads in a system from their aggregated signal alone. Significant computational resources and research time are spent training models, often using as much data as possible, perhaps driven by the preconception that more data equates to more accurate models and better-performing algorithms. When has enough prior training been done? When has a NILM algorithm encountered new, unseen data? This work applies the notion of Bayesian surprise to answer these questions, which are important for both supervised and unsupervised algorithms. We quantify the degree of surprise between the predictive distributions before and after a window of observations (termed postdictive surprise), as well as between the transitional probabilities (termed transitional surprise). We compare the performance of several benchmark NILM algorithms supported by NILMTK in order to establish a useful threshold on the two combined measures of surprise. We validate the use of transitional surprise by exploring the performance of a popular Hidden Markov Model as a function of the surprise threshold. Finally, we explore the use of a surprise threshold as a regularization technique to avoid overfitting in cross-dataset performance. Although the generality of the specific surprise threshold discussed herein may be suspect without further testing, this work provides clear evidence that a point of diminishing returns exists for model performance with respect to dataset size. This has implications for future model development and dataset acquisition, as well as for maintaining model flexibility during deployment.
Recently, tremendous interest has been devoted to developing data fusion strategies for energy efficiency in buildings, where various kinds of information can be processed. However, applying the appropriate data fusion strategy to design an efficient energy efficiency system is not straightforward; it requires a priori knowledge of existing fusion strategies, their applications and their properties. In this regard, seeking to give the energy research community a better understanding of data fusion strategies in building energy saving systems, their principles, advantages, and potential applications, this paper proposes an extensive survey of existing data fusion mechanisms deployed to reduce excessive consumption and promote sustainability. We investigate their conceptualizations, advantages, challenges and drawbacks, and present a taxonomy of existing data fusion strategies and other contributing factors. Next, a comprehensive comparison of state-of-the-art data fusion based energy efficiency frameworks is conducted using various parameters, including data fusion level, data fusion techniques, behavioral change influencer, behavioral change incentive, recorded data, platform architecture, IoT technology and application scenario. Moreover, a novel method for electrical appliance identification is proposed based on the fusion of 2D local texture descriptors, where 1D power signals are transformed into 2D space and treated as images. The empirical evaluation, conducted on three real datasets, shows promising performance, with up to 99.68% accuracy and a 99.52% F1 score attained. In addition, various open research challenges and future directions for improving data fusion based energy efficiency ecosystems are explored.
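The appliance-identification method above maps 1D power signals into 2D "images" before applying texture descriptors. One common 1D-to-2D encoding is the Gramian Angular Summation Field, sketched below; the abstract does not specify which transform the paper uses, so treat this as an illustrative stand-in, with the sine signal standing in for a real power trace.

```python
import numpy as np

def gramian_angular_field(x):
    """Encode a 1D signal as a 2D image via the Gramian Angular Summation Field:
    rescale to [-1, 1], map each sample to an angle, and form G[i, j] = cos(phi_i + phi_j)."""
    x = np.asarray(x, dtype=float)
    x = 2 * (x - x.min()) / (x.max() - x.min() + 1e-12) - 1   # rescale so arccos is defined
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    return np.cos(phi[:, None] + phi[None, :])

signal = np.sin(np.linspace(0, 4 * np.pi, 64))   # stand-in for a 64-sample power trace
img = gramian_angular_field(signal)
print(img.shape)  # (64, 64)
```

The resulting matrix preserves temporal correlations as spatial texture, so image descriptors (such as local binary patterns) can then be computed on it and fused, in the spirit of the 2D-descriptor fusion the paper proposes.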
Lightning was a factor in many of these fires. But past blazes, including the 2018 Camp Fire that destroyed the town of Paradise, Calif., were started by faulty transmission equipment. In that case, a worn piece of metal that holds power lines, known as a C-hook, broke and dropped a high-voltage electric line that ignited that fire. In June, PG&E Corp., parent company of Pacific Gas and Electric Co., pleaded guilty to 84 counts of involuntary manslaughter for its role in sparking that fire.
To help make the process faster and cheaper, Ford recently sought the help of a four-legged robot dog made by Boston Dynamics, a subsidiary of Japanese conglomerate SoftBank Group Corp. "It's a huge breakthrough for us," said Mark Goderis, digital engineering manager at Ford. The robot, nicknamed "Fluffy" by one of Ford's digital engineers, weighs 70 pounds and is equipped with five cameras that give it 360-degree vision, letting it observe what's in front of it and avoid obstacles. It can climb stairs and stabilize itself on slippery surfaces and metal grates using optimization algorithms. It can also access hard-to-reach areas within the plant, as long as they are at least 2 feet wide. The robot dog, officially called "Spot" by Boston Dynamics, costs $74,500.
California is becoming a poster child for the risks utilities face from climate change, from power lines starting wildfires to heat waves pushing increasingly renewable-powered grids to the brink of collapse. But utilities around the world face similar risks as they seek to decarbonize their generation fleets and make their grids more resilient to extreme weather events that are becoming more severe and more common. While the costs of mitigating those risks are hard to quantify, they are likely much smaller than the costs of doing nothing and facing the alternatives. We're seeing this calculation reflected in many ways, from massive asset manager BlackRock's decision to move away from investments in coal and other global-warming-causing industries, to the maintenance and planning failures behind the power-line-sparked wildfires that forced Pacific Gas & Electric into bankruptcy last year. Data -- the lifeblood of investors, insurers and other professional calculators of risk -- can help utilities better identify these climate-change challenges and optimize their methods of mitigating them.