The ability to forecast events at scale, given a set of variables, is something most companies would find useful. So Amazon is aiming to make prediction more accessible with a fully managed service called Forecast that uses AI and machine learning to deliver highly accurate forecasts. As Amazon explained in a press release, Forecast -- which is based on the same technology the Seattle company uses to anticipate demand for hundreds of millions of products every day -- can be used to build precise forecasts for virtually any business condition, including product demand and sales, infrastructure requirements, energy needs, and staffing levels. It automatically provisions the necessary cloud infrastructure and processes data, building custom AI models hosted on AWS without requiring an ounce of machine learning experience on the part of developers. Amazon says the service's API or console lets the average person build custom machine learning models in fewer than five clicks and, in as little as a few hours, reach accuracy levels that would normally take months to achieve.
The 'AI Apocalypse' might kill humanity before any actual robot uprising. You can think of artificial intelligence (AI) in the same way you think about cloud computing, if you think about either of them through an environmental lens: an enormous and growing source of carbon emissions, with the very real potential to choke out humans' ability to breathe clean air long before a sentient and ornery AI goes all Skynet on us. At the moment, data centers -- the enormous rooms full of stacks and stacks of servers that juggle dank memes, fire tweets, your vitally important Google docs, and all the other data that is stored somewhere other than on your phone or home computer -- use about 2% of the world's electricity. Of that, servers that run AI -- processing all the data and making the decisions and computations that a machine mimicking a human brain must handle in order to achieve "deep learning" -- use about 0.1% of the world's electricity, according to a recent MIT Technology Review article. The likelihood that figure will grow, it turns out, is quite good.
Japan has told the United States it is ready to provide its robot technology for use in dismantling nuclear and uranium enrichment facilities in North Korea as Washington and Pyongyang pursue further denuclearization talks, government sources said Friday. As Japan turns to the remotely controlled robots it has developed to decommission reactors crippled by the triple core meltdown in 2011 at the Fukushima No. 1 power plant, it believes the same technology can be used in North Korea, according to the sources. The offer is part of Japan's efforts to make its own contribution to the denuclearization talks amid concern that Tokyo could be left out of the loop as the United States and North Korea step up diplomacy. Tokyo has already told Washington it would shoulder part of the costs of any International Atomic Energy Agency inspections of North Korean facilities and dispatch its own nuclear experts to help. The scrapping of nuclear facilities, such as the Yongbyon complex, which has a graphite-moderated reactor, will come into focus in forthcoming working-level talks between Washington and Pyongyang.
A team at the US National Renewable Energy Laboratory (NREL) is working on autonomous energy grid (AEG) technology to ensure the electricity grid of the future can manage a growing base of intelligent energy devices, variable renewable energy, and advanced controls. "The future grid will be much more distributed, too complex to control with today's techniques and technologies," said Benjamin Kroposki, director of NREL's Power Systems Engineering Center. "We need a path to get there -- to reach the potential of all these new technologies integrating into the power system." The AEG effort envisions a self-driving power system: a very "aware" network of technologies and distributed controls that work together to efficiently match bidirectional energy supply to energy demand. This is a hard pivot from today's system, in which centralized control manages one-way electricity flows to consumers along power lines that radiate out from central generators.
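The matching of supply to demand by distributed controls can be sketched in miniature. The following is a hypothetical Python loop, not NREL's actual AEG algorithms: each generator nudges its own output toward the observed imbalance, with no central dispatcher deciding the final split.

```python
# Hypothetical sketch of decentralized supply-demand balancing, the core idea
# behind an autonomous energy grid. Illustrative only, not NREL's AEG scheme.
def balance(supplies, demand, rate=0.2, steps=200):
    """Each generator nudges its output toward the shared imbalance signal."""
    supplies = list(supplies)
    for _ in range(steps):
        imbalance = demand - sum(supplies)  # positive means undersupply
        n = len(supplies)
        for i in range(n):
            supplies[i] += rate * imbalance / n  # local correction
    return supplies

out = balance([10.0, 20.0, 5.0], demand=50.0)
print(round(sum(out), 3))  # -> 50.0 (total supply converges to demand)
```

In a real grid each node would sense the imbalance locally, for instance through frequency deviation, rather than computing a global sum.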
Unlike task-specific algorithms, Deep Learning is part of the Machine Learning family based on learning data representations. With massive amounts of computational power, machines can now recognize objects and translate speech in real time, enabling smarter artificial intelligence in systems. The concept of software simulating the neocortex's large array of neurons in an artificial neural network is decades old, and it has led to as many disappointments as breakthroughs. But because of improvements in mathematical formulas and increasingly powerful computers, today researchers and data scientists can model many more layers of virtual neurons than ever before. "Recent improvements in Deep Learning have reignited some of the grand challenges in Artificial Intelligence."
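The "layers of virtual neurons" idea can be illustrated with a minimal plain-Python sketch: each layer computes weighted sums of its inputs passed through a nonlinearity, and stacking layers produces a deeper network. All weights and sizes here are illustrative, and the network is untrained.

```python
import math
import random

random.seed(0)

def layer(inputs, weights, biases):
    """One layer of 'virtual neurons': weighted sums through a nonlinearity."""
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(x, layers):
    """Stack layers; more layers means a deeper representation of the input."""
    for weights, biases in layers:
        x = layer(x, weights, biases)
    return x

# Random, untrained weights: 3 inputs -> 4 hidden neurons -> 2 outputs.
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
w2 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
y = forward([0.5, -0.2, 0.8], [(w1, [0.0] * 4), (w2, [0.0] * 2)])
print(len(y))  # -> 2
```

Training would adjust the weights by gradient descent; the forward pass above is what the "increasingly powerful computers" now run across many more layers than before.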
Thanks to a new partnership with the Alan Turing Institute, National Grid Electricity System Operator (ESO) announced it has developed new AI prediction models that have improved solar forecasting by one-third. Knowing how much power will be flowing into the grid on any given day is becoming increasingly crucial as the proportion of intermittent renewable power serving the grid goes up. Rob Rome, commercial operations manager at the ESO, said the new forecast models mean the power system can become much more efficient at managing supply and demand. "Improved solar forecasts will help us run the system more efficiently, ultimately meaning lower bills for consumers," he said. National Grid worked with researchers and doctoral students at the Institute to develop the improved forecasting models.
Microsoft and digital energy management and automation solution provider Schneider Electric have partnered to launch AI for Green Energy, a new accelerator programme for Microsoft's AI Factory. Through the programme, Microsoft and Schneider will help start-ups use artificial intelligence (AI) to transform the energy sector in Europe, decreasing consumption and increasing energy efficiency. These entrepreneurs will be able to learn from the technical and business expertise of the two companies during a three-month acceleration period. "We are delighted to leverage our ecosystem of partners to serve the most important causes to society, thanks to the start-ups of tomorrow," said Agnès Van de Walle, director of Microsoft's One Commercial Partner group. "Schneider Electric will bring in-depth expertise and personalised support, accelerating innovation across the energy sector."
The promise of big data and artificial intelligence is everywhere. And so, it seems, are the results. One almost gets the impression that there is no problem that cannot be solved with these new technologies. The answer to everything is 'big data and artificial intelligence'. Open a web browser and you see advertising tuned to your latest online shopping.
Management and efficient operation of critical infrastructure such as Smart Grids benefit greatly from accurate power load forecasting which, due to its nonlinear nature, remains a challenging task. Recently, deep learning has emerged in the machine learning field, achieving impressive performance in a vast range of tasks, from image classification to machine translation. Applications of deep learning models to the electric load forecasting problem are gaining interest among researchers as well as industry, but a comprehensive and sound comparison among different architectures is not yet available in the literature. This work aims to fill that gap by reviewing, and experimentally evaluating on two real-world datasets, the most recent trends in electric load forecasting, contrasting deep learning architectures on short-term forecasts (one-day-ahead prediction). Specifically, we focus on feedforward and recurrent neural networks, sequence-to-sequence models, and temporal convolutional neural networks, along with architectural variants that are well known in the signal processing community but novel to load forecasting.
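A useful point of reference for what any such architecture must beat is the seasonal-naive baseline common in short-term load forecasting: predict that each hour tomorrow will equal the same hour today. A minimal sketch on synthetic data follows; the series, horizon, and metric are illustrative stand-ins, not the paper's datasets or models.

```python
import math

# Synthetic hourly load with daily and weekly cycles, standing in for a
# real-world load dataset.
load = [100 + 30 * math.sin(2 * math.pi * h / 24)
            + 5 * math.sin(2 * math.pi * h / 168)
        for h in range(24 * 14)]

def seasonal_naive(series, horizon=24):
    """One-day-ahead baseline: repeat the same hour from the previous day."""
    return series[-horizon:]

def mape(actual, forecast):
    """Mean absolute percentage error, a standard load-forecasting metric."""
    return 100 * sum(abs(a - f) / abs(a)
                     for a, f in zip(actual, forecast)) / len(actual)

history, actual = load[:-24], load[-24:]
err = mape(actual, seasonal_naive(history))
print(round(err, 2))  # small but nonzero: the weekly cycle breaks the repeat
```

Deep architectures earn their complexity only to the extent that they beat simple baselines like this on held-out data.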
The failure of a complex and safety-critical industrial asset can have extremely high consequences. Close monitoring for early detection of abnormal system conditions is therefore required. Data-driven solutions to this problem have been limited for two reasons. First, safety-critical assets are designed and maintained to be highly reliable, so faults are rare; fault detection thus cannot be approached as a supervised learning problem. Second, complex industrial systems usually have long lifetimes and face very different operating conditions, so collecting a representative training dataset would require long observation periods and delay the monitoring of the system. In this paper, we propose a methodology to monitor systems in their early life. To do so, we enhance the training dataset with other units from a fleet for which longer observations are available. Since each unit has its own specificity, we propose to extract features made independent of their origin by three unsupervised feature alignment techniques. First, using a variational encoder, we impose a shared latent space for both units. Second, we introduce a new loss designed to preserve inter-point spatial relationships between the input and latent spaces. Last, we propose to train, in an adversarial manner, a discriminator on the origin of the features. Once aligned, the features are fed to a one-class classifier to monitor the health of the system. By exploring the different combinations of the proposed alignment strategies and testing them on a real case study, a fleet of 112 power plants operated in different geographical locations and under very different operating regimes, we demonstrate that this alignment is necessary and beneficial.
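The final stage of such a pipeline, a one-class classifier fitted only on healthy data, can be illustrated with a toy sketch. This z-score monitor is an illustrative stand-in, not the paper's actual classifier, and the alignment stages (variational encoder, spatial-relationship loss, adversarial discriminator) are omitted entirely.

```python
import statistics

class OneClassMonitor:
    """Toy one-class health monitor: learn per-feature statistics from
    healthy data only, then flag points whose z-score is too large.
    Illustrative stand-in for the one-class classifier in the pipeline."""

    def __init__(self, threshold=3.0):
        self.threshold = threshold

    def fit(self, healthy):
        cols = list(zip(*healthy))                    # feature columns
        self.mean = [statistics.mean(c) for c in cols]
        self.std = [statistics.stdev(c) or 1.0 for c in cols]
        return self

    def is_abnormal(self, x):
        """Abnormal if any feature deviates beyond the z-score threshold."""
        z = max(abs(v - m) / s for v, m, s in zip(x, self.mean, self.std))
        return z > self.threshold

# Fit on healthy observations from a unit's early life.
healthy = [[1.0 + 0.01 * i, 2.0 - 0.01 * i] for i in range(30)]
mon = OneClassMonitor().fit(healthy)
print(mon.is_abnormal([1.15, 1.85]))  # -> False (within healthy range)
print(mon.is_abnormal([5.0, -3.0]))   # -> True  (far from healthy data)
```

The point of the alignment stages in the paper is precisely to make features from different fleet units comparable enough that a single such classifier can be trained across them.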