reactor


Google Sheets can improve your app with machine learning -- here's how

#artificialintelligence

One simple (but powerful) way Semantic ML can help us build natural-language-powered software is through a technique called embeddings. In machine learning, embeddings are a learned way of representing data as points in an n-dimensional space such that the distances between points are meaningful.
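
To make "meaningful distances" concrete, here is a minimal sketch in Python. The four-dimensional vectors and the example sentences are invented purely for illustration; a real sentence encoder produces embeddings with hundreds of dimensions, but the distance computation is the same.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: close to 1.0 means the embeddings point the same way."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional embeddings, invented for illustration; a real sentence
# encoder (e.g. the Universal Sentence Encoder) outputs hundreds of dimensions,
# but the distance logic is identical.
embeddings = {
    "How do I reset my password?":    np.array([0.9, 0.1, 0.0, 0.2]),
    "I forgot my login credentials":  np.array([0.8, 0.2, 0.1, 0.3]),
    "What is the weather today?":     np.array([0.1, 0.9, 0.4, 0.0]),
}

query = embeddings["How do I reset my password?"]
for text, vec in embeddings.items():
    print(f"{cosine_similarity(query, vec):.2f}  {text}")
```

Sentences about the same topic end up with a high cosine similarity, which is exactly the property that semantic search and response ranking build on.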


Neural network accelerates plasma simulations

#artificialintelligence

By combining a deep understanding of plasma physics with machine learning techniques, DIFFER researchers developed a new ultrafast neural network model of the turbulent plasma in a fusion reactor. The neural network can accurately predict heat and particle transport in the fusion reactor up to 100,000 times faster than before: a vital tool to optimize the performance of future fusion power plants. Fusion reactors are fuelled by a plasma: a hot, ionized gas of hydrogen isotopes that fuse together at extreme temperatures to form helium and release clean energy. The behavior of the plasma is not easy to predict: the charged plasma particles respond not only to the magnetic field that keeps them trapped inside the reactor, but also to the electromagnetic fields they create themselves through their own motion. That makes predicting a fusion plasma in order to optimize its state a difficult but rewarding problem to tackle.
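
The article describes the result rather than the method in detail, but the general surrogate-modelling recipe can be sketched as follows. Everything here is an assumption for illustration: the two-parameter input space, the stand-in "simulation", and the network size have nothing to do with DIFFER's actual model, which is trained on real turbulence-code output.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def expensive_simulation(x):
    """Stand-in for a slow turbulence code: maps two plasma parameters
    (say, a temperature gradient and a density gradient) to a heat flux."""
    grad_t, grad_n = x[:, 0], x[:, 1]
    return np.maximum(grad_t - 1.0, 0.0) ** 1.5 + 0.1 * grad_n

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(5000, 2))   # sampled plasma parameters
y = expensive_simulation(X)                 # "ground truth" transport fluxes

# Small feed-forward network trained to reproduce the simulation's output.
surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
surrogate.fit(X, y)

# Evaluating the trained network costs a few matrix multiplications,
# which is where speedups of several orders of magnitude come from.
print(surrogate.predict(np.array([[3.0, 1.0]])))
```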


Google releases Semantic Reactor for natural language understanding experimentation

#artificialintelligence

Google today released Semantic Reactor, a Google Sheets add-on for experimenting with natural language models. The tech giant describes it as a demonstration of how natural language understanding (NLU) can be used with pretrained, generic AI models, as well as a means to dispel intimidation around using machine learning. "Companies are using NLU to create digital personal assistants, customer service bots, and semantic search engines for reviews, forums and the news," wrote Google AI researchers Ben Pietrzak, Steve Pucci, and Aaron Cohen in a blog post. "However, the perception that using NLU and machine learning is costly and time-consuming prevents a lot of potential users from exploring its benefits." Semantic Reactor, then, which is currently a whitelisted experiment in the Google Cloud AI Workshop, allows users to sort lines of text in a sheet using a range of AI models.
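
Under the hood this is the same ranking-by-embedding idea sketched above. A minimal stand-alone sketch, assuming the publicly available Universal Sentence Encoder on TensorFlow Hub (the add-on itself ships its own models inside Sheets), with invented example rows and query:

```python
import numpy as np
import tensorflow_hub as hub

# A publicly available pretrained sentence encoder; the add-on itself runs
# its own models inside Sheets via TensorFlow.js.
encoder = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

rows = [
    "Reset your password from the account settings page.",
    "Our office is closed on public holidays.",
    "Contact support if you cannot log in.",
]
query = "I can't sign in to my account"

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

row_vecs = normalize(np.asarray(encoder(rows)))
query_vec = normalize(np.asarray(encoder([query])))[0]

# With unit-length vectors the dot product is the cosine similarity.
scores = row_vecs @ query_vec
for score, row in sorted(zip(scores, rows), reverse=True):
    print(f"{score:.2f}  {row}")
```

Sorting the rows of a sheet by these scores is, conceptually, what the add-on does when it ranks lines of text against a query.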


Forecasting Industrial Aging Processes with Machine Learning Methods

arXiv.org Machine Learning

By accurately predicting industrial aging processes (IAPs), it is possible to schedule maintenance events further in advance, thereby ensuring a cost-efficient and reliable operation of the plant. So far, these degradation processes have usually been described by mechanistic models or simple empirical prediction models. In this paper, we evaluate a wider range of data-driven models for this task, comparing some traditional stateless models (linear and kernel ridge regression, feed-forward neural networks) to more complex recurrent neural networks (echo state networks and LSTMs). To examine how much historical data is needed to train each of the models, we first evaluate their performance on a synthetic dataset with known dynamics. Next, the models are tested on real-world data from a large-scale chemical plant. Our results show that LSTMs produce near-perfect predictions when trained on a large enough dataset, while linear models may generalize better given small datasets with changing conditions.
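
As a rough illustration of what a "stateless" baseline from the paper looks like in practice, here is a minimal sketch on an invented synthetic degradation signal (not the paper's data): ridge regression over a fixed window of lagged measurements.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Invented degradation signal: slow drift plus noise, reset after maintenance.
rng = np.random.default_rng(1)
t = np.arange(2000)
signal = 0.01 * (t % 500) + 0.05 * rng.standard_normal(len(t))

def lagged_features(series, n_lags=10):
    """A stateless model only sees a fixed window of recent measurements."""
    X = np.column_stack(
        [series[i : len(series) - n_lags + i] for i in range(n_lags)]
    )
    y = series[n_lags:]
    return X, y

X, y = lagged_features(signal)
split = int(0.8 * len(X))
model = Ridge(alpha=1.0).fit(X[:split], y[:split])
print("test MSE:", np.mean((model.predict(X[split:]) - y[split:]) ** 2))
# A recurrent model such as an LSTM would instead carry a learned hidden state
# across the whole history, which is what lets it exploit long records when
# enough training data is available.
```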


Delayed probe of Fukushima No. 1 reactor to push back fuel debris removal

The Japan Times

A plan to remove fuel debris from the primary containment vessel of a reactor at the Fukushima No. 1 nuclear power plant is expected to be further pushed back after it became apparent that Tokyo Electric Power Company Holdings Ltd. will not be able to conduct an internal probe -- a key step to start removing the fuel debris -- by the end of March as planned. The internal probe would involve using remote-controlled robots to collect fuel debris inside the No. 1 reactor so Tepco can examine its composition and form. Tepco's plan is to open three holes in both the outer and inner doors of the primary containment vessel using pressurized water mixed with a polishing agent. After it succeeded in opening three holes in the outer door, Tepco started drilling a hole in the inner door in June 2019. But that procedure caused the concentration of radioactive dust to increase temporarily, prompting staff to suspend work.


Sawtooth Supercomputer Coming to INL's Collaborative Computing Center

#artificialintelligence

IDAHO FALLS, Idaho, Dec. 5, 2019 – A powerful new supercomputer arrived this week at Idaho National Laboratory's Collaborative Computing Center. The machine has the power to run complex modeling and simulation applications, which are essential to developing next-generation nuclear technologies. Named after a central Idaho mountain range, Sawtooth arrives in December and will be available to users early next year. It debuted on the latest TOP500 list of the world's most powerful supercomputers, the highest ranking yet reached by an INL supercomputer. Of the 102 new systems added to the list in the past six months, only three were faster than Sawtooth.


Deep Learning Expands Study Of Nuclear Waste Remediation - Pioneering Minds

#artificialintelligence

A research collaboration has achieved exaflop performance on the Summit supercomputer with a deep learning application used to model subsurface flow in the study of nuclear waste remediation. Their achievement, which will be presented during the "Deep Learning on Supercomputers" workshop at SC19, demonstrates the promise of physics-informed generative adversarial networks (GANs) for analyzing complex, large-scale science problems. The concept of physics-informed GANs is to encode prior information from physics into the neural network. This allows the model to generalize well beyond the training domain, which is important in applications where the conditions can change. GANs have previously been applied to model human face appearance with remarkable accuracy.
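
The workshop paper itself is not quoted here, but the core idea of a physics-informed GAN, adding a physics-based penalty to the generator's objective, can be sketched as follows. The tiny fully connected networks, the mock "conservation law", and the weighting factor lam are all invented for illustration; the actual subsurface-flow models are far larger and encode real flow physics.

```python
import torch
import torch.nn as nn

# Tiny fully connected stand-ins; real subsurface-flow models are
# convolutional and far larger.
G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 32))  # generator
D = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))   # discriminator

def physics_residual(fields):
    """Placeholder physics constraint: penalize violation of a mock
    conservation law ('the 32 generated values should sum to zero')."""
    return fields.sum(dim=1) ** 2

bce = nn.BCEWithLogitsLoss()
lam = 10.0                    # weight of the physics prior (assumed value)
z = torch.randn(8, 16)        # batch of latent noise vectors
fake = G(z)                   # generated "flow fields"

adv_loss = bce(D(fake), torch.ones(8, 1))   # try to fool the discriminator
phys_loss = physics_residual(fake).mean()   # stay consistent with the physics
gen_loss = adv_loss + lam * phys_loss
gen_loss.backward()           # gradients for a generator update
```

In a real training loop the discriminator and generator are updated alternately; the point here is only how the physics residual enters the generator's loss.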


Fukushima farmland that became unusable in 2011 is being converted into wind and solar power plants

Daily Mail - Science & tech

Farmland in Fukushima that was rendered unusable by the disastrous 2011 nuclear meltdown is getting a second chance at productivity. A group of Japanese investors has created a plan to use the abandoned land to build wind and solar power plants that will send electricity to Tokyo. The plan calls for the construction of eleven solar power plants and ten wind power plants, at an estimated cost of $2.75 billion. Fukushima has already been converting other land damaged by the 2011 meltdown, such as a former golf course, into sources of renewable energy. The project is expected to be completed in March 2024 and is backed by a group of investors including the Development Bank of Japan and Mizuho Bank.


Is Data the New Oil?

#artificialintelligence

Data is everywhere, and it's growing constantly. It's used in a multitude of ways, and most of the time we don't even realize it. It is no longer deleted but stored for future reference and analysis, so that it can start to predict your behavior. I don't want to scare you but rather to inform you, so that you are aware when you give your data away, whether on social media or while shopping.


The Tech Innovations We Need to Happen if We're Going to Survive Climate Change

TIME - Tech

In the 1970s, the U.S. Department of Energy poured money into making a miraculous technology practical: the ability to convert sunlight into electricity. Solar energy was a pipe dream, far too expensive and unreliable to be considered a practical power source. But yesterday's moon shot is today's reality. The cost of solar power has fallen more quickly than expected, with installations costing about 80% less today than a decade ago. Alternative energy (like wind and solar) is now often cheaper than conventional energy (like coal and gas).