If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Google today released Semantic Reactor, a Google Sheets add-on for experimenting with natural language models. The tech giant describes it as a demonstration of how natural language understanding (NLU) can be used with pretrained, generic AI models, as well as a means to dispel intimidation around using machine learning. "Companies are using NLU to create digital personal assistants, customer service bots, and semantic search engines for reviews, forums and the news," wrote Google AI researchers Ben Pietrzak, Steve Pucci, and Aaron Cohen in a blog post. "However, the perception that using NLU and machine learning is costly and time-consuming prevents a lot of potential users from exploring its benefits." Semantic Reactor, then, which is currently a whitelisted experiment in the Google Cloud AI Workshop, allows users to sort lines of text in a sheet using a range of AI models.
By accurately predicting industrial aging processes (IAPs), it is possible to schedule maintenance events further in advance, thereby ensuring a cost-efficient and reliable operation of the plant. To date, these degradation processes have usually been described by mechanistic models or simple empirical prediction models. In this paper, we evaluate a wider range of data-driven models for this task, comparing some traditional stateless models (linear and kernel ridge regression, feed-forward neural networks) to more complex recurrent neural networks (echo state networks and LSTMs). To examine how much historical data is needed to train each of the models, we first examine their performance on a synthetic dataset with known dynamics. Next, the models are tested on real-world data from a large-scale chemical plant. Our results show that LSTMs produce near-perfect predictions when trained on a large enough dataset, while linear models may generalize better given small datasets with changing conditions.
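As a toy illustration of the stateless baselines mentioned above, the sketch below fits a closed-form ridge regression on lagged values of a synthetic "aging" signal. The signal, window size, and regularization strength are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical synthetic degradation signal: slow drift plus a
# periodic component and noise, loosely mimicking a dataset with
# known dynamics.
rng = np.random.default_rng(0)
t = np.arange(500)
y = 0.01 * t + 0.5 * np.sin(t / 20) + 0.1 * rng.standard_normal(t.size)

def make_lagged(series, window):
    # Build (sample, lag-window) pairs for one-step-ahead prediction.
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    return X, series[window:]

window, lam = 10, 1e-2
X, target = make_lagged(y, window)
split = 400
Xtr, ytr, Xte, yte = X[:split], target[:split], X[split:], target[split:]

# Closed-form ridge solution: w = (X^T X + lam I)^{-1} X^T y
w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(window), Xtr.T @ ytr)
rmse = np.sqrt(np.mean((Xte @ w - yte) ** 2))
print(round(rmse, 3))
```

Even this stateless linear baseline tracks a slowly drifting signal reasonably well, which is consistent with the abstract's point that simple models can hold up on small or shifting datasets.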
A plan to remove fuel debris from the primary containment vessel of a reactor at the Fukushima No. 1 nuclear power plant is expected to be further pushed back after it became apparent that Tokyo Electric Power Company Holdings Ltd. will not be able to conduct an internal probe -- a key step to start removing the fuel debris -- by the end of March as planned. The internal probe would involve using remote-controlled robots to collect fuel debris inside the No. 1 reactor so Tepco can examine its composition and form. Tepco's plan is to open three holes in both the outer and inner doors of the primary containment vessel using pressurized water mixed with a polishing agent. After it succeeded in opening three holes in the outer door, Tepco started drilling a hole in the inner door in June 2019. But that procedure caused the concentration of radioactive dust to increase temporarily, prompting staff to suspend work.
Wildlife is flourishing in the exclusion zone around the disabled Fukushima Daiichi nuclear reactor in Japan, images from remotely operated cameras have revealed. Researchers spotted more than 20 species in areas around the reactor, including wild boar, macaques and fox-like raccoon dogs. The findings help reveal how wildlife populations respond in the wake of catastrophic nuclear disasters like those that occurred at Fukushima and Chernobyl. Humans were evacuated from certain zones around the Fukushima reactor following radiation leaks caused by the Tōhoku earthquake and tsunami of 2011. Wildlife ecologist James Beasley of the University of Georgia, in the US, and colleagues used a network of 106 remote cameras to capture images of the wildlife in the area around the Fukushima Daiichi power plant over a four-month period.
A research collaboration has achieved exaflop performance on the Summit supercomputer with a deep learning application used to model subsurface flow in the study of nuclear waste remediation. Their achievement, which will be presented during the "Deep Learning on Supercomputers" workshop at SC19, demonstrates the promise of physics-informed generative adversarial networks (GANs) for analyzing complex, large-scale science problems. The idea behind physics-informed GANs is to encode prior knowledge from physics into the neural network. This allows the model to generalize well beyond the training domain, which is important in applications where conditions can change. GANs have previously been applied to modeling human face appearance with remarkable accuracy.
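The physics-informed idea can be sketched as a penalty term added to the ordinary adversarial loss: samples that violate a known physical law are penalized even if they fool the discriminator. The conservation constraint, the `lam` weight, and the toy data below are hypothetical illustrations, not the loss actually used in the Summit application.

```python
import numpy as np

def physics_informed_loss(adv_loss, generated, total_mass=1.0, lam=10.0):
    """Toy physics-informed generator loss.

    Adds a penalty for violating a (hypothetical) conservation
    constraint: each generated flow field should sum to total_mass.
    """
    residual = generated.sum(axis=1) - total_mass  # constraint residual per sample
    physics_penalty = np.mean(residual ** 2)       # mean squared violation
    return adv_loss + lam * physics_penalty

# Samples that satisfy the constraint exactly incur no penalty.
samples = np.full((4, 8), 1.0 / 8)  # each row sums to 1.0
print(physics_informed_loss(0.5, samples))  # → 0.5
```

In a real physics-informed GAN the residual would come from a discretized governing equation (e.g. a PDE for subsurface flow) rather than a simple sum, but the structure of the loss is the same.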
Human brains are extremely energy-efficient: when a person thinks in a concentrated manner, his or her brain consumes a mere 21 watts of electricity. But an AI system doing the same degree of intensive thinking requires over 10,000 times more. If AI adoption continues at that scale, the international competitiveness of businesses will depend on the supply and cost of electricity in their home country. How, then, does Japan stand with regard to power supply and cost?
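The figures above imply a simple lower bound worth working out: 21 watts times a factor of 10,000 is 210 kilowatts per brain-equivalent workload.

```python
# Lower-bound estimate from the figures quoted above.
brain_watts = 21        # power draw of a concentrating human brain
factor = 10_000         # "over 10,000 times more" for comparable AI
ai_watts = brain_watts * factor
print(ai_watts / 1000)  # → 210.0 (kilowatts)
```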
Data is everywhere, and it's growing constantly. It's used in a multitude of ways, and most of the time we don't even realize it. It is no longer being deleted but stored for future reference and analysis, so it can start to predict you! I don't want to scare you but rather inform you, so that you are aware when you give your data away, whether on social media or while shopping.
In the 1970s, the U.S. Department of Energy poured money into making practical a miraculous technology: the ability to convert sunlight into electricity. Solar energy was a pipe dream, far too expensive and unreliable to be considered a practical power source. But yesterday's moon shot is today's reality. The expense of solar power has fallen more quickly than expected, with installations costing about 80% less today than a decade ago. Alternative energy (like wind and solar) is now often cheaper than conventional energy (like coal and gas).
In computer chip manufacturing, the study of etch patterns on silicon wafers, or metrology, occurs on the nanoscale and is therefore subject to large variation from small, yet significant, perturbations in the manufacturing environment. An enormous amount of information can be gathered from a single etch process, the sequence of actions taken to produce an etched wafer from a blank piece of silicon. Each final wafer, however, is costly to take measurements from, which limits the number of examples available to train a predictive model; part of the significance of this work is the success the models achieved despite that limitation. To accommodate the high-dimensional process signatures, we isolated important sensor variables and applied domain-specific summarization to the data using multiple feature engineering techniques. We used a neural network architecture consisting of the summarized inputs, a single hidden layer of 4032 units, and an output layer of one unit. Because the two metrology measurements in the dataset, Recess and Remaining Mask, are only abstractly related and do not form a two-dimensional output space, a separate model was learned for each. Our results approach the error tolerance of the microscopic imaging system. The model can make predictions for any etch recipe with the correct number of etch steps, run in plasma reactors (chambers containing an ionized gas that determine the manufacturing environment) equipped with the appropriate sensors. Notably, thanks to the summarization techniques used, the method is not restricted to some maximum process length, so it can be adapted to new processes that satisfy the aforementioned requirements. In order to automate semiconductor manufacturing, models like these will be needed throughout the process to evaluate production quality.
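A minimal sketch of the architecture described above: summarized inputs feeding a single hidden layer of 4032 units and a one-unit output, with a separate model per metrology target. The input dimension, ReLU activation, and random weights are assumptions for illustration; the summarization step that produces the fixed-length feature vector is taken as given.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden = 64, 4032  # feature count is a placeholder; 4032 is from the text

def init_model():
    # Small random weights stand in for a trained model.
    return {
        "W1": rng.standard_normal((n_features, n_hidden)) * 0.01,
        "b1": np.zeros(n_hidden),
        "W2": rng.standard_normal((n_hidden, 1)) * 0.01,
        "b2": np.zeros(1),
    }

def predict(model, x):
    # One hidden layer (ReLU assumed) followed by a single linear output unit.
    h = np.maximum(0.0, x @ model["W1"] + model["b1"])
    return (h @ model["W2"] + model["b2"]).item()

# Two separate models, one per metrology measurement, as in the text.
models = {name: init_model() for name in ("Recess", "Remaining Mask")}
x = rng.standard_normal(n_features)  # a summarized process signature
preds = {name: predict(m, x) for name, m in models.items()}
print(sorted(preds))  # → ['Recess', 'Remaining Mask']
```

Keeping the two targets in separate single-output networks mirrors the text's rationale: since the outputs are only abstractly related, nothing forces them into a shared two-dimensional output layer.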
Japan has told the United States it is ready to provide its robot technology for use in dismantling nuclear and uranium enrichment facilities in North Korea as Washington and Pyongyang pursue further denuclearization talks, government sources said Friday. As Japan turns to the remotely controlled robots it has developed to decommission reactors crippled by the triple core meltdown in 2011 at the Fukushima No. 1 power plant, it believes the same technology can be used in North Korea, according to the sources. The offer is part of Japan's efforts to make its own contribution to the denuclearization talks amid concern that Tokyo could be left out of the loop as the United States and North Korea step up diplomacy. Tokyo has already told Washington it would shoulder part of the costs of any International Atomic Energy Agency inspections of North Korean facilities and dispatch its own nuclear experts to help. The scrapping of nuclear facilities, such as the Yongbyon complex, which has a graphite-moderated reactor, will come into focus in forthcoming working-level talks between Washington and Pyongyang.