energy


Compositional Visual Generation with Energy Based Models

Neural Information Processing Systems

A vital aspect of human intelligence is the ability to compose increasingly complex concepts out of simpler ideas, enabling both rapid learning and adaptation of knowledge. In this paper we show that energy-based models can exhibit this ability by directly combining probability distributions. Samples from the combined distribution correspond to compositions of concepts. For example, given a distribution for smiling faces, and another for male faces, we can combine them to generate smiling male faces. This allows us to generate natural images that simultaneously satisfy conjunctions, disjunctions, and negations of concepts. We evaluate compositional generation abilities of our model on the CelebA dataset of natural faces and synthetic 3D scene images. We also demonstrate other unique advantages of our model, such as the ability to continually learn and incorporate new concepts, or infer compositions of concept properties underlying an image.
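The core compositional operation described here, conjunction as a product of concept distributions, i.e. a sum of their energies, can be sketched on a toy 2-D example. The quadratic energies and Langevin hyperparameters below are illustrative stand-ins, not the paper's learned image models:

```python
import numpy as np

# Toy stand-ins for two learned concept energies (the paper uses deep
# nets over images; here each concept is a quadratic bowl in 2-D).
def e_concept_a(x):  # low energy near (1, 0)
    return np.sum((x - np.array([1.0, 0.0])) ** 2)

def e_concept_b(x):  # low energy near (0, 1)
    return np.sum((x - np.array([0.0, 1.0])) ** 2)

def e_conjunction(x):
    # Conjunction of concepts: product of densities = sum of energies.
    return e_concept_a(x) + e_concept_b(x)

def grad(e, x, eps=1e-4):
    # Finite-difference gradient (for illustration only).
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (e(x + d) - e(x - d)) / (2 * eps)
    return g

def langevin_sample(e, steps=200, lr=0.05, noise=0.01, seed=0):
    # Noisy gradient descent on the energy (Langevin dynamics).
    rng = np.random.default_rng(seed)
    x = rng.normal(size=2)
    for _ in range(steps):
        x = x - lr * grad(e, x) + noise * rng.normal(size=2)
    return x

x = langevin_sample(e_conjunction)
# The sample settles near (0.5, 0.5), the minimum of the summed energies.
```

Disjunction and negation follow the same pattern with different combinations of the energies (e.g. a softmin over energies for disjunction, a negatively weighted energy for negation).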


Arbitrary Conditional Distributions with Energy

Neural Information Processing Systems

Modeling distributions of covariates, or density estimation, is a core challenge in unsupervised learning. However, the majority of work only considers the joint distribution, which has limited relevance to practical situations. A more general and useful problem is arbitrary conditional density estimation, which aims to model any possible conditional distribution over a set of covariates, reflecting the more realistic setting of inference based on prior knowledge. We propose a novel method, Arbitrary Conditioning with Energy (ACE), that can simultaneously estimate the distribution $p(\mathbf{x}_u \mid \mathbf{x}_o)$ for all possible subsets of unobserved features $\mathbf{x}_u$ and observed features $\mathbf{x}_o$. ACE is designed to avoid unnecessary bias and complexity --- we specify densities with a highly expressive energy function and reduce the problem to only learning one-dimensional conditionals (from which more complex distributions can be recovered during inference). This results in an approach that is both simpler and higher-performing than prior methods. We show that ACE achieves state-of-the-art for arbitrary conditional likelihood estimation and data imputation on standard benchmarks.
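The arbitrary-conditioning setup can be illustrated with the standard mask-and-concatenate input encoding, a simplified assumption here; ACE's actual method adds the energy function and one-dimensional conditionals on top of an encoding of this kind:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 5))  # batch of 4 examples, 5 features

# Sample a random observation mask per example: 1 = observed, 0 = unobserved.
# Training over random masks is what lets a single model handle every
# conditional p(x_u | x_o); the Bernoulli(0.5) scheme is an assumption.
mask = (rng.uniform(size=x.shape) < 0.5).astype(x.dtype)

# Zero out unobserved values and append the mask, so the model can
# distinguish an observed zero from a missing value.
model_input = np.concatenate([x * mask, mask], axis=1)  # shape (4, 10)
```

At inference time, any conditioning pattern is expressed by choosing the mask, so one trained model serves all `2^d` conditionals.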


Forecasting Monthly Residential Natural Gas Demand Using Just-In-Time-Learning Modeling

Alakent, Burak, Isikli, Erkan, Kadaifci, Cigdem, Taspinar, Tonguc S.

arXiv.org Machine Learning

ABSTRACT Natural gas (NG) is a relatively clean source of energy, particularly compared to other fossil fuels, and worldwide consumption of NG has increased almost linearly over the last two decades. A similar trend can be seen in Turkey, as can another similarity: a high dependence on imports for a continuous NG supply. It is crucial to accurately forecast future NG demand (NGD) in Turkey, especially for import contracts; in this respect, forecasts of monthly NGD for the following year are of utmost importance. In the current study, historical monthly NG consumption data between 2014 and 2024, provided by SOCAR, the local residential NG distribution company for two cities in Turkey, Bursa and Kayseri, was used to produce out-of-sample monthly NGD forecasts for a period of one year and nine months using various time series models, including SARIMA and ETS models, and a novel proposed machine learning method. The proposed method, named Just-in-Time-Learning Gaussian Process Regression (JITL-GPR), uses a novel feature representation for the past NG demand values: instead of treating past demand values as separate column-wise features, it places them on a two-dimensional (2-D) grid of year-month values. For each test point, a kernel function tailored for NGD prediction is used in GPR to predict the query point. Since a model is constructed separately for each test point, the proposed method is indeed an example of JITL. The JITL-GPR method is easy to use and optimize, and offers a reduction in forecast errors compared to traditional time series methods and a state-of-the-art combination model; it is therefore a promising tool for NGD forecasting in similar settings.

INTRODUCTION In the last few decades, there has been a shift in energy sources from fossil fuels to cleaner sources, such as wind and solar energy, mainly due to environmental concerns and related government regulations. However, these latter sources depend on weather conditions and require integration with grid technologies for continuous power generation. Natural gas (NG) typically consists of up to ~95% methane and 2-2.5% ethane-hexane+, with the remainder consisting of nitrogen and CO₂. NG power plants are easy to build and highly reliable, making them invaluable for "clean" energy production. On the other hand, most countries depend on imports to maintain their NG supplies, and there is a delicate balance between imports and domestic demand. Storing imported gas in excess of actual demand is difficult and would result in economic losses, while importing less than actual demand could result in a nationwide shortage.
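The just-in-time idea, fitting a fresh local Gaussian process per query point on the 2-D year-month grid, can be sketched as follows. The data here is synthetic seasonal demand rather than SOCAR's, and the generic anisotropic RBF kernel and its hyperparameters are illustrative assumptions standing in for the paper's NGD-tailored kernel:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical monthly demand history laid out on a (year, month) grid,
# mimicking the paper's 2-D feature representation (synthetic data).
years, months = np.meshgrid(np.arange(2014, 2024), np.arange(1, 13),
                            indexing="ij")
X = np.column_stack([years.ravel(), months.ravel()]).astype(float)
# Seasonal demand: high in winter months, plus observation noise.
y = 100 + 40 * np.cos(2 * np.pi * (X[:, 1] - 1) / 12) + rng.normal(0, 3, len(X))

def rbf(a, b, ls=np.array([2.0, 1.5])):
    # Anisotropic RBF over (year, month); length scales are assumptions.
    d2 = ((a[:, None, :] - b[None, :, :]) / ls) ** 2
    return np.exp(-0.5 * d2.sum(-1))

def jitl_gpr_forecast(x_query, X, y, k=36, noise=9.0):
    """Fit a fresh local GP on the k grid points nearest the query (JITL)."""
    idx = np.argsort(np.linalg.norm(X - x_query, axis=1))[:k]
    Xl, yl = X[idx], y[idx]
    mu = yl.mean()                                  # centre on local mean
    K = rbf(Xl, Xl) * yl.var() + noise * np.eye(k)  # noisy local Gram matrix
    k_star = rbf(x_query[None, :], Xl)[0] * yl.var()
    return mu + k_star @ np.linalg.solve(K, yl - mu)

pred = jitl_gpr_forecast(np.array([2024.0, 1.0]), X, y)  # January 2024
```

Because each query gets its own small GP fit, the per-forecast cost stays low and the model adapts to the local neighbourhood of the grid, which is the JITL property the paper exploits.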


A data-science approach to predict the heat capacity of nanoporous materials - Nature Materials

#artificialintelligence

The heat capacity of a material is a fundamental property of great practical importance. For example, in a carbon capture process, the heat required to regenerate a solid sorbent is directly related to the heat capacity of the material. However, for most materials suitable for carbon capture applications, the heat capacity is not known, and thus the standard procedure is to assume the same value for all materials. In this work, we developed a machine learning approach, trained on density functional theory simulations, to accurately predict the heat capacity of these materials, that is, zeolites, metal–organic frameworks and covalent–organic frameworks. The accuracy of our prediction is confirmed with experimental data. Finally, for a temperature swing adsorption process that captures carbon from the flue gas of a coal-fired power plant, we show that for some materials, the heat requirement is reduced by as much as a factor of two using the correct heat capacity. Heat capacity of nanoporous materials is important for processes such as carbon capture, as this can affect process design energy requirements. Here, a machine learning approach for heat capacity prediction, trained on density functional theory simulations, is presented and experimentally verified.


BBC 4.1 joins the AI revolution with two nights of AI-generated programmes TheINQUIRER

#artificialintelligence

AUNTIE BEEB is embracing the AI revolution with two nights of programming generated by a neural network, offering a juxtaposition of bleeding-edge tech and vintage television. Eagle-eyed viewers will have spotted that 'BBC 4.1 - Artificial Intelligence TV' has been trailed for a couple of weeks, assuring viewers they can 'Relax - It's going to be fine'. Alongside programming about AI itself, 'Made by Machine: When AI met The Archive' will show a range of classic clips drawn from over 250,000 shows since 1953, selected by an AI trained to know what BBC Four is, what it shows and what its viewers will like. The experimental programming has unearthed some 'hidden gems' that haven't been seen in years and that would have taken hundreds of hours of manual research to find - if indeed they were found at all. The slight elephant in the room is that, given that the BBC recycled and junked many master tapes during the 1970s and 1980s, some of the suggestions may no longer exist.


18 exponential changes we can expect in the year ahead

#artificialintelligence

Azeem Azhar is a strategist, product entrepreneur, and analyst living in London. He is the curator of the weekly newsletter Exponential View, from which the following is adapted. This is the first year I am presenting predictions for the coming year. I've received some incredibly helpful comments from readers via Twitter. This has encouraged me to stick my head above the parapet.


Quantum Machine Learning: An Overview

@machinelearnbot

At a recent conference in 2017, Microsoft CEO Satya Nadella used the analogy of a corn maze to explain the difference in approach between a classical computer and a quantum computer. In trying to find a path through the maze, a classical computer would start down a path, hit an obstruction, backtrack; start again, hit another obstruction, backtrack again until it ran out of options. Although an answer can be found, this approach could be very time-consuming. Quantum computers, by contrast, "take every path in the corn maze simultaneously", leading to an exponential reduction in the number of steps required to solve a problem.


Thinking Fast and Slow: An Approach to Energy-Efficient Human Activity Recognition on Mobile Devices

AI Magazine

Inspired by this model, we propose a framework for implementing human activity recognition on mobile devices. In this area, the mobile app is usually always on, and the general challenge is how to balance accuracy and energy consumption. Among existing approaches, those based on cellular IDs consume little power but are less accurate, while those based on GPS/Wi-Fi sampling are accurate but often at the cost of battery drain; moreover, previous methods generally do not improve over time. To address these challenges, our framework consists of two modes: in the deliberation mode, the system learns cell ID patterns that are trained by existing GPS-/Wi-Fi-based methods; in the intuition mode, only the learned cell ID patterns are used for activity recognition, which is both accurate and energy efficient. System parameters are learned to control the transition from deliberation to intuition, when sufficient confidence is gained, and the transition from intuition to deliberation, when more training is needed. For the scope of this paper, we first elaborate our framework on a subproblem of activity recognition, trip detection, which recognizes significant places and trips between them.
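The deliberation/intuition transition logic described here can be sketched as a small state machine; the confidence measure and the two thresholds below are illustrative assumptions, not the paper's learned parameters:

```python
# Minimal sketch of the two-mode control loop; thresholds and the
# confidence signal are hypothetical, not the paper's learned values.
DELIBERATION, INTUITION = "deliberation", "intuition"

class ModeController:
    def __init__(self, enter_intuition=0.9, exit_intuition=0.6):
        self.mode = DELIBERATION
        self.enter, self.exit = enter_intuition, exit_intuition

    def step(self, confidence):
        # confidence: how well the learned cell-ID patterns currently
        # reproduce the GPS/Wi-Fi ground truth (training sufficiency proxy).
        if self.mode == DELIBERATION and confidence >= self.enter:
            self.mode = INTUITION        # cheap cell-ID-only recognition
        elif self.mode == INTUITION and confidence < self.exit:
            self.mode = DELIBERATION     # fall back to GPS/Wi-Fi training
        return self.mode

ctrl = ModeController()
modes = [ctrl.step(c) for c in [0.4, 0.7, 0.95, 0.92, 0.5, 0.3]]
# → deliberation, deliberation, intuition, intuition, deliberation, deliberation
```

Using separate enter/exit thresholds gives hysteresis, so the system does not oscillate between modes when confidence hovers near a single boundary.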


Sustainable Policy Making: A Strategic Challenge for Artificial Intelligence

AI Magazine

Each political decision in fact implies some form of social reaction, affects economic and financial aspects, and has substantial environmental impacts. Improving decision making in this context could therefore have a huge beneficial impact on all these aspects. A number of Artificial Intelligence techniques could play an important role in improving the policy-making process, such as decision support and optimization techniques, game theory, data and opinion mining, and agent-based simulation. We outline here some potential uses of AI technology as they emerged from the European Union (EU) FP7 project ePolicy: Engineering the Policy Making Life Cycle, and we identify some potential research challenges. Policy-making problems are extremely complex, occur in rapidly changing environments characterized by uncertainty, and involve conflicts among different interests.


Artificial Intelligence for Human-Robot Interaction

AI Magazine

The titles of the seven symposia were Artificial Intelligence for Human-Robot Interaction; Energy Market Prediction; Expanding the Boundaries of Health Informatics Using AI; Knowledge, Skill, and Behavior Transfer in Autonomous Robots; Modeling Changing Perspectives: Reconceptualizing Sensorimotor Experiences; Natural Language Access to Big Data; and The Nature of Humans and Machines: A Multidisciplinary Discourse. The highlights of each symposium are presented in this report. The primary goal of the AI for Human-Robot Interaction symposium was to bring together and strengthen the community of researchers working on the AI challenges inherent to human-robot interaction (HRI). HRI is an extremely interesting problem domain for AI and robotics research. It aims to develop robots that are intelligent, autonomous, and capable of interacting with, modeling, and learning from humans.