Can We Achieve Early Earthquake Prediction And Warning?

#artificialintelligence

Earthquakes claim thousands of lives every decade. Of all natural calamities, the earthquake is the hardest to predict. Even when people have succeeded in doing so, their predictions were vaguely based on the behavior of animals minutes before the seismic waves hit the region. However, artificial intelligence algorithms can help us receive early warnings of a potential earthquake and prepare accordingly. Using machine-learning models, seismologists can analyze hordes of data on thousands of earthquakes.


European Space Agency enlists Airbus to help it build a Mars rover to retrieve rock samples

Daily Mail - Science & tech

The European Space Agency (ESA) has enlisted Airbus to help it build a Mars rover called 'Fetch' that will find and retrieve rock samples on the Red Planet. The defence and space arm of the aerospace corporation won the upcoming phase of the contract to develop the robot as part of the 'Mars Sample Return' mission. Fetch will travel across the surface of the Red Planet in 2028, collecting packaged rock samples left behind by NASA's Perseverance rover. The four-wheeled ESA rover will ultimately transport these samples to a 'Mars Ascent Vehicle', which will carry them into orbit for collection by another ESA spacecraft.


AI in BCI: The new era of human factor design and research

#artificialintelligence

Over the past years, progress in Artificial Intelligence and Neuroscience has made it possible for brain activity to interact with computers and other devices. In particular, the advancement of signal processing methodologies such as electroencephalography (EEG), combined with AI-powered algorithms, has enabled us to delve into the world of Brain-Computer Interfaces and to talk about a new era of human factor design and research. Brain-Computer Interfaces refer to devices that allow users to interact with computers by measuring brain activity through EEG, which captures the energy and frequency patterns of the brain. There are currently two types of Brain-Computer Interfaces, invasive and non-invasive, and although both have their benefits, in this article we will focus on the non-invasive BCIs. By combining knowledge from Artificial Intelligence, and specifically Machine Learning, Brain-Computer Interfaces have become a vital tool in improving the accuracy and reliability of usability testing and user experience research, allowing us to talk about a new era of human factor design.
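
A minimal sketch of the kind of non-invasive BCI pipeline described above: band-power features are extracted from EEG epochs and fed to a standard classifier. The sampling rate, channel count, frequency bands and the synthetic data are illustrative assumptions, not details from the article.

```python
# Band-power features from (synthetic) EEG epochs feeding a classifier.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

fs = 250  # assumed sampling rate in Hz
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch):
    """Average power in each frequency band for one (channels x samples) epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    feats = []
    for lo, hi in bands.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

# Synthetic stand-in for recorded epochs: 100 trials, 8 channels, 2 s each.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, 8, 2 * fs))
labels = rng.integers(0, 2, size=100)  # e.g. two mental states

X = np.array([band_powers(e) for e in epochs])
clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, X, labels, cv=5).mean())
```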


Generating Fact Checking Explanations

arXiv.org Artificial Intelligence

Most existing work on automated fact checking is concerned with predicting the veracity of claims based on metadata, social network spread, language used in claims, and, more recently, evidence supporting or denying claims. A crucial piece of the puzzle that is still missing is to understand how to automate the most elaborate part of the process -- generating justifications for verdicts on claims. This paper provides the first study of how these explanations can be generated automatically based on available claim context, and how this task can be modelled jointly with veracity prediction. Our results indicate that optimising both objectives at the same time, rather than training them separately, improves the performance of a fact checking system. The results of a manual evaluation further suggest that the informativeness, coverage and overall quality of the generated explanations are also improved in the multi-task model.
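
A hedged sketch of the joint-training idea in the abstract: a shared encoder feeds both a veracity classifier and an explanation generator, and the two losses are optimised together rather than separately. The layer sizes, toy batch and single linear heads are illustrative assumptions, not the paper's architecture.

```python
# Shared encoder with two heads, trained on a joint objective.
import torch
import torch.nn as nn

vocab, hidden, n_labels = 1000, 128, 3  # assumed sizes

encoder = nn.Embedding(vocab, hidden)        # stand-in for a text encoder
veracity_head = nn.Linear(hidden, n_labels)  # claim-level verdict
explain_head = nn.Linear(hidden, vocab)      # token-level generation head

tokens = torch.randint(0, vocab, (4, 50))        # toy batch: 4 claims, 50 tokens each
verdicts = torch.randint(0, n_labels, (4,))
target_expl = torch.randint(0, vocab, (4, 50))   # reference justification tokens

h = encoder(tokens)                              # (4, 50, hidden)
loss_veracity = nn.functional.cross_entropy(veracity_head(h.mean(dim=1)), verdicts)
loss_explain = nn.functional.cross_entropy(
    explain_head(h).reshape(-1, vocab), target_expl.reshape(-1))
loss = loss_veracity + loss_explain              # optimise both objectives at once
loss.backward()
```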


Mixture Density Conditional Generative Adversarial Network Models (MD-CGAN)

arXiv.org Machine Learning

Generative Adversarial Networks (GANs) have gained significant attention in recent years, with particularly impressive applications highlighted in computer vision. In this work, we present a Mixture Density Conditional Generative Adversarial Model (MD-CGAN), where the generator is a Gaussian mixture model, with a focus on time series forecasting. Compared to examples in vision, there have been more limited applications of GAN models to time series. We show that our model is capable of estimating a probabilistic posterior distribution over forecasts and that, in comparison to a set of benchmark methods, the MD-CGAN model performs well, particularly in situations where noise is a significant component of the time series. Further, by using a Gaussian mixture model that allows for a flexible number of mixture coefficients, the MD-CGAN offers posterior distributions that are non-Gaussian.
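
An illustrative sketch of a mixture-density generator head of the kind the abstract describes: given a conditioning history and a noise vector, it emits weights, means and scales of a Gaussian mixture over the next value, from which forecasts can be sampled. The network sizes, number of components and conditioning window are assumptions for clarity, not the paper's configuration.

```python
# Generator that outputs Gaussian-mixture parameters for the next time step.
import torch
import torch.nn as nn

K, cond_dim, noise_dim, hidden = 3, 10, 4, 64  # assumed sizes

class MixtureDensityGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(cond_dim + noise_dim, hidden), nn.ReLU())
        self.pi = nn.Linear(hidden, K)          # mixture weights (logits)
        self.mu = nn.Linear(hidden, K)          # component means
        self.log_sigma = nn.Linear(hidden, K)   # component scales

    def forward(self, cond, z):
        h = self.body(torch.cat([cond, z], dim=-1))
        return torch.softmax(self.pi(h), -1), self.mu(h), self.log_sigma(h).exp()

gen = MixtureDensityGenerator()
cond = torch.randn(8, cond_dim)   # last 10 observations of each series (toy data)
z = torch.randn(8, noise_dim)
pi, mu, sigma = gen(cond, z)
# Draw one forecast per series from the (possibly non-Gaussian) mixture posterior.
comp = torch.distributions.Categorical(pi).sample()
forecast = torch.normal(mu.gather(1, comp[:, None]), sigma.gather(1, comp[:, None]))
```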


Autonomous discovery in the chemical sciences part I: Progress

arXiv.org Artificial Intelligence

This two-part review examines how automation has contributed to different aspects of discovery in the chemical sciences. In this first part, we describe a classification for discoveries of physical matter (molecules, materials, devices), processes, and models and how they are unified as search problems. We then introduce a set of questions and considerations relevant to assessing the extent of autonomy. Finally, we describe many case studies of discoveries accelerated by or resulting from computer assistance and automation from the domains of synthetic chemistry, drug discovery, inorganic chemistry, and materials science. These illustrate how rapid advancements in hardware automation and machine learning continue to transform the nature of experimentation and modelling. Part two reflects on these case studies and identifies a set of open challenges for the field.


aphBO-2GP-3B: A budgeted asynchronously-parallel multi-acquisition for known/unknown constrained Bayesian optimization on high-performing computing architecture

arXiv.org Machine Learning

High-fidelity complex engineering simulations are highly predictive, but also computationally expensive and often require substantial computational effort. The computational burden is usually mitigated through parallelism on high-performance computing (HPC) architectures. In this paper, an asynchronous constrained batch-parallel Bayesian optimization method is proposed to efficiently solve computationally expensive, simulation-based optimization problems on an HPC platform with a budgeted computational resource, where the maximum number of simulations is a constant. The advantages of this method are three-fold. First, the efficiency of the Bayesian optimization is improved, where multiple input locations are evaluated in a massively parallel, asynchronous manner to accelerate the optimization convergence with respect to physical runtime. This efficiency feature is further improved so that, as soon as the evaluation of one input finishes, another input is queried without waiting for the whole batch to complete. Second, the method can handle both known and unknown constraints. Third, the proposed method considers several acquisition functions at the same time and samples among them according to an evolving probability mass function, following the GP-Hedge scheme, whose parameters correspond to the performance of each acquisition function. The proposed framework is termed aphBO-2GP-3B, which corresponds to asynchronous parallel hedge Bayesian optimization with two Gaussian processes and three batches. The aphBO-2GP-3B framework is demonstrated on two expensive, high-fidelity industrial applications, the first based on finite element analysis (FEA) and the second on computational fluid dynamics (CFD) simulations.
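
A minimal sketch of the GP-Hedge idea referenced in the abstract: each acquisition function nominates a candidate, one nominee is chosen by a softmax over cumulative gains, and every acquisition's gain is updated by a score at its own nominee so that better-performing acquisitions gain probability over time. The nominees, the reward surrogate and the temperature are toy placeholders, not the aphBO-2GP-3B implementation.

```python
# Toy GP-Hedge selection loop over three acquisition functions.
import numpy as np

rng = np.random.default_rng(0)
acq_names = ["EI", "PI", "UCB"]
gains = np.zeros(len(acq_names))
eta = 1.0  # hedge temperature (assumed)

for step in range(20):
    # Each acquisition proposes a point; here a random stand-in for maximising it.
    nominees = rng.uniform(0, 1, size=(len(acq_names), 2))
    probs = np.exp(eta * (gains - gains.max()))
    probs /= probs.sum()
    chosen = rng.choice(len(acq_names), p=probs)   # evaluate this acquisition's nominee
    # In GP-Hedge, every acquisition's gain is updated with the GP posterior mean
    # at its own nominee; a toy quadratic stands in for that posterior mean here.
    gains += -np.sum((nominees - 0.3) ** 2, axis=1)

print(dict(zip(acq_names, np.round(gains, 2))))
```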


An Evaluation of Change Point Detection Algorithms

arXiv.org Machine Learning

Change point detection is an important part of time series analysis, as the presence of a change point indicates an abrupt and significant change in the data generating process. While many algorithms for change point detection exist, little attention has been paid to evaluating their performance on real-world time series. Algorithms are typically evaluated on simulated data and a small number of commonly-used series with unreliable ground truth. Clearly this does not provide sufficient insight into the comparative performance of these algorithms. Therefore, instead of developing yet another change point detection method, we consider it vastly more important to properly evaluate existing algorithms on real-world data. To achieve this, we present the first data set specifically designed for the evaluation of change point detection algorithms, consisting of 37 time series from various domains. Each time series was annotated by five expert human annotators to provide ground truth on the presence and location of change points. We analyze the consistency of the human annotators, and describe evaluation metrics that can be used to measure algorithm performance in the presence of multiple ground truth annotations. Subsequently, we present a benchmark study where 13 existing algorithms are evaluated on each of the time series in the data set. This study shows that binary segmentation (Scott and Knott, 1974) and Bayesian online change point detection (Adams and MacKay, 2007) are among the best performing methods. Our aim is that this data set will serve as a proving ground in the development of novel change point detection algorithms.
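
One plausible way to score an algorithm against several human annotations, in the spirit of the evaluation described above: a detected change point counts as a hit if it falls within a margin of any annotator's mark, and recall is averaged over annotators. The margin and the exact notion of a hit are illustrative choices, not necessarily the metrics defined in the paper.

```python
# Margin-based F1 against multiple ground-truth annotations.
import numpy as np

def f1_multi_annotator(detected, annotations, margin=5):
    detected = np.asarray(detected)
    all_marks = np.concatenate(annotations)
    hits = [d for d in detected if np.any(np.abs(all_marks - d) <= margin)]
    precision = len(hits) / max(len(detected), 1)
    recalls = []
    for marks in annotations:
        found = sum(np.any(np.abs(detected - m) <= margin) for m in marks)
        recalls.append(found / max(len(marks), 1))
    recall = float(np.mean(recalls))
    return 2 * precision * recall / max(precision + recall, 1e-12)

# Example: three annotators, one algorithm's detections.
annotations = [np.array([100, 250]), np.array([105, 255, 400]), np.array([98])]
print(f1_multi_annotator([102, 260, 500], annotations, margin=10))
```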


A new hybrid approach for crude oil price forecasting: Evidence from multi-scale data

arXiv.org Machine Learning

With growing research into the factors that influence crude oil price fluctuations, and following the accelerated development of Internet technology, accessible data such as the Google search volume index (GSVI) are increasingly quantified and incorporated into forecasting approaches. In this paper, we use multi-scale data, including both GSVI data and traditional economic data related to the crude oil price, as independent variables and propose a new hybrid approach for monthly crude oil price forecasting. This hybrid approach, based on a divide-and-conquer strategy, consists of the K-means method, kernel principal component analysis (KPCA) and the kernel extreme learning machine (KELM), where K-means is adopted to divide the input data into clusters, KPCA is applied to reduce dimension, and KELM is employed for the final crude oil price forecast. The empirical results can be analyzed at the data and method levels. At the data level, GSVI data perform better than economic data in level forecasting accuracy but show the opposite in directional forecasting accuracy because of herd behavior, while the hybrid data combine their advantages and obtain the best forecasting performance in both level and directional accuracy. At the method level, the approaches with K-means perform better than those without, which demonstrates that the divide-and-conquer strategy can effectively improve forecasting performance.
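
A hedged sketch of the divide-and-conquer pipeline outlined above: K-means splits the inputs into clusters, kernel PCA reduces dimension within each cluster, and a minimal kernel extreme learning machine (a closed-form, ridge-style kernel solution) produces the forecast. The synthetic data, hyperparameters and in-sample fit are illustrative assumptions, not the paper's settings.

```python
# K-means -> KernelPCA -> minimal KELM, per cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import KernelPCA
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 12))          # stand-in for GSVI + economic indicators
y = X[:, 0] * 2 + rng.standard_normal(200)  # stand-in for the monthly oil price

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

def kelm_fit_predict(X_tr, y_tr, X_te, C=10.0, gamma=0.1):
    """Minimal KELM: beta = (K + I/C)^-1 y, prediction = K_te @ beta."""
    K = rbf_kernel(X_tr, X_tr, gamma=gamma)
    beta = np.linalg.solve(K + np.eye(len(X_tr)) / C, y_tr)
    return rbf_kernel(X_te, X_tr, gamma=gamma) @ beta

preds = np.zeros_like(y)
for c in np.unique(clusters):
    idx = np.where(clusters == c)[0]
    Z = KernelPCA(n_components=4, kernel="rbf").fit_transform(X[idx])
    # In-sample fit within each cluster, just to illustrate the pipeline shape.
    preds[idx] = kelm_fit_predict(Z, y[idx], Z)
print(np.mean((preds - y) ** 2))
```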


Highway Administration to Explore How AI and Blockchain Can Transform Transportation

#artificialintelligence

The Federal Highway Administration launched an Exploratory Advanced Research Program this week to usher in "transformational changes and truly revolutionary advances" in highway engineering and intermodal transportation on roads across the United States. According to a new broad agency announcement, the administration is accepting research effort proposals--with the deliberate intent of awarding either contracts or cooperative agreements--that address three trendy topics in emerging tech: blockchain for highway transportation, artificial intelligence for highway transportation, and incorporating trashed plastic into asphalt cement to reduce waste. "This program supports scientific investigations and studies that advance the current knowledge and state-of-the-art in the sciences and technologies employed in the planning, design, construction, operation, maintenance and management of the nation's highways," officials wrote in the announcement. "Strategically, this research will enable and expedite the development of revolutionary approaches, methodologies, and breakthroughs required to drive innovation and greatly improve the efficiency of highway transportation." The agency's EAR programs aim to produce strong public-private partnerships that catalyze solutions through "longer-term, higher risk" research.