Temporal Pattern Attention for Multivariate Time Series Forecasting

arXiv.org Machine Learning

Forecasting multivariate time series data, such as electricity consumption, solar power production, and polyphonic piano pieces, has numerous valuable applications. However, complex and non-linear interdependencies between time steps and between series complicate the task. To obtain accurate predictions, it is crucial to model long-term dependencies in time series data, which can be achieved reasonably well by a recurrent neural network (RNN) with an attention mechanism. A typical attention mechanism reviews the information at each previous time step and selects the relevant information to help generate the output, but it fails to capture temporal patterns that span multiple time steps. In this paper, we propose to use a set of filters to extract time-invariant temporal patterns, which is akin to transforming the time series into a "frequency domain". We then propose a novel attention mechanism that selects the relevant time series and uses their "frequency-domain" information for forecasting. We applied the proposed model to several real-world tasks and achieved state-of-the-art performance on all but one of them. We also show that, to some degree, the learned filters play the role of the bases in the discrete Fourier transform.
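To make the filter-plus-attention idea concrete, here is a minimal PyTorch sketch of temporal pattern attention. All sizes (hidden width, window length, filter count) and layer names are illustrative assumptions, not the paper's settings: filters of the window's length convolve along the time axis of the past hidden-state matrix to extract one pattern activation per filter and row, and attention then scores rows (variables/patterns) rather than time steps.

```python
import torch
import torch.nn as nn


class TemporalPatternAttention(nn.Module):
    """Attention over CNN-extracted temporal patterns (illustrative sketch).

    H collects the previous RNN hidden states over a window of w steps;
    k filters of length w convolve along the time axis of each row,
    giving a (hidden, k) matrix of pattern activations, and attention
    scores rows (patterns) instead of individual time steps.
    """

    def __init__(self, hidden: int, window: int, k: int = 32):
        super().__init__()
        self.conv = nn.Conv2d(1, k, kernel_size=(1, window))  # 1 x w filters
        self.w_a = nn.Linear(k, hidden, bias=False)   # bilinear scoring matrix
        self.w_h = nn.Linear(hidden, hidden, bias=False)
        self.w_v = nn.Linear(k, hidden, bias=False)

    def forward(self, H: torch.Tensor, h_t: torch.Tensor) -> torch.Tensor:
        # H: (batch, hidden, window) past states; h_t: (batch, hidden) current
        hc = self.conv(H.unsqueeze(1)).squeeze(-1).transpose(1, 2)  # (b, hidden, k)
        # relevance of each pattern row to the current hidden state
        scores = torch.einsum("bij,bj->bi", self.w_a(hc), h_t)
        alpha = torch.sigmoid(scores)      # sigmoid, so several rows can fire
        v = torch.einsum("bi,bik->bk", alpha, hc)   # weighted pattern summary
        return self.w_h(h_t) + self.w_v(v)  # combined state for the output layer


attn = TemporalPatternAttention(hidden=64, window=24)
out = attn(torch.randn(8, 64, 24), torch.randn(8, 64))  # (8, 64)
```

Using a sigmoid instead of a softmax lets several series contribute at once, which fits the abstract's goal of selecting relevant time series rather than a single time step.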


Machine Learning on EPEX Order Books: Insights and Forecasts

arXiv.org Machine Learning

Forecasting electricity prices is an important task for energy utilities, needed not only for proprietary trading but also for the optimisation of power plant production schedules and other technical issues. A promising approach in power price forecasting is based on recalculating the order book using forecasts of market fundamentals such as demand or renewable infeed. However, this approach requires extensive statistical analysis of market data. In this paper, we examine if and how this statistical work can be reduced using machine learning. Our paper focuses on two research questions:

- How can order books from electricity markets be included in machine learning algorithms?
- How can order-book-based spot price forecasts be improved using machine learning?

We consider the German/Austrian EPEX spot market for electricity, where a daily auction is held for electricity with delivery the next day and all 24 hours of the day are traded as separate products.
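On the first research question, one plausible encoding (a sketch under assumptions, not the paper's actual feature set) is to aggregate each hourly auction's orders into fixed price bins, so every product maps to a constant-length vector that a standard regressor can consume alongside fundamentals such as load or wind forecasts. The sign convention and grid below are illustrative.

```python
import numpy as np


def order_book_features(prices, volumes, price_grid):
    """Map one hourly auction's orders onto a fixed-length vector.

    prices, volumes: order prices (EUR/MWh) and volumes (MWh); here we
    assume positive volume means supply and negative means demand (an
    illustrative convention, not necessarily EPEX's raw format).
    price_grid: increasing bin edges covering the allowed price range.
    Returns cumulative supply and demand volume per price bin, i.e. a
    coarse discretisation of the aggregated supply/demand curves.
    """
    bins = np.digitize(prices, price_grid)
    supply = np.zeros(len(price_grid) + 1)
    demand = np.zeros(len(price_grid) + 1)
    np.add.at(supply, bins, np.clip(volumes, 0, None))
    np.add.at(demand, bins, np.clip(-volumes, 0, None))
    # cumulative sums approximate the aggregated curves whose crossing
    # determines the market clearing price
    return np.concatenate([np.cumsum(supply), np.cumsum(demand)])


# one feature vector per hourly product of the daily auction
grid = np.linspace(-500, 3000, 50)
x = order_book_features(np.array([-10.0, 35.5, 80.0]),
                        np.array([120.0, -300.0, 50.0]), grid)
```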


Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting

arXiv.org Machine Learning

Time series forecasting is an important problem across many domains, including predicting solar plant energy output, electricity consumption, and traffic congestion. In this paper, we propose to tackle such forecasting problems with the Transformer. Although impressed by its performance in our preliminary study, we found two major weaknesses: (1) locality-agnostic attention: the point-wise dot-product self-attention in the canonical Transformer architecture is insensitive to local context, which can make the model prone to anomalies in the time series; (2) memory bottleneck: the space complexity of the canonical Transformer grows quadratically with the sequence length $L$, making modeling long time series infeasible. To address these two issues, we first propose convolutional self-attention, which produces queries and keys with causal convolution so that local context is better incorporated into the attention mechanism. We then propose the LogSparse Transformer with only $O(L(\log L)^{2})$ memory cost, improving forecasting for fine-grained time series under a constrained memory budget. Our experiments on both synthetic data and real-world datasets show that our approach compares favorably to the state of the art.
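A minimal sketch of the convolutional self-attention idea, with assumed kernel size and dimensions: queries and keys are produced by a causal 1-D convolution (left padding only), so each position's score reflects the local shape of the series rather than a single point; values stay pointwise, as in the canonical Transformer. The LogSparse part (restricting each position to exponentially spaced past steps) is omitted here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConvAttention(nn.Module):
    """Self-attention with convolutional queries/keys (illustrative sketch)."""

    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        self.pad = kernel_size - 1                    # left-pad => causal conv
        self.q_conv = nn.Conv1d(d_model, d_model, kernel_size)
        self.k_conv = nn.Conv1d(d_model, d_model, kernel_size)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model)
        xc = F.pad(x.transpose(1, 2), (self.pad, 0))  # pad the past side only
        q = self.q_conv(xc).transpose(1, 2)           # local-context queries
        k = self.k_conv(xc).transpose(1, 2)           # local-context keys
        v = self.v_proj(x)                            # pointwise values
        scores = q @ k.transpose(1, 2) / x.size(-1) ** 0.5
        # causal mask: position t may only attend to steps <= t
        L = x.size(1)
        mask = torch.triu(torch.ones(L, L, dtype=torch.bool, device=x.device),
                          diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
        return torch.softmax(scores, dim=-1) @ v


layer = CausalConvAttention(d_model=32)
y = layer(torch.randn(4, 48, 32))   # (4, 48, 32)
```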


A Memory-Network Based Solution for Multivariate Time-Series Forecasting

arXiv.org Machine Learning

Multivariate time series forecasting has been studied extensively over the years, with ubiquitous applications in areas such as finance, traffic, and the environment. Still, concerns have been raised that traditional methods are incapable of modeling the complex patterns and dependencies in real-world data. To address these concerns, various deep learning models, mainly recurrent neural network (RNN) based methods, have been proposed. Nevertheless, capturing extremely long-term patterns while effectively incorporating information from other variables remains a challenge for time series forecasting. Furthermore, a lack of explainability remains a serious drawback of deep neural network models. Inspired by the Memory Network proposed for the question-answering task, we propose a deep learning based model named Memory Time-series network (MTNet) for time series forecasting. MTNet consists of a large memory component, three separate encoders, and an autoregressive component that are trained jointly. Additionally, the designed attention mechanism makes MTNet highly interpretable: we can easily tell which part of the historic data is referenced the most.
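The memory-lookup idea can be sketched in a few lines, under assumptions: the long history is split into blocks, each encoded to a vector (the encoders are omitted here), and the encoding of the recent window attends over those blocks. The softmax weights then show which part of the history is referenced the most, which is the interpretability hook the abstract describes. Names and shapes are illustrative.

```python
import numpy as np


def memory_block_attention(memory, query):
    """Read from a memory of encoded history blocks (illustrative sketch).

    memory: (n_blocks, d) one encoded vector per historical block
    query:  (d,) encoding of the recent input window
    Returns the attention-weighted context and the per-block weights.
    """
    scores = memory @ query                 # one relevance score per block
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                # softmax over blocks
    context = weights @ memory              # weighted read from memory
    return context, weights


mem = np.random.randn(6, 16)                # 6 history blocks, d = 16
q = np.random.randn(16)
ctx, w = memory_block_attention(mem, q)
print(np.argmax(w))                         # index of the most-referenced block
```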


NASA Applies IntelAI's Machine Learning Methods to Search for Space Resources – technerdbites

#artificialintelligence

The State Government of South Australia announced a contract with Solar Reserve to build a 150 MW solar thermal power plant at Port Augusta, South Australia. This is in addition to the state-owned gas plant and the recently announced contract with Tesla for the world's largest lithium-ion battery.