Exploring the Advantages of Transformers for High-Frequency Trading

Barez, Fazl, Bilokon, Paul, Gervais, Arthur, Lisitsyn, Nikita

arXiv.org Artificial Intelligence 

Forecasting Financial Time Series (FTS) has long been of interest to market participants seeking profitable trades on the financial markets. It has historically been approached using stochastic and machine learning models. Stochastic methods include linear models such as Autoregressive Integrated Moving Average (ARIMA) [1], which supports non-stationary time series, and non-linear models, including the Generalized Autoregressive Conditional Heteroskedasticity (GARCH) [2] model. Machine learning methods are data-driven approaches, among which Recurrent Neural Networks (RNNs) [3], and more specifically Long Short-Term Memory (LSTM) networks [4], have been especially popular for time series prediction. New deep learning models are periodically adopted in quantitative research in search of more accurate FTS forecasts that would lead to more efficient trading strategies. Recently, a new type of deep learning [5] architecture called the Transformer [6], relying on Attention [7], was introduced for Natural Language Processing (NLP) applications. Transformers have since been used in other applications, such as computer vision tasks [8], and more recently in time series forecasting. This paper focuses on the application of Transformers to high-frequency FTS forecasting. FTS are characterized by properties including frequency, auto-correlation, heteroskedasticity, drift, and seasonality [9].
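The Attention mechanism the abstract refers to can be illustrated with a minimal sketch of scaled dot-product attention, the core operation of the Transformer. This is a generic NumPy illustration, not code from the paper; the function name and the toy shapes (4 time steps, 8-dimensional embeddings) are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted sum of the values

# Toy example: 4 time steps, embedding dimension 8 (shapes are illustrative)
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

In a time series setting, each row of Q, K, and V would correspond to the embedding of one time step, so every output time step attends over the whole input window rather than processing it sequentially as an RNN does.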
