Bilokon, Paul
Applying Deep Learning to Calibrate Stochastic Volatility Models
Sridi, Abir, Bilokon, Paul
Stochastic volatility models, in which the volatility is itself a stochastic process, can capture most of the essential stylized facts of implied volatility surfaces and give more realistic dynamics of the volatility smile/skew. However, they suffer from a significant drawback: they are slow to calibrate. Alternative calibration methods based on Deep Learning (DL) techniques have recently been used to build fast and accurate solutions to the calibration problem. Huge and Savine developed a Differential Machine Learning (DML) approach, in which Machine Learning models are trained on samples of not only features and labels but also differentials of labels with respect to features. The present work applies the DML technique to price vanilla European options (i.e. the calibration instruments), more specifically puts, when the underlying asset follows a Heston model, and then calibrates the model on the trained network. DML allows for fast training and accurate pricing, and the trained neural network dramatically reduces the computation time of Heston calibration. In this work, we also introduce different regularisation techniques and apply them, notably, in the case of DML, comparing their performance in reducing overfitting and improving the generalisation error. The performance of DML is also compared to that of classical DL (without differentiation) in the case of Feed-Forward Neural Networks; we show that DML outperforms DL. The complete code for our experiments is provided in the GitHub repository: https://github.com/asridi/DML-Calibration-Heston-Model
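To make the DML idea concrete, the following is a minimal sketch in PyTorch (with hypothetical names and a toy input dimension; it is not the implementation from the repository above) of a training step whose loss penalises errors in both the predicted prices and their differentials with respect to the inputs:

```python
import torch
import torch.nn as nn

# Sketch of a Differential Machine Learning training step (after Huge & Savine):
# the loss penalises errors in both the predicted price and the predicted
# differentials of the price with respect to the model inputs.
net = nn.Sequential(nn.Linear(5, 64), nn.Softplus(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def dml_step(x, y, dydx, lam=1.0):
    """x: inputs (e.g. Heston parameters, strike, maturity); y: sampled
    payoffs; dydx: pathwise differentials of y with respect to x."""
    x = x.requires_grad_(True)
    y_hat = net(x)
    # Differentials of the network output with respect to its inputs.
    (dydx_hat,) = torch.autograd.grad(y_hat.sum(), x, create_graph=True)
    loss = nn.functional.mse_loss(y_hat, y) \
         + lam * nn.functional.mse_loss(dydx_hat, dydx)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

The weight `lam` on the differential term acts as a regulariser: tuning it trades off fit to the sampled payoffs against fit to their sensitivities.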
Transformers versus LSTMs for electronic trading
Bilokon, Paul, Qiu, Yitao
With the rapid development of artificial intelligence, the long short-term memory (LSTM) network, a kind of recurrent neural network (RNN), has been widely applied to time series prediction. Like the RNN, the Transformer is designed to handle sequential data. As the Transformer achieved great success in Natural Language Processing (NLP), researchers became interested in its performance on time series prediction, and many Transformer-based solutions for long time series forecasting have been proposed recently. However, when it comes to financial time series prediction, the LSTM remains the dominant architecture. The question this study therefore aims to answer is whether Transformer-based models can be applied to financial time series prediction and beat the LSTM. To answer this question, various LSTM-based and Transformer-based models are compared on multiple financial prediction tasks based on high-frequency limit order book data. A new LSTM-based model called DLSTM is built, and a new architecture for the Transformer-based model is designed to adapt it to financial prediction. The experimental results show that Transformer-based models have only a limited advantage in absolute price sequence prediction, while LSTM-based models show better and more robust performance on difference sequence prediction, such as price difference and price movement.
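For illustration, a generic LSTM baseline for next-step price-difference prediction from limit order book snapshots might look as follows (a minimal sketch with assumed dimensions, not the DLSTM architecture proposed in the paper):

```python
import torch
import torch.nn as nn

# Generic LSTM baseline for next-step price-difference prediction
# (illustrative only; not the paper's DLSTM architecture).
class PriceDiffLSTM(nn.Module):
    def __init__(self, n_features=40, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):              # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # predicted next price difference

model = PriceDiffLSTM()
x = torch.randn(32, 100, 40)  # e.g. 100 limit-order-book snapshots, 40 features
pred = model(x)               # shape: (32, 1)
```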
From Deep Filtering to Deep Econometrics
Stok, Robert, Bilokon, Paul
Calculating true volatility is an essential task for option pricing and risk management. However, it is made difficult by market microstructure noise. Particle filtering has been proposed to solve this problem, as it has favorable statistical properties, but it relies on assumptions about the underlying market dynamics. Machine learning methods have also been proposed, but they lack interpretability and often lag in performance. In this paper we implement the SV-PF-RNN: a hybrid neural network and particle filter architecture. Our SV-PF-RNN is designed specifically with stochastic volatility estimation in mind. We then show that it can improve on the performance of a basic particle filter.
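As a point of reference, a basic bootstrap particle filter for a standard log-volatility model (the kind of baseline the SV-PF-RNN is compared against; the parameter values below are illustrative assumptions, not those of the paper) can be sketched as follows:

```python
import numpy as np

# Bootstrap particle filter for a standard stochastic-volatility model:
#   h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t,   eta_t ~ N(0, 1)
#   y_t = exp(h_t / 2) * eps_t,                            eps_t ~ N(0, 1)
def sv_particle_filter(y, n=1000, mu=-1.0, phi=0.97, sigma_eta=0.15, seed=0):
    rng = np.random.default_rng(seed)
    # Initialise particles from the stationary distribution of h_t.
    h = mu + sigma_eta / np.sqrt(1.0 - phi**2) * rng.standard_normal(n)
    est = np.empty(len(y))
    for t, yt in enumerate(y):
        # Propagate particles through the log-volatility transition.
        h = mu + phi * (h - mu) + sigma_eta * rng.standard_normal(n)
        # Weight particles by the observation likelihood of the return y_t.
        var = np.exp(h)
        w = np.exp(-0.5 * yt**2 / var) / np.sqrt(2.0 * np.pi * var)
        w /= w.sum()
        est[t] = w @ h  # filtered mean of the log-volatility
        # Multinomial resampling to avoid weight degeneracy.
        h = h[rng.choice(n, size=n, p=w)]
    return est
```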
A compendium of data sources for data science, machine learning, and artificial intelligence
Bilokon, Paul, Bilokon, Oleksandr, Amen, Saeed
Recent advances in data science, machine learning, and artificial intelligence, such as the emergence of large language models, are leading to an increasing demand for data that can be processed by such models. While data sources are application-specific, and it is impossible to produce an exhaustive list of such data sources, it seems that a comprehensive, rather than complete, list would still benefit data scientists and machine learning experts of all levels of seniority. The goal of this publication is to provide just such an (inevitably incomplete) list -- or compendium -- of data sources across multiple areas of application, including finance and economics, legal (laws and regulations), life sciences (medicine and drug discovery), news sentiment and social media, retail and ecommerce, satellite imagery, shipping and logistics, and sports.
Exploring the Advantages of Transformers for High-Frequency Trading
Barez, Fazl, Bilokon, Paul, Gervais, Arthur, Lisitsyn, Nikita
Forecasting Financial Time Series (FTS) has long been of interest to financial market participants seeking to make profitable trades. It has historically been approached using stochastic and machine learning models. Stochastic methods include linear models such as the Autoregressive Integrated Moving Average (ARIMA) [1], which supports non-stationary time series, and non-linear models, including the Generalized Autoregressive Conditional Heteroskedasticity (GARCH) [2] model. Machine learning methods are data-driven approaches, among which Recurrent Neural Networks (RNNs) [3], and more specifically Long Short-Term Memory (LSTM) networks [4], have been especially popular for time series prediction. New deep learning models are periodically adopted in quantitative research in the search for the most accurate FTS forecasting models, which would lead to more efficient trading strategies. Recently, a new type of deep learning [5] architecture called the Transformer [6], relying on Attention [7], was introduced for Natural Language Processing (NLP) applications. Transformers have since been used in other applications, such as computer vision tasks [8], and more recently in time series forecasting. This paper focuses on the application of Transformers to high-frequency FTS forecasting. FTS are characterized by properties including frequency, auto-correlation, heteroskedasticity, drift, and seasonality [9].
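The core operation of the Transformer [6] is scaled dot-product attention [7], sketched below in plain NumPy for a single self-attention pass over a sequence of FTS feature vectors (the dimensions are illustrative assumptions):

```python
import numpy as np

# Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V,
# the building block of the Transformer architecture [6, 7].
def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

T, d_k = 50, 16                      # e.g. 50 time steps of FTS features
Q = K = V = np.random.randn(T, d_k)  # self-attention over the sequence
out = attention(Q, K, V)             # shape: (T, d_k)
```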
Identification and validation of Triamcinolone and Gallopamil as treatments for early COVID-19 via an in silico repurposing pipeline
MacMahon, Méabh, Hwang, Woochang, Yim, Soorin, MacMahon, Eoghan, Abraham, Alexandre, Barton, Justin, Tharmakulasingam, Mukunthan, Bilokon, Paul, Gaddi, Vasanthi Priyadarshini, Han, Namshik
SARS-CoV-2, the causative virus of COVID-19, continues to cause an ongoing global pandemic. Therapeutics are still needed to treat both mild and severe COVID-19. Drug repurposing provides an opportunity to deploy drugs for COVID-19 more rapidly than developing novel therapeutics, and some existing drugs have already shown promise for treating COVID-19 in clinical trials. This in silico study uses structural similarity to clinical trial drugs to identify two drugs with potential applications in treating early COVID-19. We apply in silico validation to suggest a possible mechanism of action for both. Triamcinolone is a corticosteroid structurally similar to Dexamethasone; Gallopamil is a calcium channel blocker structurally similar to Verapamil. We propose that both drugs could be useful in treating early COVID-19 infection due to the proximity of their targets, within a SARS-CoV-2-induced protein-protein interaction network, to kinases active in early infection and to the APOA1 protein, which is linked to the spread of COVID-19.