Dynamic Boltzmann Machines for Second Order Moments and Generalized Gaussian Distributions

arXiv.org Machine Learning

Dynamic Boltzmann Machines (DyBMs) have been shown to be highly effective at predicting time-series data. The Gaussian DyBM is a DyBM that assumes the predicted data are generated by a Gaussian distribution whose first-order moment (mean) changes dynamically over time while its second-order moment (variance) is fixed. In many financial applications, however, this assumption is limiting in two respects. First, even when the data follow a Gaussian distribution, the variance may change over time; such time-varying variance is itself tied to important temporal economic indicators such as market volatility. Second, financial time-series data often require modeling with the generalized Gaussian distribution, whose additional shape parameter is important for approximating heavy-tailed distributions. Addressing both aspects, we show how to extend the DyBM, which results in significant performance improvements in predicting financial time-series data.
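The abstract hinges on the generalized Gaussian's extra shape parameter. As background (not from the paper; the function name and toy values below are our own), a minimal sketch of its density shows how shrinking the shape parameter beta fattens the tails, with beta = 2 recovering a Gaussian (up to scale) and beta = 1 the Laplace distribution:

```python
import math

def gennorm_pdf(x, beta, mu=0.0, alpha=1.0):
    """Density of the generalized Gaussian with shape beta, location mu,
    and scale alpha: beta/(2*alpha*Gamma(1/beta)) * exp(-(|x-mu|/alpha)**beta).
    beta = 2 is Gaussian (up to scale); beta < 2 gives heavier tails."""
    z = abs(x - mu) / alpha
    return beta / (2.0 * alpha * math.gamma(1.0 / beta)) * math.exp(-z ** beta)

# Density well into the tail (x = 4): smaller beta -> much heavier tail.
for beta in (2.0, 1.0, 0.5):
    print(f"beta={beta}: p(4) = {gennorm_pdf(4.0, beta):.3e}")
```

Fitting beta to financial returns (rather than fixing beta = 2) is what lets a model of this family approximate the heavy tails the abstract mentions.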


Bayesian Neural Networks: Bayes' Theorem Applied to Deep Learning

#artificialintelligence

The article was written by Amber Zhou, a Financial Analyst at I Know First. Deep learning has become a buzzword in recent years. In fact, it once gained much attention and excitement under the name of neural networks back in the early 1980s. However, due to a lack of sufficient computing power and training examples, interest gradually declined over the following decade. As we enter the era of big data, propelled by the explosion of computing power, deep learning has recently seen a revival.


Synechron launches AI data science accelerators for FS firms

#artificialintelligence

These four new solution accelerators help financial services and insurance firms solve complex business challenges by discovering meaningful relationships between events that influence one another (correlation) and that cause future events to happen (causation). Following the success of Synechron's AI Automation Program, Neo, Synechron's AI Data Science experts have developed a powerful set of accelerators that allow financial firms to address business challenges related to investment research generation, predicting the next best action to take with a wealth management client, triaging high-priority customer complaints, and better predicting credit risk in mortgage lending. The Accelerators combine Natural Language Processing (NLP), deep learning algorithms and data science to solve these complex business challenges, and rely on a powerful Spark and Hadoop platform to ingest massive amounts of data, run correlations across them, test hypotheses and predict future outcomes. The Data Science Accelerators are the fifth Accelerator program Synechron has launched in the last two years through its Financial Innovation Labs (FinLabs), which operate in 11 key global financial markets across North America, Europe, the Middle East and APAC: New York, Charlotte, Fort Lauderdale, London, Paris, Amsterdam, Serbia, Dubai, Pune, Bangalore and Hyderabad. With this, Synechron's Global Accelerator programs now include over 50 Accelerators for Blockchain, AI Automation, InsurTech, RegTech and AI Data Science, and a dedicated team of over 300 employees globally.


Why is machine learning in finance so hard?

#artificialintelligence

Financial markets have been among the earliest adopters of machine learning (ML). People have been using ML to spot patterns in the markets since the 1980s. Even though ML has had notable successes in predicting market outcomes in the past, the recent advances in deep learning haven't helped financial market prediction much. While deep learning and other ML techniques have finally made it possible for Alexa, Google Assistant and Google Photos to work, there hasn't been much progress when it comes to stock markets.


Learning from multivariate discrete sequential data using a restricted Boltzmann machine model

arXiv.org Machine Learning

A restricted Boltzmann machine (RBM) is a generative neural-network model with many novel applications, such as collaborative filtering and acoustic modeling. An RBM lacks the capacity to retain memory, making it unsuitable for dynamic data modeling, as in time-series analysis. In this paper we address this issue by proposing the p-RBM model, a generalization of the regular RBM capable of retaining memory of p past states. We further show how to train the p-RBM using contrastive divergence and test our model on the problem of predicting stock market direction using the 100 stocks of the NASDAQ-100 index. The results obtained show that the p-RBM offers promising predictive potential.
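The p-RBM builds on the standard RBM trained with contrastive divergence. As background, a minimal sketch of one CD-1 update for an ordinary binary RBM, assuming NumPy (the layer sizes, learning rate, and toy "up/down" data are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b, c, v0, lr=0.1):
    """One contrastive-divergence (CD-1) step for a binary RBM.

    W: (n_visible, n_hidden) weights; b, c: visible/hidden biases;
    v0: (batch, n_visible) binary data.  Returns updated parameters and
    the mean squared reconstruction error as a rough training monitor.
    """
    # Positive phase: hidden probabilities given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step down to the visibles and back up.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Gradient estimate: data correlations minus one-step model correlations.
    n = v0.shape[0]
    W = W + lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b = b + lr * (v0 - v1).mean(axis=0)
    c = c + lr * (ph0 - ph1).mean(axis=0)
    return W, b, c, float(np.mean((v0 - pv1) ** 2))

# Toy binary "up/down" vectors standing in for stock-direction data.
n_visible, n_hidden = 6, 4
W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
b, c = np.zeros(n_visible), np.zeros(n_hidden)
data = (rng.random((32, n_visible)) < 0.5).astype(float)
for _ in range(20):
    W, b, c, err = cd1_update(W, b, c, data)
```

The p-RBM's key change, per the abstract, is to condition on p past states rather than a single visible layer; the update above covers only the memoryless base model.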