Financial Engineering and Artificial Intelligence in Python

#artificialintelligence

Created by Lazy Programmer Inc. (Lazy Programmer Team); 14.5 hours of on-demand video. Have you ever thought about what would happen if you combined the power of machine learning and artificial intelligence with financial engineering? Today, you can stop imagining and start doing. This course will teach you the core fundamentals of financial engineering, with a machine learning twist.


Hidden Markov Models for Regime Detection using R - QuantStart

#artificialintelligence

In the previous article in the series, Hidden Markov Models were introduced and discussed in the context of the broader class of Markov Models. They were motivated by the need for quantitative traders to detect market regimes in order to adjust how their quant strategies are managed. In particular, it was mentioned that "various regimes lead to adjustments of asset returns via shifts in their means, variances/volatilities, serial correlation and covariances, which impact the effectiveness of time series methods that rely on stationarity". This has a significant bearing on how trading strategies are modified throughout the strategy lifecycle.
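
The article itself works in R, but the idea translates directly: fit a two-state Gaussian HMM to a return series and read the inferred state sequence as a regime label. The sketch below is my own illustration in Python using hmmlearn on synthetic returns, not the article's code.

```python
# Illustrative sketch (not the article's R code): fitting a two-state
# Gaussian HMM to daily returns with hmmlearn to label market regimes.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(42)

# Synthetic daily returns: a calm regime followed by a volatile one.
calm = rng.normal(loc=0.0005, scale=0.005, size=500)
volatile = rng.normal(loc=-0.0010, scale=0.025, size=250)
returns = np.concatenate([calm, volatile]).reshape(-1, 1)

# Two hidden states, full covariance; n_iter caps the EM iterations.
model = GaussianHMM(n_components=2, covariance_type="full", n_iter=200)
model.fit(returns)

states = model.predict(returns)  # most likely regime for each day
for s in range(model.n_components):
    mask = states == s
    print(f"state {s}: mean={returns[mask].mean():.5f}, "
          f"vol={returns[mask].std():.5f}, days={mask.sum()}")
```

Inspecting each inferred state's mean and volatility shows which state plays the "calm" and which the "volatile" regime before the labels are wired into strategy management.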


Smelling Source Code Using Deep Learning

#artificialintelligence

Poor-quality code contributes to increasing technical debt and makes the software difficult to extend and maintain. Code smells capture such poor code-quality practices. Traditionally, the software engineering community identifies code smells deterministically, using metrics and pre-defined rules/heuristics. Creating a deterministic tool for a specific language is an expensive and arduous task, since it requires source code analysis starting from parsing, through symbol resolution and intermediate model preparation, to applying rules/heuristics/metrics on the model. It would be great if we could leverage the tools available for one programming language and cross-apply them to another language.
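
As a rough illustration of the learning-based alternative the article argues for, the sketch below treats smell detection as plain text classification over source tokens, so no language-specific parser or symbol resolver is needed. It is a toy example with invented snippets and a small scikit-learn network, not the paper's deep-learning models.

```python
# Minimal sketch (toy data, not the article's models): treating code-smell
# detection as text classification over raw source text instead of
# hand-written metrics and rules.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Tiny hand-labeled snippets: 1 = smelly (magic numbers, long parameter
# lists), 0 = clean. A real study would use thousands of labeled methods.
snippets = [
    "def f(a,b,c,d,e,f,g,h): return a+b+c+d+e+f+g+h",
    "total = price * 1.0825 + 4.99 if x == 7 else price * 0.93",
    "def area(radius): return math.pi * radius ** 2",
    "names = [user.name for user in users]",
]
labels = [1, 1, 0, 0]

# Character n-grams sidestep language-specific parsing entirely.
clf = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(2, 4)),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
)
clf.fit(snippets, labels)
print(clf.predict(["def g(p1,p2,p3,p4,p5,p6): return p1*p2*p3*p4*p5*p6"]))
```

Because character n-grams require no parsing at all, the same pipeline could in principle be retrained on a different programming language, which is the cross-application the article is after.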


Great machine learning starts with resourceful feature engineering

#artificialintelligence

I recently read an article in which the winner of a Kaggle competition was not shy about sharing his technique for winning not one but several of the analytical competitions. "I always use Gradient Boosting," he said, and then added, "but the key is Feature Engineering." A couple of days later, a friend who had read the same article called and asked, "What is this Feature Engineering that he's talking about?" It was a timely question, as I was in the process of developing a risk model for a client, and specifically, I was working through the stage of Feature Engineering.
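
To make the distinction concrete, here is a minimal sketch of what that stage can look like: derive a few informative features (a ratio, some date parts) from raw columns before handing them to a gradient-boosting model. Column names and data are invented for illustration; this is not the author's risk model.

```python
# Illustrative sketch only (invented column names, not the author's risk
# model): hand-crafting a few features before fitting gradient boosting.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

raw = pd.DataFrame({
    "loan_amount":   [5000, 12000, 700, 20000, 3500, 15000],
    "annual_income": [40000, 60000, 18000, 52000, 30000, 95000],
    "opened":        pd.to_datetime(
        ["2020-01-15", "2020-06-01", "2021-03-20",
         "2021-07-04", "2022-02-11", "2022-09-30"]),
    "defaulted":     [0, 1, 1, 1, 0, 0],
})

# Feature engineering: ratios and date parts the raw columns only imply.
feats = pd.DataFrame({
    "debt_to_income": raw["loan_amount"] / raw["annual_income"],
    "open_month":     raw["opened"].dt.month,
    "open_year":      raw["opened"].dt.year,
})

model = GradientBoostingClassifier(random_state=0)
model.fit(feats, raw["defaulted"])
print(dict(zip(feats.columns, model.feature_importances_.round(3))))
```

The printed feature importances give a quick read on which of the engineered features the boosted model actually leans on.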

