# Markov Models

Sign languages aren't easy to learn and are even harder to teach. They use not just hand gestures but also mouthings, facial expressions and body posture to communicate meaning. This complexity means professional teaching programmes are still rare and often expensive. But this could all change soon, with a little help from artificial intelligence (AI). My colleagues and I are working on software for teaching yourself sign languages in an automated, intuitive way.


### Marketing Analytics through Markov Chain – Data Science Central

Imagine you are a company selling a fast-moving consumer good. Let's assume the customer follows a given journey to make the final purchase; the stages of that journey are the states a customer can be in at any point. Now, how do we find out which state the customers will be in after 6 months? A Markov chain comes to the rescue! Let's first understand what a Markov chain is, then delve a little deeper.
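The article's question can be sketched numerically: encode the journey stages as states, put the monthly transition probabilities in a matrix, and take its 6th power. The state names and probabilities below are illustrative assumptions, not figures from the article.

```python
import numpy as np

# Hypothetical journey states and monthly transition probabilities
# (illustrative numbers only; row i gives P(next state | current state i)).
states = ["Aware", "Considering", "Purchased", "Churned"]
P = np.array([
    [0.60, 0.30, 0.05, 0.05],  # from Aware
    [0.10, 0.50, 0.30, 0.10],  # from Considering
    [0.00, 0.10, 0.80, 0.10],  # from Purchased
    [0.00, 0.00, 0.00, 1.00],  # Churned is absorbing
])

# Everyone starts "Aware"; the distribution after 6 months is the
# initial distribution multiplied by the 6th power of P.
start = np.array([1.0, 0.0, 0.0, 0.0])
after_6_months = start @ np.linalg.matrix_power(P, 6)
print(dict(zip(states, after_6_months.round(3))))
```

The same matrix power answers the question for any horizon: swap 6 for 12 to look a year ahead.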

### Deep Learning meets Physics: Restricted Boltzmann Machines Part I

In my opinion, RBMs have one of the easiest architectures of all neural networks. As can be seen in Fig. 1, the absence of an output layer is apparent. But, as we will see later, an output layer won't be needed, since predictions are made differently than in regular feedforward neural networks. Energy is a term that may not be associated with deep learning in the first place.
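To make the energy idea concrete, here is a minimal sketch of the standard RBM energy function, E(v, h) = -a·v - b·h - vᵀWh, together with the hidden-unit conditional that replaces a feedforward output layer. The weights and layer sizes are toy values, not from the article.

```python
import numpy as np

# Tiny illustrative RBM: 4 binary visible units, 3 binary hidden units.
rng = np.random.default_rng(0)
n_visible, n_hidden = 4, 3
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # weights
a = np.zeros(n_visible)  # visible biases
b = np.zeros(n_hidden)   # hidden biases

def energy(v, h):
    """Joint energy E(v, h) = -a.v - b.h - v^T W h."""
    return -(a @ v) - (b @ h) - v @ W @ h

def p_hidden_given_visible(v):
    """P(h_j = 1 | v) = sigmoid(b_j + v . W[:, j]) -- this conditional,
    not an output layer, is how an RBM makes its 'predictions'."""
    return 1.0 / (1.0 + np.exp(-(b + v @ W)))

v = np.array([1, 0, 1, 0])
h = np.array([0, 1, 1])
print(energy(v, h), p_hidden_given_visible(v))
```

Low-energy configurations are the ones the model assigns high probability to, which is where the physics analogy in the title comes from.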

### Generating Haiku with Deep Learning – Towards Data Science

I've done previous work on haiku generation. This generator uses Markov chains trained on a corpus of non-haiku poetry, generates haiku one word at a time, and enforces the 5-7-5 structure by backspacing when every possible next word would violate it. This isn't unlike what I do when I'm writing a haiku: I try things, count out the syllables, find they don't work, and go back. It feels more like brute force than something that actually understands what it means to write a haiku.
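The generate-then-backspace loop described above can be sketched as follows. The tiny corpus and hand-counted syllable dictionary here are stand-ins: the article's generator trains on a real poetry corpus and would use a pronunciation dictionary for syllable counts.

```python
import random

# Hypothetical toy corpus with hand-counted syllables.
syllables = {"autumn": 2, "moon": 1, "light": 1, "falls": 1,
             "softly": 2, "on": 1, "the": 1, "still": 1, "water": 2}
corpus = "autumn moon light falls softly on the still water".split()

# First-order Markov model: each word maps to words observed after it.
follows = {}
for w, nxt in zip(corpus, corpus[1:]):
    follows.setdefault(w, []).append(nxt)

def line(target, rng):
    """Build one line of exactly `target` syllables, one word at a
    time, backspacing (or restarting) when no next word can fit."""
    while True:
        stack = [rng.choice(corpus)]
        for _ in range(100):  # bound one attempt, then retry fresh
            count = sum(syllables[w] for w in stack)
            if count == target:
                return " ".join(stack)
            options = [w for w in follows.get(stack[-1], [])
                       if syllables[w] <= target - count]
            if options:
                stack.append(rng.choice(options))
            elif len(stack) > 1:
                stack.pop()  # the "backspace" step
            else:
                break        # dead-end start; retry from scratch

rng = random.Random(0)
haiku = [line(5, rng), line(7, rng), line(5, rng)]
print(" / ".join(haiku))
```

The brute-force feel the author describes is visible in the loop: the model has no notion of meaning, only of which word can follow which while the syllable budget still allows it.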