The Coming Revolution in Recurrent Neural Nets (RNNs)


Summary: Recurrent Neural Nets (RNNs) are at the core of the most common AI applications in use today, but we are rapidly recognizing broad classes of time series problems where they don't fit well. Several alternatives are already in use, and one that has just been introduced, ODE net, is a radical departure from our usual way of thinking about the solution.

Recurrent Neural Nets (RNNs) and their cousins, LSTMs, are at the very core of the most common application of AI: natural language processing (NLP). There are far more real-world applications of RNN-based NLP than of any other form of AI, including image recognition and processing with Convolutional Neural Nets (CNNs). In a sense, the army of data scientists has split into two groups, each pursuing the separate applications that might be developed from these two techniques. In practice there is essentially no overlap, since image processing deals with data that is static (even if only for a second), while RNN-based NLP has always interpreted speech and text as time series data.
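To make the time-series framing concrete, here is a minimal sketch of the recurrence at the heart of an RNN: the same cell is applied at every timestep, carrying a hidden state forward so each step of the sequence (a word, a sample of speech) is interpreted in the context of what came before. The function names, dimensions, and weights below are purely illustrative, not from any particular library.

```python
import math

def rnn_step(x, h, Wxh, Whh, bh):
    # One recurrent step: the new hidden state mixes the current input x
    # with the previous hidden state h, squashed through tanh.
    return [math.tanh(sum(Wxh[i][j] * x[j] for j in range(len(x)))
                      + sum(Whh[i][k] * h[k] for k in range(len(h)))
                      + bh[i])
            for i in range(len(h))]

def rnn_forward(xs, h0, Wxh, Whh, bh):
    # Process the sequence one timestep at a time, threading the hidden
    # state through -- this recurrence is what makes RNNs a natural fit
    # for time series such as text or speech.
    h, hs = h0, []
    for x in xs:
        h = rnn_step(x, h, Wxh, Whh, bh)
        hs.append(h)
    return hs

# Toy example: a 2-unit RNN reading a length-3 sequence of scalar inputs.
Wxh = [[0.5], [-0.3]]                  # input -> hidden weights
Whh = [[0.1, 0.0], [0.0, 0.1]]         # hidden -> hidden weights
bh = [0.0, 0.0]                        # hidden bias
states = rnn_forward([[1.0], [0.5], [-1.0]], [0.0, 0.0], Wxh, Whh, bh)
```

The key property is that `states[t]` depends on the entire prefix of the sequence up to time `t`, which is exactly what static, image-style models like CNNs do not provide out of the box.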