Going deeper with recurrent networks: Sequence to Bag of Words Model


Now, with deep learning, we can convert unstructured text into computable representations, effectively incorporating semantic knowledge when training machine learning models. A recurrent neural network (RNN) is a network whose neural layers contain a temporal feedback loop. We use deep learning to compute semantic embeddings for keywords and titles. Running on an NVIDIA GPU gave us the computational power to blaze through 10 million job descriptions in 15 minutes (with a 32-unit RNN and 24-dimensional pre-trained interest word vectors).
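To make the idea concrete, here is a minimal sketch of an RNN encoder that maps a sequence of word vectors to a fixed-size semantic embedding. The dimensions mirror the setup described above (24-dimensional word vectors, a 32-unit recurrent layer), but the weights are random placeholders rather than trained parameters, and the function names are illustrative only, not the production code.

```python
import numpy as np

EMBED_DIM = 24   # width of the pre-trained word vectors (as in the text)
HIDDEN_DIM = 32  # width of the recurrent layer (as in the text)

rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(EMBED_DIM, HIDDEN_DIM))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(HIDDEN_DIM, HIDDEN_DIM))  # hidden-to-hidden weights
b_h = np.zeros(HIDDEN_DIM)

def encode(sequence):
    """Run a simple Elman-style RNN over word vectors; return the final hidden state."""
    h = np.zeros(HIDDEN_DIM)
    for x in sequence:
        # The temporal feedback loop: the previous hidden state feeds back in
        h = np.tanh(x @ W_xh + h @ W_hh + b_h)
    return h

# Example: a toy "job description" of 5 word vectors
words = rng.normal(size=(5, EMBED_DIM))
embedding = encode(words)
print(embedding.shape)  # (32,)
```

The final hidden state acts as a fixed-size summary of the whole sequence, which is what lets variable-length text be compared and fed into downstream models.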