GitHub - oxford-cs-deepnlp-2017/lectures: Oxford Deep NLP 2017 course

#artificialintelligence

This repository contains the lecture slides and course description for the Deep Natural Language Processing course offered in Hilary Term 2017 at the University of Oxford. This is an advanced course on natural language processing. Automatically processing natural language inputs and producing language outputs is a key component of Artificial General Intelligence. The ambiguities and noise inherent in human communication render traditional symbolic AI techniques ineffective for representing and analysing language data. This is an applied course focussing on recent advances in analysing and generating speech and text using recurrent neural networks.
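
To make the recurrent-network focus mentioned above concrete, here is a minimal sketch (not taken from the course materials) of a vanilla recurrent cell stepping over a token sequence in NumPy. The toy vocabulary, dimensions, and parameter names are illustrative assumptions chosen for brevity.

```python
import numpy as np

# Minimal sketch of a vanilla recurrent cell (illustrative; not course code).
rng = np.random.default_rng(0)
vocab = {"the": 0, "cat": 1, "sat": 2}           # toy vocabulary (assumption)
embed_dim, hidden_dim = 4, 8

E = rng.normal(size=(len(vocab), embed_dim))     # word embeddings
W_xh = rng.normal(size=(embed_dim, hidden_dim))  # input-to-hidden weights
W_hh = rng.normal(size=(hidden_dim, hidden_dim)) # hidden-to-hidden weights
b_h = np.zeros(hidden_dim)

def rnn_encode(tokens):
    """Run the cell over a token sequence and return the final hidden state."""
    h = np.zeros(hidden_dim)
    for tok in tokens:
        x = E[vocab[tok]]
        # h_t = tanh(x_t W_xh + h_{t-1} W_hh + b)
        h = np.tanh(x @ W_xh + h @ W_hh + b_h)
    return h

print(rnn_encode(["the", "cat", "sat"]).shape)   # (8,)
```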


Lecture 1: Natural Language Processing with Deep Learning

@machinelearnbot

Lecture 1 introduces the concept of Natural Language Processing (NLP) and the problems NLP faces today. The concept of representing words as numeric vectors is then introduced, and popular approaches to designing word vectors are discussed. This lecture series provides a thorough introduction to the cutting-edge research in deep learning applied to NLP, an approach that has recently obtained very high performance across many different NLP tasks including question answering and machine translation. It emphasizes how to implement, train, debug, visualize, and design neural network models, covering the main technologies of word vectors, feed-forward models, recurrent neural networks, recursive neural networks, convolutional neural networks, and recent models involving a memory component. For additional learning opportunities please visit: http://stanfordonline.stanford.edu/
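
To make the "words as numeric vectors" idea concrete, below is a minimal sketch (not taken from the lecture) that compares toy word vectors with cosine similarity. The vectors and vocabulary are made up for illustration; real vectors come from models such as word2vec or GloVe trained on large corpora.

```python
import numpy as np

# Toy word vectors (made-up values for illustration only).
vectors = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine(u, v):
    """Cosine similarity: close to 1.0 for similar directions, near 0.0 for unrelated ones."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vectors["king"], vectors["queen"]))  # high: related words
print(cosine(vectors["king"], vectors["apple"]))  # lower: unrelated words
```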


Lecture 17: Issues in NLP and Possible Architectures for NLP

@machinelearnbot

Lecture 17 looks at the broader question of solving language, at efficient tree-recursive models (SPINN on the SNLI corpus), and at the research highlight "Learning to compose for QA." As an interlude, it also covers pointer/copying models, followed by sub-word and character-based models. This lecture series provides a thorough introduction to the cutting-edge research in deep learning applied to NLP, an approach that has recently obtained very high performance across many different NLP tasks including question answering and machine translation. It emphasizes how to implement, train, debug, visualize, and design neural network models, covering the main technologies of word vectors, feed-forward models, recurrent neural networks, recursive neural networks, convolutional neural networks, and recent models involving a memory component. For additional learning opportunities please visit: http://stanfordonline.stanford.edu/
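
The pointer/copying models mentioned above combine a standard softmax over the vocabulary with a distribution over source positions. The sketch below, in the spirit of pointer-generator models rather than the lecture's exact formulation, shows only the mixing step; the 5-word vocabulary, 3-token source, attention weights, and fixed gate value are all assumptions for illustration.

```python
import numpy as np

# Sketch of the mixing step in a pointer/copy model (illustrative assumptions).
vocab = ["<unk>", "the", "cat", "sat", "mat"]
source_tokens = ["the", "cat", "mat"]

p_vocab = np.array([0.1, 0.4, 0.2, 0.2, 0.1])    # generation distribution (softmax output)
attn = np.array([0.2, 0.5, 0.3])                 # attention over source positions
p_gen = 0.7                                      # probability of generating vs. copying

# Scatter the copy probabilities onto vocabulary ids, then mix the two distributions.
p_copy = np.zeros(len(vocab))
for tok, a in zip(source_tokens, attn):
    p_copy[vocab.index(tok)] += a

p_final = p_gen * p_vocab + (1 - p_gen) * p_copy
print(p_final, p_final.sum())                    # still a valid distribution (sums to 1)
```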


Deep Learning for NLP at Oxford with Deep Mind 2017 - YouTube

@machinelearnbot

This playlist contains the lecture videos for the Deep Natural Language Processing course offered in Hilary Term 2017 at the University of Oxford. This is an advanced course on natural language processing. Automatically processing natural language inputs and producing language outputs is a key component of Artificial General Intelligence. The ambiguities and noise inherent in human communication render traditional symbolic AI techniques ineffective for representing and analysing language data. Recently, statistical techniques based on neural networks have achieved a number of remarkable successes in natural language processing, leading to a great deal of commercial and academic interest in the field. This is an applied course focusing on recent advances in analysing and generating speech and text using recurrent neural networks.