Deep Recursive Neural Networks for Compositionality in Language

Neural Information Processing Systems

Recursive neural networks comprise a class of architectures that can operate on structured input. They have previously been applied with success to modeling compositionality in natural language using parse-tree-based structural representations. Even though these architectures are deep in structure, they lack the capacity for hierarchical representation that exists in conventional deep feed-forward networks as well as in recently investigated deep recurrent neural networks. In this work we introduce a new architecture, a deep recursive neural network (deep RNN), constructed by stacking multiple recursive layers. We evaluate the proposed model on the task of fine-grained sentiment classification.
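
To make the stacking idea concrete, below is a minimal numpy sketch: each recursive layer composes its children's vectors at the same depth, and every layer above the first also receives the same node's vector from the layer below. Dimensions, initialization, and the toy right-branching tree are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
D, LAYERS = 8, 3  # hidden size, number of stacked recursive layers

# Per-layer parameters: left-child, right-child, and layer-below connections.
W_L = [rng.normal(scale=0.1, size=(D, D)) for _ in range(LAYERS)]
W_R = [rng.normal(scale=0.1, size=(D, D)) for _ in range(LAYERS)]
V = [rng.normal(scale=0.1, size=(D, D)) for _ in range(LAYERS)]
b = [np.zeros(D) for _ in range(LAYERS)]

def node_reps(tree):
    """Return one vector per layer for this node (layer 0 first).

    `tree` is a leaf word vector (np.ndarray) or a (left, right) tuple.
    """
    kids = None if isinstance(tree, np.ndarray) else (node_reps(tree[0]), node_reps(tree[1]))
    reps = []
    for i in range(LAYERS):
        pre = b[i].copy()
        if kids is not None:
            pre += W_L[i] @ kids[0][i] + W_R[i] @ kids[1][i]  # compose children at layer i
        elif i == 0:
            pre += tree  # leaf: inject the word vector at the bottom layer
        if i > 0:
            pre += V[i] @ reps[i - 1]  # input from this node's layer below
        reps.append(np.tanh(pre))
    return reps

# Toy three-word sentence with a right-branching parse: (w0 (w1 w2)).
w = [rng.normal(size=D) for _ in range(3)]
root = node_reps((w[0], (w[1], w[2])))
print(root[-1].shape)  # deepest layer's root vector
```

For fine-grained sentiment classification, the deepest layer's vector at each node would typically feed a softmax over the sentiment classes.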


Feature Weight Tuning for Recursive Neural Networks

arXiv.org Artificial Intelligence

This paper addresses how a recursive neural network model can automatically leave out useless information and emphasize important evidence; in other words, how to perform "weight tuning" when acquiring higher-level representations. We propose two models, the Weighted Neural Network (WNN) and the Binary-Expectation Neural Network (BENN), which automatically control how much each unit contributes to the higher-level representation. The proposed models can be viewed as incorporating a more powerful composition function for embedding acquisition in recursive neural networks. Experimental results demonstrate significant improvements over standard neural models.
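
As a rough illustration of the gating idea, the sketch below scales each child unit by a learned weight in (0, 1) before the usual recursive composition, so the model can suppress unhelpful units and amplify important ones. The gate's parameterization here is an assumption for illustration, not necessarily the paper's exact WNN or BENN formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 8  # hidden size (illustrative)

W = rng.normal(scale=0.1, size=(D, 2 * D))       # standard composition matrix
Wg = rng.normal(scale=0.1, size=(2 * D, 2 * D))  # gate parameters (assumed form)
b, bg = np.zeros(D), np.zeros(2 * D)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def compose(left, right):
    """Combine two child vectors into a parent vector with per-unit weights."""
    c = np.concatenate([left, right])
    g = sigmoid(Wg @ c + bg)         # per-unit weights in (0, 1)
    return np.tanh(W @ (g * c) + b)  # weighted (gated) composition

parent = compose(rng.normal(size=D), rng.normal(size=D))
print(parent.shape)  # (8,)
```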


Bidirectional Recursive Neural Networks for Token-Level Labeling with Structure

arXiv.org Machine Learning

Recently, deep architectures such as recurrent and recursive neural networks have been successfully applied to various natural language processing tasks. Inspired by bidirectional recurrent neural networks, which use representations that summarize the past and future around an instance, we propose a novel architecture that aims to capture the structural information around an input and use it to label instances. We apply our method to the task of opinion expression extraction, where we employ the binary parse tree of a sentence as the structure and word vector representations as the initial representation of each token. We conduct preliminary experiments to investigate its performance and compare it to the sequential approach.
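
A minimal sketch of the bidirectional idea on a binary parse tree: an upward pass summarizes each node's subtree, a downward pass propagates context from the rest of the sentence, and each token's label is predicted from its leaf's combined upward and downward states. All shapes, parameters, and the toy tree are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
D, K = 8, 3  # hidden size, number of token labels

W_up = rng.normal(scale=0.1, size=(D, 2 * D))    # [left_up; right_up] -> parent_up
W_down = rng.normal(scale=0.1, size=(D, 2 * D))  # [parent_down; node_up] -> node_down
W_out = rng.normal(scale=0.1, size=(K, 2 * D))   # [leaf_up; leaf_down] -> label scores

class Node:
    def __init__(self, word_vec=None, left=None, right=None):
        self.word_vec, self.left, self.right = word_vec, left, right
        self.up = self.down = None

def upward(node):
    """Bottom-up pass: each node summarizes its own subtree."""
    if node.word_vec is not None:  # leaf: start from the word vector
        node.up = np.tanh(node.word_vec)
    else:
        node.up = np.tanh(W_up @ np.concatenate([upward(node.left), upward(node.right)]))
    return node.up

def downward(node, parent_down):
    """Top-down pass: each node receives context from outside its subtree."""
    node.down = np.tanh(W_down @ np.concatenate([parent_down, node.up]))
    if node.word_vec is None:
        downward(node.left, node.down)
        downward(node.right, node.down)

def label_leaves(node, out):
    """Predict a label for each token from its combined up/down state."""
    if node.word_vec is not None:
        out.append(int((W_out @ np.concatenate([node.up, node.down])).argmax()))
    else:
        label_leaves(node.left, out)
        label_leaves(node.right, out)
    return out

# Toy sentence with parse ((w0 w1) w2).
leaves = [Node(word_vec=rng.normal(size=D)) for _ in range(3)]
root = Node(left=Node(left=leaves[0], right=leaves[1]), right=leaves[2])
upward(root)
downward(root, np.zeros(D))   # root gets a zero "outside" context
print(label_leaves(root, []))  # one label index per token
```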


Natural Language Processing with Deep Learning in Python

#artificialintelligence

What you'll learn:

- Understand and implement word2vec
- Understand the CBOW method in word2vec
- Understand the skip-gram method in word2vec
- Understand the negative sampling optimization in word2vec (see the sketch after this list)
- Understand and implement GloVe using gradient descent and alternating least squares
- Use recurrent neural networks for part-of-speech tagging
- Use recurrent neural networks for named entity recognition
- Understand and implement recursive neural networks for sentiment analysis
- Understand and implement recursive neural tensor networks for sentiment analysis
- Install Numpy, Matplotlib, Sci-Kit Learn, Theano, and TensorFlow (should be extremely easy by now)
- Understand backpropagation and gradient descent, and be able to derive and code the equations on your own
- Code a recurrent neural network from basic primitives in Theano (or TensorFlow), especially the scan function
- Code a feedforward neural network in Theano (or TensorFlow)
- Helpful to have experience with tree algorithms

In this course we are going to look at advanced NLP. Previously, you learned about some of the basics, like how many NLP problems are just regular machine learning and data science problems in disguise, and simple, practical methods like bag-of-words and term-document matrices. These allowed us to do some pretty cool things, like detect spam emails, write poetry, spin articles, and group together similar words. In this course I'm going to show you how to do even more awesome things. We'll cover not just one but four new architectures.
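
Since the syllabus highlights negative sampling, here is a minimal numpy sketch of one skip-gram update with negative sampling: pull a (center, context) pair together and push the center away from k randomly drawn negative words. Vocabulary size, dimensions, learning rate, and the uniform sampling scheme are toy assumptions (word2vec proper samples negatives from a smoothed unigram distribution).

```python
import numpy as np

rng = np.random.default_rng(3)
V, D, K, lr = 100, 16, 5, 0.05  # vocab size, embedding dim, negatives, step size

W_in = rng.normal(scale=0.1, size=(V, D))   # center-word embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context):
    """One SGD update for a single (center, context) pair."""
    negatives = rng.integers(0, V, size=K)  # toy uniform negative sampling
    v = W_in[center]
    # Positive pair: maximize log sigmoid(u_pos . v).
    u_pos = W_out[context]
    g = sigmoid(u_pos @ v) - 1.0            # d(loss)/d(score) for the positive pair
    grad_v = g * u_pos
    W_out[context] -= lr * g * v
    # Negative pairs: maximize log sigmoid(-u_neg . v).
    for n in negatives:
        u_neg = W_out[n]
        g = sigmoid(u_neg @ v)
        grad_v += g * u_neg
        W_out[n] -= lr * g * v
    W_in[center] -= lr * grad_v

sgns_step(center=7, context=42)
print(W_in[7][:4])  # the center embedding has moved toward its context word
```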

