What makes Deep Learning deep....and world-changing?

#artificialintelligence

Why is deep learning called deep? It comes down to the structure of the ANNs involved. Four decades ago, neural networks were only two layers deep because it was not computationally feasible to build larger ones. Today, networks with 10 layers are common, and ANNs with 100 layers are being explored. Using these multiple layers of neural networks, computers now have the capacity to see, learn, and react to complex situations as well as or better than humans. Data scientists normally spend a great deal of time on data preparation, extracting features or selecting the variables that are actually useful for predictive analytics; deep learning does much of this work automatically and makes life easier. To spur this development, many technology companies have open-sourced their deep learning libraries, such as Google's TensorFlow and Facebook's open source modules for Torch. Amazon released DSSTNE on GitHub, while Microsoft also released CNTK, its open source deep learning toolkit, on GitHub.
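
To make "depth" concrete, here is a minimal sketch (not from the article) of a 10-layer feedforward network built with PyTorch's nn.Sequential; the layer widths and input size are arbitrary placeholders chosen purely for illustration.

```python
import torch
import torch.nn as nn

def make_deep_net(input_dim=784, hidden_dim=256, num_hidden_layers=10, output_dim=10):
    """Stack many linear+ReLU layers; the depth is what makes the net 'deep'.

    All dimensions here are illustrative, not taken from the article.
    """
    layers = [nn.Linear(input_dim, hidden_dim), nn.ReLU()]
    for _ in range(num_hidden_layers - 1):
        layers += [nn.Linear(hidden_dim, hidden_dim), nn.ReLU()]
    layers.append(nn.Linear(hidden_dim, output_dim))
    return nn.Sequential(*layers)

net = make_deep_net()
x = torch.randn(32, 784)   # a batch of 32 flattened 28x28 inputs
logits = net(x)            # each layer transforms the previous layer's features
print(logits.shape)        # torch.Size([32, 10])
```

Each hidden layer re-represents the output of the layer before it, which is how the feature extraction the paragraph describes ends up happening inside the network rather than by hand.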


Lifted Relational Neural Networks: Efficient Learning of Latent Relational Structures

Journal of Artificial Intelligence Research

We propose a method to combine the interpretability and expressive power of first-order logic with the effectiveness of neural network learning. In particular, we introduce a lifted framework in which first-order rules are used to describe the structure of a given problem setting. These rules are then used as a template for constructing a number of neural networks, one for each training and testing example. As the different networks corresponding to different examples share their weights, these weights can be efficiently learned using stochastic gradient descent. Our framework provides a flexible way of implementing and combining a wide variety of modelling constructs.
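
The following is not the authors' implementation, only a hedged sketch of the weight-sharing idea the abstract describes: each example gets its own small network, but every such network draws its parameters from one shared pool, so stochastic gradient descent on any example updates weights used by all of them. The rule names and dimensions are invented for illustration.

```python
import torch
import torch.nn as nn

# Shared pool of rule weights; every per-example network reads from this dict.
# The rule names are hypothetical placeholders, not from the paper.
shared = nn.ParameterDict({
    "rule_friend": nn.Parameter(torch.randn(1)),
    "rule_smokes": nn.Parameter(torch.randn(1)),
})

def network_for_example(active_rules, features):
    # Build this example's "network" from only the rules that apply to it;
    # the weights themselves are the shared parameters above.
    score = torch.zeros(1)
    for rule, feat in zip(active_rules, features):
        score = score + shared[rule] * feat
    return torch.sigmoid(score)

optimizer = torch.optim.SGD(shared.parameters(), lr=0.1)

# Two toy examples, each grounded with a different subset of rules.
examples = [
    (["rule_friend"], [torch.tensor(1.0)], torch.tensor([1.0])),
    (["rule_friend", "rule_smokes"], [torch.tensor(0.5), torch.tensor(1.0)], torch.tensor([0.0])),
]

for active_rules, feats, target in examples:
    pred = network_for_example(active_rules, feats)
    loss = nn.functional.binary_cross_entropy(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()   # updates the shared weights, not per-example copies
```

Because the parameters live in one shared dictionary, gradients from every example-specific network accumulate into the same weights, which is the efficiency argument the abstract makes for training with SGD.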


Capsule Networks: An Improvement to Convolutional Networks

#artificialintelligence

Siraj Raval covers recent research on neural networks, including capsule networks, a potential replacement for Convolutional Neural Networks.