Goto

Collaborating Authors


A History of Deep Learning - Import.io

#artificialintelligence

These days, you hear a lot about machine learning (or ML) and artificial intelligence (or AI) – both good and bad, depending on your source. Many of us immediately conjure up images of HAL from 2001: A Space Odyssey, the Terminator cyborgs, C-3PO, or Samantha from Her when the subject turns to AI. And many may not even be familiar with machine learning as a separate subject. The phrases are often tossed around interchangeably, but they're not exactly the same thing. In the most general sense, machine learning has evolved from AI. In the Google Trends graph above, you can see that AI was the more popular search term until machine learning passed it for good around September 2015.



Deep Learning: Convolutional Neural Networks in Python

#artificialintelligence

This is the third part in my Data Science and Machine Learning series on Deep Learning in Python. At this point, you already know a lot about neural networks and deep learning, including not just the basics like backpropagation, but how to improve training with modern techniques like momentum and adaptive learning rates. You've already written deep neural networks in Theano and TensorFlow, and you know how to run code on the GPU. This course is all about how to use deep learning for computer vision using convolutional neural networks. Convolutional networks are the state of the art for image classification, and they beat vanilla fully connected networks at tasks like MNIST.
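To make the idea concrete, here is a minimal sketch of a small convolutional network for MNIST-style images using TensorFlow's Keras API; the layer sizes and the choice of the Adam optimizer (an adaptive learning rate method, as mentioned above) are illustrative assumptions, not the course's own code, which uses Theano and lower-level TensorFlow.

```python
# Minimal CNN sketch for 28x28 grayscale digits (MNIST), assuming TensorFlow 2.x.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # 10 digit classes
])

# Adam provides per-parameter adaptive learning rates.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Load MNIST, add a channel dimension, and scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

model.fit(x_train, y_train, epochs=3, batch_size=128,
          validation_data=(x_test, y_test))
```

Even this small model typically outperforms a comparably sized fully connected network on MNIST, which is the point the course makes about convolutional architectures.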


Narodytska

AAAI Conferences

Understanding properties of deep neural networks is an important challenge in deep learning. In this paper, we take a step in this direction by proposing a rigorous way of verifying properties of a popular class of neural networks, Binarized Neural Networks, using the well-developed means of Boolean satisfiability. Our main contribution is a construction that creates a representation of a binarized neural network as a Boolean formula. Our encoding is the first exact Boolean representation of a deep neural network. Using this encoding, we leverage the power of modern SAT solvers along with a proposed counterexample-guided search procedure to verify various properties of these networks. A particular focus is the critical property of robustness to adversarial perturbations. For this property, our experimental results demonstrate that our approach scales to medium-size deep neural networks used in image classification tasks. To the best of our knowledge, this is the first work on verifying properties of deep neural networks using an exact Boolean encoding of the network.
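The following is an illustrative sketch, not the paper's encoding: it defines a tiny binarized network with weights and activations in {-1, +1} and checks, by brute-force enumeration, the robustness property the paper verifies, namely that no perturbation flipping at most a small number of input bits changes the predicted class. The network shapes, tie-breaking rule, and perturbation budget are assumptions made for the example; the paper instead compiles this check into a Boolean formula and hands it to a SAT solver with counterexample-guided search, which scales far beyond enumeration.

```python
# Brute-force robustness check for a toy binarized neural network (BNN).
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)

# One hidden layer of 8 binarized neurons, 4 inputs, 3 output classes.
W1 = rng.choice([-1, 1], size=(8, 4))
W2 = rng.choice([-1, 1], size=(3, 8))

def bnn_predict(x):
    """Forward pass of a binarized network: sign activations throughout."""
    h = np.sign(W1 @ x)
    h[h == 0] = 1          # break ties toward +1 (an assumed convention)
    scores = W2 @ h
    return int(np.argmax(scores))

def is_robust(x, budget=1):
    """Check that flipping at most `budget` input bits never changes the label."""
    label = bnn_predict(x)
    n = len(x)
    for k in range(1, budget + 1):
        for idx in combinations(range(n), k):
            x_adv = x.copy()
            x_adv[list(idx)] *= -1       # flip the chosen +/-1 inputs
            if bnn_predict(x_adv) != label:
                return False, x_adv      # adversarial counterexample found
    return True, None

x = np.array([1, -1, 1, 1])
robust, counterexample = is_robust(x, budget=1)
print("robust within 1 bit flip:", robust, "counterexample:", counterexample)
```

Because every weight, activation, and input is a single bit (interpreted as +/-1), the same check can be expressed exactly as a Boolean formula over the input bits, which is what makes SAT-based verification possible for these networks.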