A news-analysis NeuralNet learns from a language NeuralNet

#artificialintelligence

A common way to solve a complex computing task is to chain together specialized components. In data science this is the pipeline approach. Each component mostly treats the other components as I/O black boxes. As developers we potentially have the full picture, but the system does not. With neural networks, what happens between I and O is often too interesting to be ignored.
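A minimal sketch of the idea, assuming hypothetical toy dimensions and layers: rather than letting the downstream news-analysis model treat the language model as an I/O black box, it consumes the language encoder's internal representation directly. The layer sizes, vocabulary, and task (a 3-class head) are illustrative assumptions, not the article's actual architecture.

```python
# Sketch: a downstream "news-analysis" head learns from a language encoder's
# internal representation instead of its black-box text output.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE, SEQ_LEN, EMBED_DIM = 5000, 64, 128  # assumed toy dimensions

# "Language" encoder: token ids -> dense representation.
tokens = layers.Input(shape=(SEQ_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(tokens)
x = layers.LSTM(EMBED_DIM)(x)  # the intermediate state we choose not to ignore
language_encoder = Model(tokens, x, name="language_encoder")

# News-analysis head consumes the encoder's representation, not generated text.
features = language_encoder(tokens)
sentiment = layers.Dense(3, activation="softmax", name="news_analysis")(features)
pipeline = Model(tokens, sentiment)
pipeline.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Smoke test on random data.
fake_tokens = np.random.randint(0, VOCAB_SIZE, size=(8, SEQ_LEN))
print(pipeline(fake_tokens).shape)  # (8, 3)
```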


Understanding neural networks with TensorFlow Playground | Google Cloud Big Data and Machine Learning Blog | Google Cloud Platform

#artificialintelligence

This is how simple neurons get smarter and perform so well for certain problems such as image recognition and playing Go. One example is Inception, an image recognition model published by Google (from "Going deeper with convolutions", Christian Szegedy et al.). Some published visualizations of deep networks show how they are trained to build a hierarchy of recognized patterns, from simple edges and blobs to object parts and classes. In this article, we looked at some TensorFlow Playground demos and how they explain the mechanism and power of neural networks. As you've seen, the basics of the technology are pretty simple.
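A small sketch of the kind of network TensorFlow Playground visualizes: a tiny multilayer perceptron classifying 2D points, where each hidden unit learns a simple boundary and their combinations form richer ones. The concentric-circles dataset, layer widths, and training settings here are assumptions chosen for illustration.

```python
# Toy MLP in the spirit of a TensorFlow Playground demo.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Sequential

rng = np.random.default_rng(0)
radius = rng.uniform(0.0, 2.0, size=2000)
angle = rng.uniform(0.0, 2 * np.pi, size=2000)
X = np.stack([radius * np.cos(angle), radius * np.sin(angle)], axis=1)
y = (radius > 1.0).astype("float32")      # inner disc vs. outer ring

model = Sequential([
    layers.Input(shape=(2,)),
    layers.Dense(8, activation="tanh"),   # each unit learns a simple boundary
    layers.Dense(8, activation="tanh"),   # combinations form richer shapes
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=20, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))    # [loss, accuracy] on the toy data
```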


Using TensorFlow / machine learning for automated RF side-channel attack classification

#artificialintelligence

The idea was born to use TensorFlow/machine learning to automatically analyze these signals and use them to retrieve the PIN entered into the device, out of thin air! The setup for finding and recording such a signal can range from very simple to very complex; in this case everything was done using software-defined radios. A cheap RTL-SDR receiver is available for roughly $30, though a more sophisticated device such as a HackRF or a bladeRF offers significantly higher sample rates (and a higher ADC resolution). Even with this cheap setup, the signal could be picked up from more than 2 meters (6.5 feet) away; using a directional antenna (and perhaps emissions on a different frequency band), this range can be easily increased. It was also found that connecting the USB cable to the device increases the measured strength of the emissions significantly.
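To make the classification step concrete, here is a hypothetical sketch, not the author's actual pipeline: short windows of recorded RF signal around each keypress are fed to a small 1D CNN that predicts which of the ten PIN digits was pressed. The window length, tensor shapes, and random stand-in data are all assumptions.

```python
# Hypothetical classifier sketch: RF keypress window -> PIN digit.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Sequential

WINDOW = 1024          # samples per keypress window (assumed)
NUM_DIGITS = 10

model = Sequential([
    layers.Input(shape=(WINDOW, 1)),
    layers.Conv1D(16, 9, activation="relu"),
    layers.MaxPooling1D(4),
    layers.Conv1D(32, 9, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(NUM_DIGITS, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in for labelled keypress recordings.
X = np.random.randn(256, WINDOW, 1).astype("float32")
y = np.random.randint(0, NUM_DIGITS, size=256)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```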


A simple neural network with Python and Keras - PyImageSearch

@machinelearnbot

If you've been following along with this series of blog posts, then you already know what a huge fan I am of Keras. Keras is a super powerful, easy-to-use Python library for building neural networks and deep learning networks. In the remainder of this blog post, I'll demonstrate how to build a simple neural network using Python and Keras, and then apply it to the task of image classification. To start this post, we'll quickly review the most common neural network architecture -- feedforward networks. We'll then discuss our project structure followed by writing some Python code to define our feedforward neural network and specifically apply it to the Kaggle Dogs vs. Cats classification challenge.
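As a rough illustration of what such a feedforward network looks like in Keras, here is a minimal sketch for binary image classification; the flattened 32x32x3 input, layer widths, and optimizer are assumptions for demonstration, not necessarily the post's exact choices.

```python
# Sketch: simple feedforward (fully connected) network for dogs vs. cats.
import numpy as np
from tensorflow.keras import layers, Sequential

model = Sequential([
    layers.Input(shape=(32 * 32 * 3,)),        # flattened RGB pixel intensities
    layers.Dense(768, activation="relu"),
    layers.Dense(384, activation="relu"),
    layers.Dense(2, activation="softmax"),     # dog vs. cat
])
model.compile(optimizer="sgd", loss="categorical_crossentropy",
              metrics=["accuracy"])

# Stand-in batch of flattened images and one-hot labels.
X = np.random.rand(16, 32 * 32 * 3).astype("float32")
y = np.eye(2)[np.random.randint(0, 2, size=16)]
model.fit(X, y, epochs=1, verbose=0)
```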


Teaching a neural network to use a calculator

#artificialintelligence

This article explores a seq2seq architecture for solving simple probability problems in Saxton et al. A transformer is used to map questions to intermediate steps, while an external symbolic calculator evaluates intermediate expressions. This approach emulates how a student might solve math problems: setting up intermediate equations, using a calculator to solve them, and using those results to construct further equations. A few months ago, DeepMind released the Mathematics Dataset, a codebase for procedurally generating pairs of mathematics questions and answers, to serve as a benchmark for the ability of modern neural architectures to learn mathematical reasoning. The data consists of a wide variety of categories, ranging from basic arithmetic to probability. Both questions and answers are free-form text, making seq2seq models a natural first step for this dataset.
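A minimal sketch of the question -> intermediate expression -> calculator loop described above. The "model" here is a stub standing in for the trained transformer, the example question and expressions are invented, and sympy is an assumed choice of external symbolic evaluator.

```python
# Sketch of the model-plus-calculator protocol, with a stubbed model.
import sympy

def model_propose_steps(question: str):
    """Stub for the seq2seq model: returns intermediate expressions as text."""
    # A trained transformer would generate these from the question.
    return ["3/6", "2/5", "3/6 * 2/5"]

def calculator(expression: str) -> str:
    """External symbolic calculator: evaluates one intermediate expression."""
    return str(sympy.sympify(expression))

question = "Two balls are drawn without replacement ..."  # free-form text
results = []
for step in model_propose_steps(question):
    results.append(calculator(step))  # results can condition further generation
print(results[-1])                    # final answer, e.g. "1/5"
```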