Neural Networks ART: Solving Problems with Multiple Solutions and New Teaching Algorithm

#artificialintelligence

The human brain continuously processes information flowing in from the external environment. It can modify and update stored images, and create new ones, without destroying what it has previously memorized. In this it differs significantly from most neural networks (NNs): in networks trained by backpropagation or genetic algorithms, in bidirectional associative memory, in Hopfield networks, and so on, learning a new pattern, situation, or association very often distorts or even destroys the fruits of prior learning, requiring a change in a significant part of the connection weights or complete retraining of the network [1-4]. The inability of these networks to solve the stability-plasticity problem, that is, the problem of perceiving and memorizing new information without loss or distortion of what is already stored, was one of the main reasons for the development of fundamentally new neural network architectures. Examples of such networks are the neural networks derived from adaptive resonance theory (ART), developed by Carpenter and Grossberg [5, 6].


Markov Logic: An Interface Layer for Artificial Intelligence

Morgan & Claypool Publishers

Most subfields of computer science have an interface layer via which applications communicate with the infrastructure, and this is key to their success (e.g., the Internet in networking, the relational model in databases, etc.). So far this interface layer has been missing in AI. This book discusses Markov logic, a powerful language that has been successfully applied as an interface layer. ISBN 9781598296921, 155 pages.


R Interface to the Keras Deep Learning Library

#artificialintelligence

Building a model in Keras starts by constructing an empty Sequential model. The result of Sequential, as with most of the functions provided by kerasR, is a python.builtin.object. This object type, defined by the reticulate package, provides direct access to all of the methods and attributes exposed by the underlying Python class. To access these, we use the $ operator followed by the method name. Layers are added by calling the method add, as in the sketch below.
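As an illustration, here is a minimal sketch of that pattern in R; the layer sizes and the input_shape value are arbitrary choices for the example, not taken from the article:

```r
# Minimal sketch: building a Sequential model with kerasR.
library(kerasR)

# Construct an empty Sequential model; the result is a
# python.builtin.object exposed through the reticulate package.
mod <- Sequential()

# Methods and attributes of the underlying Python class are
# reached with the `$` operator; layers are added via `add`.
mod$add(Dense(units = 64, input_shape = 10))  # sizes are illustrative
mod$add(Activation("relu"))
mod$add(Dense(units = 1))
```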


Interpretability via attentional and memory-based interfaces, using TensorFlow

#artificialintelligence

This article is a gentle introduction to attentional and memory-based interfaces in deep neural architectures, using TensorFlow. Incorporating attention mechanisms is very simple and can offer transparency and interpretability to our complex models. We conclude with extensions and caveats of the interfaces. As you read the article, you can access all of the code on GitHub and view the IPython notebook here; all code is compatible with TensorFlow version 1.0. The intended audience for this notebook is developers and researchers with a basic understanding of TensorFlow and fundamental deep learning concepts.


Disruptive Interfaces & The Emerging Battle To Be The Default

#artificialintelligence

A new battle is brewing to be the default of every choice we make. As modern interfaces like voice remove options, augmented reality overlays our physical world, and artificial intelligence gains our trust by transcending our own reasoning, DEFAULTS WILL RULE THE WORLD. I've come to call them disruptive interfaces -- drastically simpler and more accessible interfaces that ultimately commoditize everything underneath. Once-powerful companies that have invested millions or billions in their brands, achieved dominance through network effects, or compete with sophisticated supply chains are vulnerable to losing their pricing power and differentiation, and to being altogether excluded from the moment when customers make decisions. In 2014, I shared some thoughts on how "the interface layer" would commoditize much of the technology underneath.