Goto



High Order Neural Networks for Efficient Associative Memory Design

Neural Information Processing Systems

The designed networks exhibit the desired associative memory function: perfect storage and retrieval of pieces of information and/or sequences of information of any complexity. INTRODUCTION In the field of information processing, an important class of potential applications of neural networks arises from their ability to perform as associative memories. Since the publication of J. Hopfield's seminal paper [1], investigations of the storage and retrieval properties of recurrent networks have led to a deep understanding of their properties. The basic limitations of these networks are the following: their storage capacity is of the order of the number of neurons; they are unable to handle structured problems; and they are unable to classify non-linearly separable data. In order to circumvent these limitations, one has to introduce additional non-linearities. This can be done either by using "hidden", nonlinear units, or by considering multi-neuron interactions [2]. This paper presents learning rules for networks with multiple interactions, allowing the storage and retrieval either of static pieces of information (autoassociative memory) or of temporal sequences (associative memory), while preventing an explosive growth of the number of synaptic coefficients. AUTOASSOCIATIVE MEMORY The problem addressed in this section is how to design an autoassociative memory with a recurrent (or feedback) neural network when the number p of prototypes is large compared to the number n of neurons. We consider a network of n binary neurons, operating in a synchronous mode, with period t.
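The snippet points to multi-neuron interactions as a way to push storage capacity beyond the order-n limit of pairwise recurrent networks. The sketch below illustrates that idea in NumPy with a simple Hebb-like rule on triple products of neuron states and synchronous updates; it is not the paper's learning rule, and the network size, number of prototypes, and noise level are arbitrary choices for the demonstration.

```python
import numpy as np

# Illustrative sketch (not the paper's rule): an autoassociative memory with
# multi-neuron interactions, storing more prototypes p than neurons n.
rng = np.random.default_rng(0)
n, p = 32, 40
X = rng.choice([-1, 1], size=(p, n)).astype(float)

# Third-order synaptic tensor T[i, j, k]: each stored pattern contributes the
# product x_i * x_j * x_k (a second-order interaction on the inputs j, k).
T = np.einsum('mi,mj,mk->ijk', X, X, X) / n**2

def retrieve(state, steps=10):
    """Synchronous updates: s_i <- sign(sum_{j,k} T[i,j,k] s_j s_k)."""
    for _ in range(steps):
        h = np.einsum('ijk,j,k->i', T, state, state)
        state = np.where(h >= 0, 1.0, -1.0)
    return state

# Recall the first prototype from a corrupted copy (about 10% of bits flipped).
noisy = X[0].copy()
noisy[rng.choice(n, size=n // 10, replace=False)] *= -1
overlap = (retrieve(noisy) == X[0]).mean()
print(overlap)   # fraction of bits recalled correctly; typically 1.0 here
```

The price of the extra capacity in this naive version is the O(n^3) set of coefficients T, which is exactly the "explosive growth of the number of synaptic coefficients" the paper's learning rules are designed to prevent.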


Phasor Neural Networks

Neural Information Processing Systems

ABSTRACT A novel network type is introduced which uses unit-length 2-vectors for local variables. As an example of its applications, associative memory nets are defined and their performance analyzed. Real systems corresponding to such 'phasor' models can be e.g. INTRODUCTION Most neural network models use either binary local variables or scalars combined with sigmoidal nonlinearities. Rather awkward coding schemes have to be invoked if one wants to maintain linear relations between the local signals being processed in e.g.
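The abstract's unit-length 2-vectors can be represented as complex numbers of modulus one, which suggests the following small sketch of a phasor-style associative memory. The Hermitian outer-product storage rule, parallel update, and all sizes here are illustrative assumptions, not necessarily the model analyzed in the paper.

```python
import numpy as np

# Illustrative phasor associative memory: each unit's state is a unit-length
# 2-vector, represented as a complex number of modulus 1.
rng = np.random.default_rng(1)
n, p = 64, 4
Z = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(p, n)))   # stored patterns

# Hermitian outer-product storage (a complex analogue of the Hebb rule).
W = Z.T @ Z.conj() / n
np.fill_diagonal(W, 0.0)

def retrieve(state, steps=20):
    """Parallel updates: each unit aligns with the phase of its local field."""
    for _ in range(steps):
        h = W @ state
        h[h == 0] = 1.0            # keep states on the unit circle
        state = h / np.abs(h)
    return state

# Start from the first pattern with phase noise on every unit.
noisy = Z[0] * np.exp(1j * rng.normal(0.0, 0.8, size=n))
recalled = retrieve(noisy)
overlap = np.abs(recalled @ Z[0].conj()) / n   # close to 1.0 means good recall
print(round(float(overlap), 3))
```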


Programmable Synaptic Chip for Electronic Neural Networks

Neural Information Processing Systems

The matrix chip contains a programmable 32X32 array of "long channel" NMOSFET binary connection elements implemented in a 3-um bulk CMOS process. Since the neurons are kept off-chip, the synaptic chip serves as a "cascadable" building block for a multi-chip synaptic network as large as 512X512 in size. As an alternative to the programmable NMOSFET (long channel) connection elements, tailored thin film resistors are deposited, in series with FET switches, on some CMOS test chips, to obtain the weak synaptic connections. Although deposition and patterning of the resistors require additional processing steps, they promise substantial savings in silicon area. The performance of a synaptic chip in a 32-neuron breadboard system in an associative memory test application is discussed. INTRODUCTION The highly parallel and distributive architecture of neural networks offers potential advantages in fault-tolerant and high-speed associative information processing.
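To picture the cascading described above, here is a small software sketch that tiles 32X32 binary connection blocks into a 512X512 synaptic matrix, with the thresholding neurons kept outside the array. The random weights, the threshold, and the update rule are illustrative assumptions; nothing here models the analog circuit behavior of the chip.

```python
import numpy as np

# Sketch of the cascadable architecture: 32x32 programmable binary blocks
# (one per chip) tiled into a larger synaptic matrix, neurons kept off-chip.
BLOCK = 32
rng = np.random.default_rng(2)

def build_matrix(n_blocks):
    """Tile n_blocks x n_blocks binary 32x32 blocks into one synaptic array."""
    blocks = [[rng.integers(0, 2, size=(BLOCK, BLOCK)) for _ in range(n_blocks)]
              for _ in range(n_blocks)]
    return np.block(blocks)

# A 512x512 network assembled from a 16x16 grid of chips, as the text suggests.
W = build_matrix(16)

def step(activity, threshold):
    """Off-chip neurons: threshold the summed currents from the array."""
    currents = W @ activity
    return (currents > threshold).astype(int)

state = rng.integers(0, 2, size=512)
print(W.shape, step(state, threshold=128).sum())   # threshold chosen arbitrarily
```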


High-Level Connectionist Models

AI Magazine

A workshop on high-level connectionist models was held in Las Cruces, New Mexico, on 9-11 April 1988 with support from the Association for the Advancement of Artificial Intelligence and the Office of Naval Research. John Barnden and Jordan Pollack organized and hosted the workshop and will edit a book containing the proceedings and commentary. The book will be published by Ablex as the first volume in a series entitled Advances in Connectionist and Neural Computation Theory.


Uncertainty in Artificial Intelligence

AI Magazine

The Fourth Uncertainty in Artificial Intelligence workshop was held 19-21 August 1988. The workshop featured significant developments in application of theories of representation and reasoning under uncertainty. A recurring idea at the workshop was the need to examine uncertainty calculi in the context of choosing representation, inference, and control methodologies. The effectiveness of these choices in AI systems tends to be best considered in terms of specific problem areas. These areas include automated planning, temporal reasoning, computer vision, medical diagnosis, fault detection, text analysis, distributed systems, and behavior of nonlinear systems. Influence diagrams are emerging as a unifying representation, enabling tool development. Interest and results in uncertainty in AI are growing beyond the capacity of a workshop format.


Connectionism and Information Processing Abstractions

AI Magazine

Connectionism challenges a basic assumption of much of AI, that mental processes are best viewed as algorithmic symbol manipulations. Connectionism replaces symbol structures with distributed representations in the form of weights between units. For problems close to the architecture of the underlying machines, connectionist and symbolic approaches can make different representational commitments for a task and, thus, can constitute different theories. For complex problems, however, the power of a system comes more from the content of the representations than the medium in which the representations reside. The connectionist hope of using learning to obviate explicit specification of this content is undermined by the problem of programming appropriate initial connectionist architectures so that they can in fact learn. In essence, although connectionism is a useful corrective to the view of mind as a Turing machine, for most of the central issues of intelligence, connectionism is only marginally relevant.


Foundations and Grand Challenges of Artificial Intelligence: AAAI Presidential Address

AI Magazine

AAAI is a society devoted to supporting the progress in science, technology and applications of AI. I thought I would use this occasion to share with you some of my thoughts on the recent advances in AI, the insights and theoretical foundations that have emerged out of the past thirty years of stable, sustained, systematic explorations in our field, and the grand challenges motivating the research in our field.


Artificial Intelligence and Legal Reasoning: A Discussion of the Field and Gardner's Book

AI Magazine

In this article, I discuss the emerging field of artificial intelligence and legal reasoning and review the new book by Anne v.d.L. Gardner, An Artificial Intelligence Approach to Legal Reasoning, published by Bradford/MIT Press (1987, 225 pp., $22.50) as the first book in its new series on the subject.


AAAI News

AI Magazine



New Mexico State University's Computing Research Laboratory

AI Magazine

The Computing Research Laboratory (CRL) at New Mexico State University is a center for research in artificial intelligence and cognitive science. Specific areas of research include the human-computer interface, natural language understanding, connectionism, knowledge representation and reasoning, computer vision, robotics, and graph theory. This article describes the ongoing projects at CRL.