Collaborating Authors: Jackel, Lawrence D.


A Neurocomputer Board Based on the ANNA Neural Network Chip

Neural Information Processing Systems

Many researchers have built neural-network chips, but few chips have been installed in board-level systems, even though this next level of integration provides insights and advantages that can't be attained on a chip testing station. Building a board demonstrates whether or not the chip can be effectively integrated into the larger systems required for real applications.
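
To make the idea of board-level integration concrete, the sketch below emulates, in software, the kind of host-side loop such a board implies: download weights to the accelerator, stream input vectors through it, and compare the (lower-precision) hardware results against a full-precision software reference. Everything here is hypothetical, including the class and method names; it is not the ANNA board's actual interface.

```python
# Purely hypothetical sketch of host-side, board-level integration:
# download weights to an accelerator, stream inputs through it, and verify
# results against a software reference. Names and interface are illustrative
# only and do not come from the ANNA board.
import numpy as np

class FakeAcceleratorBoard:
    """Software stand-in for a neural-network accelerator board."""
    def load_weights(self, w, b):
        self.w, self.b = w.copy(), b.copy()

    def evaluate(self, x):
        # Emulate coarse on-chip weight precision by rounding to 1/8 steps.
        wq = np.round(self.w * 8) / 8
        return np.tanh(x @ wq + self.b)

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 10)) * 0.1   # one small fully connected layer
b = np.zeros(10)

board = FakeAcceleratorBoard()
board.load_weights(w, b)

for _ in range(3):                         # stream a few input vectors
    x = rng.standard_normal(64)
    y_board = board.evaluate(x)
    y_ref = np.tanh(x @ w + b)             # full-precision software reference
    print("max deviation from reference:", np.abs(y_board - y_ref).max())
```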


Handwritten Digit Recognition with a Back-Propagation Network

Neural Information Processing Systems

We present an application of back-propagation networks to handwritten digit recognition. Minimal preprocessing of the data was required, but the architecture of the network was highly constrained and specifically designed for the task. The input of the network consists of normalized images of isolated digits. The method has a 1% error rate and about a 9% reject rate on zip code digits provided by the U.S. Postal Service.

1 INTRODUCTION

The main point of this paper is to show that large back-propagation (BP) networks can be applied to real image-recognition problems without a large, complex preprocessing stage requiring detailed engineering. Unlike most previous work on the subject (Denker et al., 1989), the learning network is fed directly with images rather than feature vectors, thus demonstrating the ability of BP networks to deal with large amounts of low-level information. Previous work on simple digit images (Le Cun, 1989) showed that the architecture of the network strongly influences its generalization ability. Good generalization can only be obtained by designing a network architecture that contains a certain amount of a priori knowledge about the problem. The basic design principle is to minimize the number of free parameters that must be determined by the learning algorithm, without overly reducing the computational power of the network.
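
The design principle stated above, constraining the architecture so that fewer free parameters must be learned, is typically realized with local receptive fields and shared weights. The NumPy sketch below illustrates only that idea with made-up layer sizes; it is not the paper's actual network, but it shows how weight sharing cuts the parameter count relative to a fully connected layer producing the same outputs.

```python
# Minimal sketch of the stated design principle: constrain the network with
# local receptive fields and shared weights (a convolutional layer) so that far
# fewer free parameters must be learned than in a fully connected layer.
# Layer sizes are illustrative only, not the architecture from the paper.
import numpy as np

def conv2d_valid(image, kernels, biases):
    """Shared-weight feature extraction: each kernel is slid over the image,
    so its parameters are reused at every position (weight sharing)."""
    n_maps, k, _ = kernels.shape
    h, w = image.shape
    out = np.zeros((n_maps, h - k + 1, w - k + 1))
    for m in range(n_maps):
        for i in range(h - k + 1):
            for j in range(w - k + 1):
                out[m, i, j] = np.sum(image[i:i+k, j:j+k] * kernels[m]) + biases[m]
    return np.tanh(out)                    # squashing nonlinearity

rng = np.random.default_rng(0)
image = rng.standard_normal((16, 16))            # normalized digit image (illustrative size)
kernels = rng.standard_normal((4, 5, 5)) * 0.1   # 4 feature maps, 5x5 receptive fields
biases = np.zeros(4)

features = conv2d_valid(image, kernels, biases)  # shape (4, 12, 12)

# Free parameters in the shared-weight layer: 4 * (5*5) kernel weights + 4 biases = 104.
shared_params = kernels.size + biases.size
# A fully connected layer with the same number of outputs would need one
# 16*16 weight vector per output unit, plus a bias per output.
dense_params = features.size * image.size + features.size
print(f"shared-weight parameters: {shared_params}")    # 104
print(f"fully connected parameters: {dense_params}")   # 148032
```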

