Goto

Collaborating Authors

 Miikkulainen, Risto


Intrusion Detection with Neural Networks

Neural Information Processing Systems

Intrusion detection schemes can be classified into two categories: misuse detection and anomaly detection. Misuse detection targets known attacks that exploit known vulnerabilities of the system; anomaly detection flags unusual activity in general that could indicate an intrusion.
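To make the anomaly side of this distinction concrete, the sketch below profiles users from command-frequency vectors with a one-layer softmax classifier and flags a session as anomalous when the model assigns low probability to the user who claims it. The synthetic data, model, and threshold are illustrative assumptions, not the system described in the paper.

```python
# Minimal anomaly-detection sketch (hypothetical example, not the paper's model):
# learn per-user profiles from command-frequency vectors with a one-layer softmax
# classifier, then flag a session as anomalous when the network assigns low
# probability to the user who owns the session.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_commands = 3, 10

# Synthetic "normal" behaviour: each user has a characteristic command distribution.
profiles = rng.dirichlet(np.ones(n_commands), size=n_users)
X = np.vstack([rng.multinomial(100, p, size=50) / 100.0 for p in profiles])
y = np.repeat(np.arange(n_users), 50)

# One-layer softmax classifier trained by gradient descent.
W = np.zeros((n_commands, n_users))
b = np.zeros(n_users)
onehot = np.eye(n_users)[y]
for _ in range(500):
    logits = X @ W + b
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    grad = probs - onehot
    W -= 0.5 * X.T @ grad / len(X)
    b -= 0.5 * grad.mean(axis=0)

def is_anomalous(session_freqs, claimed_user, threshold=0.5):
    """Flag the session if the model doubts it came from the claimed user."""
    logits = session_freqs @ W + b
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return probs[claimed_user] < threshold

# A session drawn from user 2's profile but claimed by user 0 should be flagged.
intruder_session = rng.multinomial(100, profiles[2]) / 100.0
print(is_anomalous(intruder_session, claimed_user=0))
```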


Laterally Interconnected Self-Organizing Maps in Hand-Written Digit Recognition

Neural Information Processing Systems

An application of laterally interconnected self-organizing maps (LISSOM) to handwritten digit recognition is presented. The lateral connections learn the correlations of activity between units on the map. The resulting excitatory connections focus the activity into local patches and the inhibitory connections decorrelate redundant activity on the map. The map thus forms internal representations that are easy to recognize with e.g. a perceptron network. The recognition rate on a subset of NIST database 3 is 4.0% higher with LISSOM than with a regular Self-Organizing Map (SOM) as the front end, and 15.8% higher than recognition of raw input bitmaps directly. These results form a promising starting point for building pattern recognition systems with a LISSOM map as a front end. Handwritten digit recognition has recently become one of the touchstone problems in neural networks.
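As a rough illustration of the mechanism the abstract describes, the sketch below settles map activity through short-range excitatory and longer-range inhibitory lateral connections and adapts the weights with normalized Hebbian learning. The map size, connection radii, scaling factors, and piecewise-linear sigmoid are assumptions chosen for illustration, not the parameters used in the paper.

```python
# Simplified LISSOM-style settling and Hebbian weight update (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
map_size, n_inputs = 8, 64            # 8x8 map, 8x8 input bitmaps flattened
n_units = map_size * map_size

afferent = rng.random((n_units, n_inputs))
afferent /= afferent.sum(axis=1, keepdims=True)

# Distances between map units define short-range excitation / longer-range inhibition.
coords = np.array([(i, j) for i in range(map_size) for j in range(map_size)])
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
excite = (dist <= 1.5).astype(float)
inhibit = ((dist > 1.5) & (dist <= 4.0)).astype(float)
excite /= excite.sum(axis=1, keepdims=True)
inhibit /= inhibit.sum(axis=1, keepdims=True)

def sigmoid(x, lo=0.1, hi=0.9):
    """Piecewise-linear squashing function (thresholds are assumptions)."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def settle(x, gamma_e=0.9, gamma_i=0.9, steps=10):
    """Iteratively combine afferent input with lateral excitation and inhibition."""
    s = afferent @ x
    eta = sigmoid(s)
    for _ in range(steps):
        eta = sigmoid(s + gamma_e * (excite @ eta) - gamma_i * (inhibit @ eta))
    return eta

def hebbian_update(weights, pre, post, alpha=0.01):
    """Hebbian learning with divisive normalization; only existing connections adapt."""
    mask = weights > 0
    w = weights + alpha * np.outer(post, pre) * mask
    return w / w.sum(axis=1, keepdims=True)

x = rng.random(n_inputs)              # stand-in for one digit bitmap
eta = settle(x)
afferent = hebbian_update(afferent, x, eta)
excite = hebbian_update(excite, eta, eta)
inhibit = hebbian_update(inhibit, eta, eta)
```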


SARDNET: A Self-Organizing Feature Map for Sequences

Neural Information Processing Systems

A self-organizing neural network for sequence classification called SARDNET is described and analyzed experimentally. SARDNET extends the Kohonen Feature Map architecture with activation retention and decay in order to create unique distributed response patterns for different sequences. SARDNET yields extremely dense yet descriptive representations of sequential input in very few training iterations. The network has proven successful on mapping arbitrary sequences of binary and real numbers, as well as phonemic representations of English words. Potential applications include isolated spoken word recognition and cognitive science models of sequence processing. While neural networks have proved a good tool for processing static patterns, classifying sequential information has remained a challenging task. The problem involves recognizing patterns in a time series of vectors, which requires forming a good internal representation for the sequences. Several researchers have proposed extending the self-organizing feature map (Kohonen 1989, 1990), a highly successful static pattern classification method, to sequential information (Kangas 1991; Samarabandu and Jakubowicz 1990; Scholtes 1991). Below, three of the most recent of these networks are briefly described. The remainder of the paper focuses on a new architecture designed to overcome the shortcomings of these approaches.
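The retention-and-decay mechanism can be sketched as follows: each sequence element activates its best-matching still-unused unit at full strength, earlier winners decay, and each winner is removed from further competition, so the final activation pattern encodes both the items and their order. The map size, decay factor, and the omission of weight adaptation below are simplifications for illustration.

```python
# Sketch of SARDNET-style sequence response formation (activation retention and decay).
# Weight adaptation is omitted; map size and decay factor are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_units, input_dim = 25, 3            # 5x5 map flattened, 3-dimensional inputs
weights = rng.random((n_units, input_dim))

def sardnet_response(sequence, decay=0.9):
    """Map a sequence of vectors to a distributed activation pattern.

    For each input, the closest still-unused unit wins, is activated to 1.0,
    and is then excluded from later competition; earlier winners decay, so
    the final pattern encodes both the items and their order.
    """
    activation = np.zeros(n_units)
    used = np.zeros(n_units, dtype=bool)
    for x in sequence:
        dists = np.linalg.norm(weights - x, axis=1)
        dists[used] = np.inf              # winners are removed from competition
        winner = int(np.argmin(dists))
        activation *= decay               # previous winners decay
        activation[winner] = 1.0          # new winner activated at full strength
        used[winner] = True
    return activation

seq = [np.array([0.1, 0.2, 0.3]), np.array([0.9, 0.8, 0.1]), np.array([0.5, 0.5, 0.5])]
pattern = sardnet_response(seq)
print(np.nonzero(pattern)[0], pattern[pattern > 0])
```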

