Practical Characteristics of Neural Network and Conventional Pattern Classifiers on Artificial and Speech Problems
Lee, Yuchun, Lippmann, Richard P.
Eight neural net and conventional pattern classifiers (Bayesian-unimodal Gaussian, k-nearest neighbor, standard back-propagation, adaptive-stepsize back-propagation, hypersphere, feature-map, learning vector quantizer, and binary decision tree) were implemented on a serial computer and compared using two speech recognition and two artificial tasks. Error rates were statistically equivalent on almost all tasks, but classifiers differed by orders of magnitude in memory requirements, training time, classification time, and ease of adaptivity. Nearest-neighbor classifiers trained rapidly but required the most memory. Tree classifiers provided rapid classification but were complex to adapt. Back-propagation classifiers typically required long training times and had intermediate memory requirements. These results suggest that classifier selection should often depend more heavily on practical considerations concerning memory and computation resources, and restrictions on training and classification times than on error rate.
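The trade-offs this abstract reports (rapid nearest-neighbor training, rapid tree classification, long back-propagation training) are easy to reproduce in miniature. The sketch below is not the study's implementation or data: it uses modern scikit-learn classifiers as stand-ins for four of the eight (quadratic discriminant analysis for the Bayesian-unimodal Gaussian, an MLP for back-propagation) on an artificial task, and reports error rate, training time, classification time, and a rough pickled-model size.

```python
# Minimal sketch of the comparison in the abstract above, using scikit-learn
# stand-ins rather than the paper's classifiers or data (all settings here are
# illustrative assumptions).
import pickle
import time

from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Small artificial task standing in for the paper's artificial/speech tasks.
X, y = make_classification(n_samples=4000, n_features=10, n_informative=6,
                           n_classes=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

classifiers = {
    "Bayesian-unimodal Gaussian": QuadraticDiscriminantAnalysis(),
    "k-nearest neighbor":         KNeighborsClassifier(n_neighbors=5),
    "binary decision tree":       DecisionTreeClassifier(random_state=0),
    "back-propagation (MLP)":     MLPClassifier(hidden_layer_sizes=(32,),
                                                max_iter=500, random_state=0),
}

for name, clf in classifiers.items():
    t0 = time.perf_counter()
    clf.fit(X_train, y_train)                 # training time
    t1 = time.perf_counter()
    err = 1.0 - clf.score(X_test, y_test)     # classification time + error rate
    t2 = time.perf_counter()
    size_kb = len(pickle.dumps(clf)) / 1024   # crude memory-requirement proxy
    print(f"{name:27s} err={err:.3f}  train={t1 - t0:6.2f}s  "
          f"classify={t2 - t1:5.2f}s  size={size_kb:7.1f} KB")
```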
A Comparative Study of the Practical Characteristics of Neural Network and Conventional Pattern Classifiers
Ng, Kenney, Lippmann, Richard P.
Seven different pattern classifiers were implemented on a serial computer and compared using artificial and speech recognition tasks. Two neural network (radial basis function and high order polynomial GMDH network) and five conventional classifiers (Gaussian mixture, linear tree, K nearest neighbor, KD-tree, and condensed K nearest neighbor) were evaluated. Classifiers were chosen to be representative of different approaches to pattern classification and to complement and extend those evaluated in a previous study (Lee and Lippmann, 1989). This and the previous study both demonstrate that classification error rates can be equivalent across different classifiers when they are powerful enough to form minimum error decision regions, when they are properly tuned, and when sufficient training data is available. Practical characteristics such as training time, classification time, and memory requirements, however, can differ by orders of magnitude.
Seven different neural network and conventional pattern classifiers were compared using artificial and speech recognition tasks. High order polynomial GMDH classifiers typically provided intermediate error rates and often required long training times and large amounts of memory. In addition, the decision regions formed did not generalize well to regions of the input space with little training data. Radial basis function classifiers generalized well in high dimensional spaces, and provided low error rates with training times that were much less than those of back-propagation classifiers (Lee and Lippmann, 1989). Gaussian mixture classifiers provided good performance when the numbers and types of mixtures were selected carefully to model class densities well. Linear tree classifiers were the most computationally efficient but performed poorly with high dimensionality inputs and when the number of training patterns was small. KD-tree classifiers reduced classification time by a factor of four over conventional KNN classifiers for low 2-input dimension problems. They provided little or no reduction in classification time for high 22-input dimension problems. Improved condensed KNN classifiers reduced memory requirements over conventional KNN classifiers by a factor of two to fifteen for all problems, without increasing the error rate significantly.
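The KD-tree result above is essentially a statement about how space-partitioning search degrades with input dimension. A minimal sketch, assuming scikit-learn's KNeighborsClassifier with brute-force and KD-tree back ends as stand-ins for the conventional and KD-tree KNN classifiers (sample counts and labels are made up), times classification at the paper's 2- and 22-dimensional extremes:

```python
# Time 1-NN classification with brute-force search vs. a KD-tree at low and
# high input dimension; the KD-tree advantage should shrink (or vanish) at 22
# dimensions, as reported above. Data here are random stand-ins, not the
# paper's tasks.
import time

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
for dim in (2, 22):
    X_train = rng.normal(size=(5000, dim))
    y_train = (X_train.sum(axis=1) > 0).astype(int)    # arbitrary 2-class labels
    X_test = rng.normal(size=(2000, dim))

    elapsed = {}
    for algo in ("brute", "kd_tree"):
        knn = KNeighborsClassifier(n_neighbors=1, algorithm=algo)
        knn.fit(X_train, y_train)
        t0 = time.perf_counter()
        knn.predict(X_test)                             # classification time only
        elapsed[algo] = time.perf_counter() - t0
    print(f"dim={dim:2d}  brute={elapsed['brute']:.3f}s  "
          f"kd_tree={elapsed['kd_tree']:.3f}s  "
          f"speedup={elapsed['brute'] / elapsed['kd_tree']:.1f}x")
```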
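The condensed-KNN memory savings quoted above come from storing only the training patterns needed to reproduce the nearest-neighbor decision boundary. The sketch below implements the classical Hart condensed nearest-neighbor rule on synthetic Gaussian clusters; the paper's improved condensed KNN is a refinement of this idea, so treat this as an illustration of the principle rather than the authors' algorithm.

```python
# Hart's condensed nearest-neighbor rule: keep only the training patterns
# needed so that 1-NN on the condensed set classifies all training data
# correctly. Data and seed are illustrative, not from the paper.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def condense(X, y, seed=0):
    """Return indices of a condensed subset of (X, y) for 1-NN classification."""
    order = np.random.default_rng(seed).permutation(len(X))
    keep = [order[0]]                       # start the store with one pattern
    changed = True
    while changed:                          # repeat passes until nothing is added
        changed = False
        nn = KNeighborsClassifier(n_neighbors=1).fit(X[keep], y[keep])
        for i in order:
            if i in keep:
                continue
            if nn.predict(X[i:i + 1])[0] != y[i]:
                keep.append(i)              # misclassified: add it to the store
                nn = KNeighborsClassifier(n_neighbors=1).fit(X[keep], y[keep])
                changed = True
    return np.array(keep)

# Two Gaussian clusters as a stand-in problem.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 1.0, (300, 2)), rng.normal(1.0, 1.0, (300, 2))])
y = np.repeat([0, 1], 300)
kept = condense(X, y)
print(f"stored {len(kept)} of {len(X)} training patterns "
      f"({len(X) / len(kept):.1f}x memory reduction)")
```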