Breathing $k$-Means
Fritzke, Bernd
We propose a new algorithm for the $k$-means problem which repeatedly increases and then decreases the number of centroids by $m$ in order to find an approximate solution. New centroids are inserted in areas where they are likely to reduce the error. The subsequent removal of centroids is done such that the resulting rise in error is small. After each increase or decrease step, standard $k$-means is performed. Termination is guaranteed by decrementing $m$ after each increase/decrease cycle in which the overall error is not lowered. In experiments with Gaussian mixture distributions, the new algorithm produced solutions that were on average several percent better than those of $k$-means++.
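As an illustration of the breathe-in/breathe-out cycle the abstract describes, here is a minimal Python sketch assuming squared Euclidean error and scikit-learn's KMeans as the inner solver. The insertion and removal heuristics below (jittered copies of high-error centroids; dropping the lowest-error centroids) are simplified assumptions, not the paper's exact criteria, and this is not the authors' reference implementation.

import numpy as np
from sklearn.cluster import KMeans

def quantization_error(X, C):
    # Summed squared distance of each point to its nearest centroid.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).sum(), d2.argmin(axis=1)

def breathing_kmeans(X, k, m=5, seed=0):
    # Sketch only; assumes m <= k.
    rng = np.random.default_rng(seed)
    C = KMeans(n_clusters=k, n_init=1, random_state=seed).fit(X).cluster_centers_
    best_err, _ = quantization_error(X, C)
    while m > 0:
        # Breathe in: add m centroids near the highest-error clusters.
        _, labels = quantization_error(X, C)
        per_cluster = np.bincount(labels, weights=((X - C[labels]) ** 2).sum(axis=1), minlength=k)
        worst = np.argsort(per_cluster)[-m:]
        C_plus = np.vstack([C, C[worst] + rng.normal(scale=1e-3, size=(m, X.shape[1]))])
        C_plus = KMeans(n_clusters=k + m, init=C_plus, n_init=1).fit(X).cluster_centers_
        # Breathe out: drop the m centroids whose removal raises the
        # error least (approximated here by the smallest cluster error).
        _, labels = quantization_error(X, C_plus)
        per_cluster = np.bincount(labels, weights=((X - C_plus[labels]) ** 2).sum(axis=1), minlength=k + m)
        keep = np.argsort(per_cluster)[m:]
        C_new = KMeans(n_clusters=k, init=C_plus[keep], n_init=1).fit(X).cluster_centers_
        new_err, _ = quantization_error(X, C_new)
        if new_err < best_err:
            best_err, C = new_err, C_new  # improvement: keep m unchanged
        else:
            m -= 1                        # no improvement: shrink m (guarantees termination)
    return C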
A Growing Neural Gas Network Learns Topologies
Fritzke, Bernd
An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to previous approaches like the "neural gas" method of Martinetz and Schulten (1991, 1994), this model has no parameters which change over time and is able to continue learning, adding units and connections, until a performance criterion has been met. Applications of the model include vector quantization, clustering, and interpolation.
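To make the growth mechanism concrete, the following is a rough Python sketch of one common formulation of the growing neural gas loop. The parameter names and default values are conventions from later presentations of the algorithm, assumed here rather than taken from the paper, and the removal of isolated units is omitted for brevity.

import numpy as np

def gng(data, n_steps=20000, max_units=100, lam=100, eps_b=0.05,
        eps_n=0.006, age_max=50, alpha=0.5, d=0.995, seed=0):
    rng = np.random.default_rng(seed)
    # Start with two units placed on random data points.
    W = [data[rng.integers(len(data))].astype(float).copy() for _ in range(2)]
    E = [0.0, 0.0]   # locally accumulated errors
    edges = {}       # (i, j) -> age, with i < j

    def key(i, j):
        return (min(i, j), max(i, j))

    for t in range(1, n_steps + 1):
        x = data[rng.integers(len(data))]
        # Find the nearest and second-nearest units for this input.
        dists = [float(np.sum((x - w) ** 2)) for w in W]
        s1, s2 = np.argsort(dists)[:2]
        E[s1] += dists[s1]
        # Adapt the winner and its topological neighbors; age the winner's edges.
        W[s1] += eps_b * (x - W[s1])
        for (i, j) in list(edges):
            if s1 in (i, j):
                edges[(i, j)] += 1
                n = j if i == s1 else i
                W[n] += eps_n * (x - W[n])
        # Hebb-like rule: a fresh edge between the two winners.
        edges[key(s1, s2)] = 0
        # Remove over-aged edges (isolated-unit removal omitted).
        edges = {e: a for e, a in edges.items() if a <= age_max}
        # Every lam steps, insert a unit between the highest-error unit
        # and its highest-error topological neighbor.
        if t % lam == 0 and len(W) < max_units:
            u = int(np.argmax(E))
            nbrs = [j if i == u else i for (i, j) in edges if u in (i, j)]
            if nbrs:
                v = max(nbrs, key=lambda n: E[n])
                W.append((W[u] + W[v]) / 2)
                E[u] *= alpha
                E[v] *= alpha
                E.append(E[u])
                q = len(W) - 1
                del edges[key(u, v)]
                edges[key(u, q)] = edges[key(v, q)] = 0
        E = [e * d for e in E]  # globally decay accumulated errors
    return np.array(W), edges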
Supervised Learning with Growing Cell Structures
Fritzke, Bernd
Center positions are continuously updated through soft competitive learning. The width of the radial basis functions is derived from the distance to topological neighbors. During training, the observed error is accumulated locally and used to determine where to insert the next unit. This leads (in the case of classification problems) to the placement of units near class borders rather than near frequency peaks, as most existing methods do. The resulting networks need few training epochs and seem to generalize very well, as demonstrated by examples.
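Two of the mechanisms named here, widths derived from topological neighbors and locally accumulated error steering insertion, can be sketched as follows. This is a simplified illustration assuming the cell graph is given as adjacency lists; it is not the paper's full growing-cell-structures procedure.

import numpy as np

def rbf_widths(centers, neighbors):
    # Width of each Gaussian unit = mean distance to its topological
    # neighbors in the cell graph (adjacency lists assumed given).
    return np.array([
        np.mean([np.linalg.norm(centers[i] - centers[j]) for j in neighbors[i]])
        for i in range(len(centers))
    ])

def accumulate_error(errors, bmu, target, output):
    # After each training pattern, book the observed error at the
    # best-matching unit; hard regions (class borders) accumulate most.
    errors[bmu] += np.sum((np.asarray(target) - np.asarray(output)) ** 2)

def insertion_site(errors, neighbors):
    # Grow where the accumulated error is largest: between the worst
    # unit and its worst neighbor, i.e. near class borders rather than
    # near frequency peaks.
    u = int(np.argmax(errors))
    v = max(neighbors[u], key=lambda j: errors[j])
    return u, v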
Supervised Learning with Growing Cell Structures
Fritzke, Bernd
Feed-forward networks of localized (e.g., Gaussian) units are an interesting alternative to the more frequently used networks of global (e.g., sigmoidal) units. It has been shown that with localized units a single hidden layer suffices in principle to approximate any continuous function, whereas with sigmoidal units two layers are necessary. In the following, we consider radial basis function networks similar to those proposed by Moody & Darken (1989) or Poggio & Girosi (1990). Such networks consist of one layer L of Gaussian units.
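As a companion to the excerpt, here is a minimal forward pass of such a network: one hidden layer of Gaussian units followed by a linear output layer. The shapes and numeric values are illustrative assumptions only.

import numpy as np

def rbf_forward(x, centers, widths, weights):
    # x: (d,), centers: (n, d), widths: (n,), weights: (n, k).
    # Each hidden unit responds only locally around its center.
    h = np.exp(-((centers - x) ** 2).sum(axis=1) / widths ** 2)
    return h @ weights  # linear output layer

# Example: three Gaussian units mapping 2-D inputs to one output.
centers = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
widths = np.full(3, 0.5)
weights = np.array([[1.0], [-1.0], [0.5]])
print(rbf_forward(np.array([0.2, 0.1]), centers, widths, weights))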
Kohonen Feature Maps and Growing Cell Structures - a Performance Comparison
Fritzke, Bernd
A performance comparison of two self-organizing networks, the Kohonen Feature Map and the recently proposed Growing Cell Structures, is made. For this purpose, several performance criteria for self-organizing networks are proposed and motivated. The models are tested on three example problems of increasing difficulty. The Kohonen Feature Map demonstrates slightly superior results only for the simplest problem.