
Collaborating Authors

 Evolutionary Systems


Interposing an ontogenetic model between Genetic Algorithms and Neural Networks

Neural Information Processing Systems

The relationships between learning, development and evolution in Nature are taken seriously, suggesting a model of the developmental process whereby the genotypes manipulated by the Genetic Algorithm (GA) might be expressed to form phenotypic neural networks (NNets) that then go on to learn. ONTOL is a grammar for generating polynomial NNets for time-series prediction. Genomes correspond to an ordered sequence of ONTOL productions and define a grammar that is expressed to generate an NNet. The NNet's weights are then modified by learning, and the individual's prediction error is used to determine GA fitness. A new gene doubling operator appears critical to the formation of new genetic alternatives in the preliminary but encouraging results presented.
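The loop this abstract describes, expressing a genome into a network, training the network's weights, scoring fitness by residual prediction error, and evolving with operators that include gene doubling, can be sketched as follows. This is a hedged illustration, not ONTOL itself: the genome encoding (small integers mapped to polynomial powers), the LMS-style learning rule, the toy sine series, and all parameter values are assumptions made for the example.

```python
import math
import random

# Minimal sketch of the evolve -> develop -> learn -> evaluate loop described
# above. The genome encoding, development step, learning rule, and toy series
# are hypothetical stand-ins; ONTOL's actual productions are not reproduced.

def develop(genome):
    """Hypothetical development: each gene selects one polynomial term
    (a power of the input) that becomes a unit of the phenotypic network."""
    return [g % 4 for g in genome]                 # powers 0..3

def predict(powers, weights, x):
    return sum(w * (x ** p) for w, p in zip(weights, powers))

def learn_and_evaluate(powers, series, epochs=50, lr=0.05):
    """Fit the phenotype's weights by gradient descent on one-step
    prediction, then return the residual error (used as GA fitness)."""
    weights = [0.0] * len(powers)
    for _ in range(epochs):
        for x, target in zip(series, series[1:]):
            err = predict(powers, weights, x) - target
            weights = [w - lr * err * (x ** p) for w, p in zip(weights, powers)]
    return sum((predict(powers, weights, x) - t) ** 2
               for x, t in zip(series, series[1:]))

def gene_double(genome):
    """Gene doubling: duplicate a randomly chosen gene, lengthening the
    genome and creating raw material for later mutation."""
    i = random.randrange(len(genome))
    return genome[:i] + [genome[i]] + genome[i:]

def mutate(genome):
    genome = genome[:]
    genome[random.randrange(len(genome))] = random.randrange(8)
    return genome

series = [math.sin(0.3 * t) for t in range(30)]    # toy time series
population = [[random.randrange(8) for _ in range(3)] for _ in range(20)]

for generation in range(20):
    population.sort(key=lambda g: learn_and_evaluate(develop(g), series))
    parents = population[:10]
    population = parents + [
        gene_double(p) if random.random() < 0.2 else mutate(p)
        for p in random.choices(parents, k=10)
    ]

best = population[0]
print("best genome:", best, "error:", learn_and_evaluate(develop(best), series))
```

Note that learning happens inside the fitness evaluation, so selection acts on the error that remains after development and training, which is the interaction between evolution, development and learning the abstract emphasizes.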



Using Genetic Algorithms to Improve Pattern Classification Performance

Neural Information Processing Systems

Feature selection and creation are two of the most important and difficult tasks in the field of pattern classification. Good features improve the performance of both conventional and neural network pattern classifiers. Exemplar selection is another task that can reduce the memory and computation requirements of a k-nearest-neighbor (KNN) classifier.
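As a concrete illustration of GA-driven feature selection for a nearest-neighbor classifier (a minimal sketch under assumed details, not the paper's experimental setup): each genome is a bit mask over the features, and fitness is the leave-one-out accuracy of a 1-nearest-neighbor classifier restricted to the selected features. The toy four-feature dataset, the operators, and the parameters are hypothetical.

```python
import random

def distance(a, b, mask):
    """Squared Euclidean distance over the features selected by the mask."""
    return sum((x - y) ** 2 for x, y, m in zip(a, b, mask) if m)

def loo_accuracy(data, labels, mask):
    """Leave-one-out 1-NN accuracy using only the masked features (fitness)."""
    if not any(mask):
        return 0.0
    correct = 0
    for i, (x, y) in enumerate(zip(data, labels)):
        j = min((j for j in range(len(data)) if j != i),
                key=lambda j: distance(x, data[j], mask))
        correct += labels[j] == y
    return correct / len(data)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.1):
    return [1 - m if random.random() < rate else m for m in mask]

# Toy data: only features 0 and 1 carry class information; 2 and 3 are noise.
data = [[random.gauss(c, 0.3), random.gauss(-c, 0.3),
         random.gauss(0, 1), random.gauss(0, 1)]
        for c in (0, 1) for _ in range(20)]
labels = [c for c in (0, 1) for _ in range(20)]

population = [[random.randint(0, 1) for _ in range(4)] for _ in range(12)]
for _ in range(15):
    population.sort(key=lambda m: -loo_accuracy(data, labels, m))
    parents = population[:6]
    population = parents + [mutate(crossover(*random.sample(parents, 2)))
                            for _ in range(6)]

print("best mask:", population[0],
      "accuracy:", loo_accuracy(data, labels, population[0]))
```

Exemplar selection can be sketched the same way by letting the bit mask range over training examples instead of features, which shrinks the KNN reference set and its memory and computation cost.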



Interactions between learning and evolution

Classics

In Langton, C., Taylor, C., Farmer, J. D., and Rasmussen, S. (Eds.), Artificial Life II, pp. 487–509. Addison-Wesley.



Predicting Weather Using a Genetic Memory: A Combination of Kanerva's Sparse Distributed Memory with Holland's Genetic Algorithms

Neural Information Processing Systems

Kanerva's sparse distributed memory (SDM) is an associative-memory model based on the mathematical properties of high-dimensional binary address spaces. Holland's genetic algorithms are a search technique for high-dimensional spaces inspired by evolutionary processes of DNA. "Genetic Memory" is a hybrid of the above two systems, in which the memory uses a genetic algorithm to dynamically reconfigure its physical storage locations to reflect correlations between the stored addresses and data. For example, when presented with raw weather station data, the Genetic Memory discovers specific features in the weather data which correlate well with upcoming rain, and reconfigures the memory to utilize this information effectively. This architecture is designed to maximize the ability of the system to scale up to handle real-world problems.
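A rough sketch of the SDM-plus-GA idea follows, assuming a standard radius-activated SDM and a deliberately simplified GA that evolves whole configurations of hard-location addresses by recall error. The dimensions, the fitness measure, and the recombination scheme are assumptions for illustration, not Rogers' exact Genetic Memory design, which recombines the addresses of the storage locations inside a single running memory.

```python
import random

N, M, RADIUS = 32, 60, 12      # address bits, hard locations, activation radius

def rand_bits(n):
    return [random.randint(0, 1) for _ in range(n)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

class SDM:
    """Standard sparse distributed memory over a fixed set of hard locations."""
    def __init__(self, addresses):
        self.addresses = addresses
        self.counters = [[0] * N for _ in addresses]

    def write(self, addr, data):
        # Add the (+1/-1 recoded) data into every location within the radius.
        for loc, ctr in zip(self.addresses, self.counters):
            if hamming(addr, loc) <= RADIUS:
                for i, bit in enumerate(data):
                    ctr[i] += 1 if bit else -1

    def read(self, addr):
        # Sum counters of activated locations and threshold each bit.
        sums = [0] * N
        for loc, ctr in zip(self.addresses, self.counters):
            if hamming(addr, loc) <= RADIUS:
                sums = [s + c for s, c in zip(sums, ctr)]
        return [1 if s > 0 else 0 for s in sums]

def recall_error(addresses, pairs):
    mem = SDM(addresses)
    for a, d in pairs:
        mem.write(a, d)
    return sum(hamming(mem.read(a), d) for a, d in pairs)

# Training pairs (address, data); autoassociative here for simplicity.
pairs = [(p, p) for p in (rand_bits(N) for _ in range(10))]

def crossover(a, b):
    cut = random.randrange(1, M)
    return a[:cut] + b[cut:]

# GA over configurations of hard-location addresses, scored by recall error.
population = [[rand_bits(N) for _ in range(M)] for _ in range(8)]
for _ in range(10):
    population.sort(key=lambda cfg: recall_error(cfg, pairs))
    parents = population[:4]
    population = parents + [crossover(*random.sample(parents, 2)) for _ in range(4)]

print("best recall error:", recall_error(population[0], pairs))
```

Mutation is omitted for brevity; the point of the sketch is only that the placement of the physical storage locations is itself a search space that a genetic algorithm can adapt to the statistics of the stored data.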


Statistical Prediction with Kanerva's Sparse Distributed Memory

Neural Information Processing Systems

A new viewpoint of the processing performed by Kanerva's sparse distributed memory (SDM) is presented. In conditions of near- or over-capacity, where the associative-memory behavior of the model breaks down, the processing performed by the model can be interpreted as that of a statistical predictor. Mathematical results are presented which serve as the framework for a new statistical viewpoint of sparse distributed memory and for which the standard formulation of SDM is a special case. This viewpoint suggests possible enhancements to the SDM model, including a procedure for improving the predictiveness of the system based on Holland's work with 'Genetic Algorithms', and a method for improving the capacity of SDM even when used as an associative memory. This work is the result of studies involving two seemingly separate topics that proved to share a common framework. The first topic, statistical prediction, is the task of associating extremely large perceptual state vectors with future events.
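One way to make the statistical-predictor reading concrete is the following hedged reconstruction from the standard SDM read and write rules; it is not the paper's own derivation. Writing a pair $(a_j, d_j)$ adds the data vector, with bits recoded as $\pm 1$, into the counters $C_k$ of every hard location $x_k$ with $d_H(x_k, a_j) \le r$. A read at address $a$ then recovers bit $i$ as

$$\hat d_i(a) \;=\; \mathrm{sign}\!\Big(\sum_{k:\,d_H(x_k,a)\le r} C_{k,i}\Big) \;=\; \mathrm{sign}\!\Big(\sum_j w_j(a)\,(2 d_{j,i}-1)\Big), \qquad w_j(a) \;=\; \bigl|\{\,k : d_H(x_k,a)\le r \ \text{and}\ d_H(x_k,a_j)\le r \,\}\bigr|.$$

Near or beyond capacity no single stored item can be isolated from this sum, but the read still returns a proximity-weighted majority vote over everything written, effectively an estimate of whether the frequency of $d_i = 1$ among data stored near $a$ exceeds one half; on this view the usual associative-memory behavior is the special case in which one item dominates the vote.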