Evolutionary Systems
Stochastic Hillclimbing as a Baseline Method for Evaluating Genetic Algorithms
Juels, Ari, Wattenberg, Martin
We investigate the effectiveness of stochastic hillclimbing as a baseline for evaluating the performance of genetic algorithms (GAs) as combinatorial function optimizers. In particular, we address two problems to which GAs have been applied in the literature: Koza's 11-multiplexer problem and the jobshop problem. We demonstrate that simple stochastic hillclimbing methods are able to achieve results comparable or superior to those obtained by the GAs designed to address these two problems. We further illustrate, in the case of the jobshop problem, how insights obtained in the formulation of a stochastic hillclimbing algorithm can lead to improvements in the encoding used by a GA.
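A minimal sketch of the kind of stochastic hillclimber such a baseline uses, assuming a bitstring encoding and treating the fitness function as a black box (the function and parameter names are illustrative, not the paper's):

```python
import random

def stochastic_hillclimb(fitness, n_bits=64, max_evals=10_000, seed=0):
    """Bit-flip stochastic hillclimbing: flip one random bit per step
    and keep the mutant whenever fitness does not decrease."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    fx = fitness(x)
    for _ in range(max_evals):
        i = rng.randrange(n_bits)
        x[i] ^= 1              # flip one random bit
        fy = fitness(x)
        if fy >= fx:
            fx = fy            # keep improvements and ties
        else:
            x[i] ^= 1          # revert the flip
    return x, fx

# Example: maximize the number of ones (OneMax), where sum() is the fitness.
best, score = stochastic_hillclimb(sum, n_bits=32, max_evals=5_000)
```

Despite its simplicity, a climber of this shape is a meaningful yardstick: it makes one fitness evaluation per step, so its evaluation budget is directly comparable to a GA's.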
The 1995 Fall Symposia Series
Cohn, David, Lewis, David, Aha, David W., Burke, Robin, Srihari, Rohini K., Horswill, Ian, Buvac, Sasa, Siegel, Eric V., Fehling, Michael
The Association for the Advancement of Artificial Intelligence (AAAI) held its 1995 Fall Symposia Series on 10 to 12 November in Cambridge, Massachusetts. This article contains summaries of the eight symposia that were conducted: (1) Active Learning; (2) Adaptation of Knowledge for Reuse; (3) AI Applications in Knowledge Navigation and Retrieval; (4) Computational Models for Integrating Language and Vision; (5) Embodied Language and Action Symposium; (6) Formalizing Context; (7) Genetic Programming; and (8) Rational Agency: Concepts, Theories, Models, and Applications.
Structural and Behavioral Evolution of Recurrent Networks
Saunders, Gregory M., Angeline, Peter J., Pollack, Jordan B.
This paper introduces GNARL, an evolutionary program which induces recurrent neural networks that are structurally unconstrained. In contrast to constructive and destructive algorithms, GNARL employs a population of networks and uses a fitness function's unsupervised feedback to guide search through network space. Annealing is used in generating both gaussian weight changes and structural modifications. Applying GNARL to a complex search and collection task demonstrates that the system is capable of inducing networks with complex internal dynamics.
When will a Genetic Algorithm Outperform Hill Climbing
Mitchell, Melanie, Holland, John H., Forrest, Stephanie
We analyze a simple hill-climbing algorithm (RMHC) that was previously shown to outperform a genetic algorithm (GA) on a simple "Royal Road" function. We then analyze an "idealized" genetic algorithm (IGA) that is significantly faster than RMHC and that gives a lower bound for GA speed. We identify the features of the IGA that give rise to this speedup, and discuss how these features can be incorporated into a real GA. Our goal is to understand the class of problems for which genetic algorithms (GAs) are most suited, and in particular, for which they will outperform other search algorithms. Several studies have empirically compared GAs with other search and optimization methods such as simple hill-climbing (e.g., Davis, 1991), simulated annealing (e.g., Ingber & Rosen, 1992), linear, nonlinear, and integer programming techniques, and other traditional optimization techniques (e.g., De Jong, 1975). However, such comparisons typically compare one version of the GA with a second algorithm on a single problem or set of problems, often using performance criteria which may not be appropriate.
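A reconstruction of the two ingredients the abstract names, RMHC and a Royal Road function, under stated assumptions: the block size, bitstring length, and evaluation budget below are illustrative choices, not the authors' exact experimental settings.

```python
import random

def royal_road(bits, block=8):
    """Royal Road R1: award `block` points for each contiguous,
    aligned block of `block` ones."""
    return sum(block for i in range(0, len(bits), block)
               if all(bits[i:i + block]))

def rmhc(fitness, n_bits=64, max_evals=300_000, seed=1):
    """Random-mutation hill climbing: flip one random bit per step and
    move to the mutant if its fitness is greater than or equal.
    Accepting ties lets the climber drift across the fitness plateaus
    that the Royal Road function creates."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    fx = fitness(x)
    optimum = fitness([1] * n_bits)
    evals = 0
    while fx < optimum and evals < max_evals:
        i = rng.randrange(n_bits)
        x[i] ^= 1
        fy = fitness(x)
        evals += 1
        if fy >= fx:
            fx = fy
        else:
            x[i] ^= 1
    return fx, evals

score, evals = rmhc(royal_road)
```

Counting `evals` is the point of the exercise: comparing RMHC's evaluation count against a GA's on the same function is exactly the kind of like-for-like comparison the paper argues for.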
The Fifth International Conference on Genetic Algorithms
The Fifth International Conference on Genetic Algorithms was held at the University of Illinois at Urbana-Champaign from 17 to 21 July 1993. Approximately 350 participants attended the multitrack conference, which covered a wide range of topics, including genetic operators, mathematical analysis of genetic algorithms, parallel genetic algorithms, classifier systems, and genetic programming. This article highlights the major themes of the conference by discussing a few papers in detail.