Stagnation in Evolutionary Algorithms: Convergence $\neq$ Optimality
arXiv.org Artificial Intelligence
Stagnation refers to the situation in which the best solution found so far remains unchanged over time. It is a common phenomenon in evolutionary computation, since most evolutionary algorithms are stochastic [1, 2]. When stagnation occurs, it is often attributed to bad luck, under the assumption that the evolutionary algorithm has become trapped in a local optimum. As a result, significant effort has been devoted to designing new strategies that help existing algorithms escape such traps, or to conducting stability analyses of evolutionary algorithms to guarantee convergence. This leads to the proposition that stagnation impedes convergence, and that convergence inherently signifies optimality. However, after a thorough analysis of stagnation, convergence, and optimality, this study finds that this perspective is misleading. The main contributions of this study can be summarized as follows: 1. This study is the first to highlight that the stagnation of an individual can actually facilitate the convergence of the entire population.
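The distinction between stagnation of the best-so-far solution and convergence of the population can be illustrated with a toy experiment. The sketch below (not the authors' analysis; the function names `run_ea` and `spread`, the sphere objective, and all parameter values are illustrative assumptions) runs an elitist evolutionary algorithm with Gaussian mutation and truncation selection, counting generations in which the best solution does not improve while also tracking how tightly the population contracts around its centroid:

```python
import math
import random


def sphere(x):
    """Minimisation target: f(x) = sum of x_i^2, optimum at the origin."""
    return sum(v * v for v in x)


def spread(pop):
    """Mean Euclidean distance of individuals from the population centroid
    (a simple proxy for how far the population is from converging)."""
    dim = len(pop[0])
    centroid = [sum(ind[d] for ind in pop) / len(pop) for d in range(dim)]
    return sum(
        math.sqrt(sum((ind[d] - centroid[d]) ** 2 for d in range(dim)))
        for ind in pop
    ) / len(pop)


def run_ea(pop_size=20, dim=5, gens=200, sigma=0.05, seed=1):
    """Elitist (mu + mu) EA: Gaussian mutation, truncation selection.

    Returns the number of stagnant generations (best-so-far unchanged)
    together with the initial and final population spread.
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(pop_size)]
    spread_start = spread(pop)
    best_f = min(sphere(ind) for ind in pop)
    stagnant = 0
    for _ in range(gens):
        # Each parent produces one mutated offspring.
        offspring = [[v + rng.gauss(0.0, sigma) for v in ind] for ind in pop]
        # Truncation selection over parents + offspring keeps the best half.
        pop = sorted(pop + offspring, key=sphere)[:pop_size]
        new_f = sphere(pop[0])
        if new_f < best_f:
            best_f = new_f
        else:
            stagnant += 1  # best-so-far unchanged this generation
    return stagnant, spread_start, spread(pop)


if __name__ == "__main__":
    stagnant, s0, s1 = run_ea()
    print(f"stagnant generations: {stagnant}, spread: {s0:.3f} -> {s1:.3f}")
```

In a run like this the best-so-far typically stalls for many generations once it is near the optimum, yet the population spread keeps shrinking as selection pulls the remaining individuals toward the stagnant incumbent, which is the kind of decoupling between stagnation and convergence the paper's first contribution points to.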
May 5, 2025