Evolutionary Systems


Genetic Programming in Python: The Knapsack Problem - KDnuggets

#artificialintelligence

In this article, we will look at the knapsack problem, a classic in computer science. We will explain why it is difficult to solve using traditional computational methods, and how genetic programming can help find a "good enough" solution. Afterwards, we will look at a Python implementation of just such a solution to test out for ourselves. The knapsack problem can be used to illustrate the difficulty of solving complex computational problems. In its simplest form, one is given a knapsack of a certain capacity, a set of items with their sizes and values, and asked to maximize the value of the items placed in the knapsack without exceeding the capacity.
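To make the idea concrete before diving in, here is a minimal genetic-algorithm sketch for the 0/1 knapsack problem. The item set, capacity, and parameter values are illustrative, not taken from the article:

```python
import random

random.seed(42)  # reproducible run for illustration

# Hypothetical items as (size, value) pairs; capacity chosen for illustration.
ITEMS = [(12, 4), (2, 2), (1, 1), (4, 10), (1, 2), (8, 11), (5, 13), (3, 7)]
CAPACITY = 15

def fitness(bits):
    """Total value of the selected items, or 0 if the knapsack overflows."""
    size = sum(s for bit, (s, _) in zip(bits, ITEMS) if bit)
    value = sum(v for bit, (_, v) in zip(bits, ITEMS) if bit)
    return value if size <= CAPACITY else 0

def evolve(pop_size=50, generations=100, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in ITEMS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection keeps the best half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(ITEMS))  # one-point crossover
            child = [
                (1 - bit) if random.random() < mutation_rate else bit  # bit-flip mutation
                for bit in a[:cut] + b[cut:]
            ]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the best half of each generation survives unchanged, the best fitness found never decreases; the result is a "good enough" packing rather than a guaranteed optimum.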


Oriental Journal of Computer Science and Technology

#artificialintelligence

The Oriental Journal of Computer Science and Technology is the leading journal in its field. It provides an international forum for the exchange of information among researchers working on both the theoretical and practical aspects of computational systems that draw their inspiration from nature, with particular emphasis on evolutionary models of computation such as genetic algorithms, evolution strategies, classifier systems, evolutionary programming, and genetic programming, as well as related fields such as swarm intelligence (ant colony optimization and particle swarm optimization) and other evolutionary computation techniques.


GECCO 2023

#artificialintelligence

The Genetic and Evolutionary Computation Conference (GECCO 2023) will present the latest high-quality results in genetic and evolutionary computation. Topics include genetic algorithms, genetic programming, ant colony optimization and swarm intelligence, complex systems, evolutionary combinatorial optimization and metaheuristics, evolutionary machine learning, evolutionary multiobjective optimization, evolutionary numerical optimization, neuroevolution, real world applications, search-based software engineering, theory, hybrids and more. The full list of tracks is available. The GECCO 2023 Program Committee invites the submission of technical papers describing your best work in genetic and evolutionary computation. Full papers of at most 8 pages (excluding references) should present original work that meets the high-quality standards of GECCO.


How Byte Pair Encoding works, part 2 (Natural Language Processing)

#artificialintelligence

Abstract: Regular expressions are important for many natural language processing tasks, especially those dealing with unstructured and semi-structured data. This work focuses on automatically generating regular expressions and proposes a novel genetic algorithm for this problem. Unlike methods that generate regular expressions at the character level, we first use byte pair encoding (BPE) to extract frequent items, which are then used to construct regular expressions. The fitness function of our genetic algorithm contains multiple objectives and is optimized through an evolutionary procedure including crossover and mutation operations. The fitness function takes into consideration the length of the generated regular expression, the characters and samples matched for positive training samples (to be maximized), and the characters and samples matched for negative training samples (to be minimized).
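The BPE step the abstract describes, greedily merging the most frequent adjacent pair of symbols, can be sketched as follows. The toy corpus is illustrative and not from the paper:

```python
from collections import Counter

def bpe_merges(corpus, num_merges):
    """Greedily merge the most frequent adjacent pair of symbols; the merged
    tokens are the kind of 'frequent items' used to build regular expressions."""
    seqs = [list(word) for word in corpus]  # start from individual characters
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for seq in seqs:
            pairs.update(zip(seq, seq[1:]))
        if not pairs:
            break
        (a, b), _ = pairs.most_common(1)[0]  # most frequent adjacent pair
        merges.append(a + b)
        new_seqs = []
        for seq in seqs:  # replace every occurrence of the winning pair
            out, i = [], 0
            while i < len(seq):
                if i + 1 < len(seq) and (seq[i], seq[i + 1]) == (a, b):
                    out.append(a + b)
                    i += 2
                else:
                    out.append(seq[i])
                    i += 1
            new_seqs.append(out)
        seqs = new_seqs
    return merges

tokens = bpe_merges(["lower", "lowest", "low"], 3)
```

On this toy corpus the first merges build up the shared prefix ("lo", then "low"), which is exactly the kind of frequent substring a generated regular expression could reuse.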


Metaheuristic optimization with the Differential Evolution algorithm

#artificialintelligence

Learn the theory of the Differential Evolution algorithm, its Python implementation, and how and why it can help you solve complex real-world optimization problems. This article has been written with Salvatore Guastella. Optimization is a pillar of data science. If you think about it, under the hood of every machine learning algorithm (from basic linear regression to the most complex neural network architectures), an optimization problem is being solved. Moreover, in many real-world problems the goal is to find the values of one or more decision variables that minimize (or maximize) a quantity of interest while satisfying certain constraints. A few examples are portfolio optimization in finance, profit maximization of ad campaigns, energy efficiency in power plants, and shipment cost minimization in logistics (refer to this Medium article [1] on our Eni digiTALKS channel for an interesting example).
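As a taste of the algorithm, here is a minimal sketch of the classic DE/rand/1/bin scheme minimizing the sphere function. All parameter values are illustrative defaults, not the article's:

```python
import random

random.seed(0)  # reproducible run for illustration

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, iters=200):
    """Minimize f over box constraints with the DE/rand/1/bin scheme."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(iters):
        for i in range(pop_size):
            # Three distinct individuals, none equal to the current one.
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)  # guarantee at least one mutated gene
            trial = []
            for k in range(dim):
                if random.random() < CR or k == j_rand:
                    lo, hi = bounds[k]
                    # Differential mutation: perturb a by the scaled difference b - c.
                    trial.append(min(max(a[k] + F * (b[k] - c[k]), lo), hi))
                else:
                    trial.append(pop[i][k])
            if f(trial) <= f(pop[i]):  # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=f)

# Sphere function: the global minimum is 0 at the origin.
sphere = lambda v: sum(x * x for x in v)
best = differential_evolution(sphere, [(-5.0, 5.0)] * 3)
```

The scale factor F controls the size of the difference-vector step and CR the crossover probability; these two knobs are the ones most often tuned in practice.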


Automated Dynamic Algorithm Configuration

Journal of Artificial Intelligence Research

The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.
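A classic example of the hand-crafted dynamic heuristics that DAC aims to replace with learned policies is the 1/5 success rule for a (1+1) evolution strategy, sketched below. The objective, starting point, and constants are illustrative:

```python
import random

random.seed(7)  # reproducible run for illustration

def one_plus_one_es(f, x0, sigma=1.0, iters=500):
    """(1+1) evolution strategy whose step size sigma is adapted online by the
    1/5 success rule: a hand-crafted dynamic parameter adaptation policy of
    the kind DAC would instead learn from data."""
    x, fx = list(x0), f(x0)
    successes = 0
    for t in range(1, iters + 1):
        y = [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy < fx:  # keep the offspring only if it improves
            x, fx = y, fy
            successes += 1
        if t % 20 == 0:  # every 20 steps, adjust sigma from the success rate
            sigma *= 1.5 if successes / 20 > 0.2 else 0.6
            successes = 0
    return x, fx

sphere = lambda v: sum(xi * xi for xi in v)
x_best, f_best = one_plus_one_es(sphere, [3.0, -2.0])
```

Here the "parameter configuration" is a single step size, and the policy mapping run state (recent success rate) to a new setting is fixed by hand; DAC generalizes exactly this loop by learning the mapping.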


Genetic Algorithm for Solving Optimization Problems in C++

#artificialintelligence

The genetic algorithm (GA) is a metaheuristic algorithm inspired by Charles Darwin's theory of natural selection and belongs to the class of evolutionary algorithms (EA). The algorithm was pioneered by John Holland in the 1960s and 1970s. Genetic algorithms have a number of advantages over conventional optimization techniques. A GA can handle complex fitness (objective) functions that are linear or nonlinear, continuous or discontinuous, or subject to random noise. Since the algorithm is parametrized, its settings (the population size, the number of chromosomes carried over by elitism, and the crossover and mutation rates) require tuning.
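Although the article works in C++, a minimal Python sketch can make those tuning knobs concrete. The objective and all parameter values are illustrative; note the floor in the objective makes it discontinuous, the kind of function a GA handles but a gradient method does not:

```python
import math
import random

random.seed(3)  # reproducible run for illustration

def ga_maximize(f, bounds, pop_size=40, elite=2, cx_rate=0.7,
                mut_rate=0.1, generations=100):
    """Real-valued GA; pop_size, elite, cx_rate, and mut_rate are exactly
    the settings that require tuning."""
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f, reverse=True)
        nxt = pop[:elite]  # elitism: the best individuals survive unchanged
        while len(nxt) < pop_size:
            a, b = random.sample(pop[: pop_size // 2], 2)  # truncation selection
            child = (a + b) / 2 if random.random() < cx_rate else a  # arithmetic crossover
            if random.random() < mut_rate:
                child += random.gauss(0.0, (hi - lo) * 0.05)  # Gaussian mutation
            nxt.append(min(max(child, lo), hi))  # clamp to the search bounds
        pop = nxt
    return max(pop, key=f)

# Discontinuous objective: the floor makes it piecewise constant in x.
objective = lambda x: math.floor(3 * math.sin(x)) - 0.01 * x * x
best = ga_maximize(objective, (-10.0, 10.0))
```

Because the elite individuals are never modified, the best fitness found can only improve from one generation to the next; raising `elite` speeds convergence at the cost of diversity, which is why it is listed among the parameters to tune.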


Pinaki Laskar on LinkedIn: #algorithms #artificialintelligence #NaturalIntelligence

#artificialintelligence

What is nature-inspired artificial intelligence? Many intelligent #algorithms mimic natural phenomena: how animals organize their lives, how they use instincts to survive, how generations evolve, how the human brain works, and how humans learn. Such nature-inspired optimization algorithms (NIOAs) are defined as a group of algorithms inspired by natural phenomena, including swarm intelligence, biological systems, and physical and chemical systems. NIOAs comprise bio-inspired algorithms as well as physics- and chemistry-based algorithms; the bio-inspired algorithms further divide into swarm intelligence-based and evolutionary algorithms. NIOAs are an important branch of #artificialintelligence and have made significant progress over the last 30 years. Thus far, a large number of common NIOAs and their variants have been proposed, such as the genetic algorithm (GA), particle swarm optimization (PSO), differential evolution (DE), the artificial bee colony (ABC) algorithm, ant colony optimization (ACO), cuckoo search (CS), the bat algorithm (BA), the firefly algorithm (FA), the immune algorithm (IA), grey wolf optimization (GWO), the gravitational search algorithm (GSA), and harmony search (HS). Besides, researchers have designed many AI algorithms by imitating human intelligence with machines, known as artificial human intelligence or human AI. A model is a simplified abstract view of a complex reality, as with a scientific model. The idea is to apply the components of a mathematical model: variables or decision parameters; constants and calibration parameters; input parameters and data; phase parameters; output parameters; and noise and random parameters.


How Evolutionary Algorithms work, part 3

#artificialintelligence

Abstract: One of the main problems of evolutionary algorithms is the convergence of the population to local minima. In this paper, we explore techniques that avoid this problem by encouraging diverse behavior of the agents through a shared reward system. The rewards are randomly distributed in the environment, and an agent is rewarded only for collecting them first. This leads to the emergence of novel agent behavior. We evaluate our approach on the maze problem and compare it to a previously proposed solution, Novelty Search (Lehman and Stanley, 2011a).


Initialization of Feature Selection Search for Classification

Journal of Artificial Intelligence Research

Selecting the best features in a dataset improves the accuracy and efficiency of classifiers in a learning process. Datasets generally have more features than necessary, some of them irrelevant or redundant with others. For this reason, numerous feature selection methods have been developed, applying different evaluation functions and measures. This paper proposes the systematic application of individual feature evaluation methods to initialize search-based feature subset selection methods. An exhaustive review of the starting methods used by genetic algorithms from 2014 to 2020 has been carried out. Subsequently, an in-depth empirical study evaluated the proposal for different search-based feature selection methods (sequential forward and backward selection, Las Vegas filter and wrapper, simulated annealing, and genetic algorithms). Since computation time is reduced and classification accuracy with the selected features is improved, the initialization of feature selection proposed in this work proves worth considering when designing any feature selection algorithm.
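The core idea, seeding the initial population from individual feature scores rather than sampling uniformly at random, can be sketched as follows. The scores and the rank-based weighting scheme are illustrative, not the paper's exact protocol:

```python
import random

def seeded_population(scores, pop_size, k):
    """Build an initial GA population for feature selection that is biased
    toward individually high-scoring features instead of purely random."""
    n = len(scores)
    ranked = sorted(range(n), key=lambda i: scores[i], reverse=True)
    rank_of = {feat: r for r, feat in enumerate(ranked)}
    # Rank-based weights: the top-ranked feature is the most likely to be sampled.
    weights = [1.0 / (rank_of[i] + 1) for i in range(n)]
    population = []
    for _ in range(pop_size):
        # Sample up to k features with replacement; the set keeps duplicates out.
        chosen = set(random.choices(range(n), weights=weights, k=k))
        population.append([1 if i in chosen else 0 for i in range(n)])
    return population

scores = [0.9, 0.1, 0.7, 0.05, 0.6]  # e.g., per-feature merit from a filter measure
pop = seeded_population(scores, pop_size=10, k=3)
```

A GA started from such a population begins its search near subsets that individual evaluation already considers promising, which is the mechanism behind the reduced computation time reported in the abstract.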