 Korf, Richard


1.6-Bit Pattern Databases

AAAI Conferences

We present a new technique to compress pattern databases so that they provide consistent heuristics without loss of information. We store the heuristic estimate modulo three, requiring only two bits per entry, or, in a more compact representation, only 1.6 bits. This allows us to store a pattern database with more entries in the same amount of memory as an uncompressed pattern database. These compression techniques are most useful where lossy compression using cliques, or their generalization, is not possible, or where adjacent entries in the pattern database are not highly correlated. We compare both techniques to the best existing compression methods on the Top-Spin puzzle, Rubik's Cube, the 4-peg Towers of Hanoi problem, and the 24 puzzle. Under certain conditions, our best implementations for the Top-Spin puzzle and Rubik's Cube outperform the respective state-of-the-art solvers by a factor of four.
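The two ideas in this abstract can be sketched in a few lines. Storing entries modulo three at 1.6 bits each follows from packing five base-3 digits into one byte (3^5 = 243 <= 256, so 8/5 = 1.6 bits per entry), and the exact heuristic can be recovered during search because a consistent heuristic changes by at most one along an edge, so only one of the parent's three neighboring values matches the stored residue. This is a hedged sketch under those assumptions, not the paper's implementation; all function names are illustrative.

```python
# Illustrative sketch: mod-3 pattern-database entries packed five per byte,
# with exact-value recovery from a parent's known heuristic.
# All names here are hypothetical, not from the paper.

def pack_mod3(values):
    """Pack h mod 3 values, five per byte, as base-3 digits (1.6 bits each)."""
    out = bytearray()
    for i in range(0, len(values), 5):
        byte = 0
        for v in reversed(values[i:i + 5]):
            byte = byte * 3 + (v % 3)  # accumulate base-3 digits
        out.append(byte)
    return bytes(out)

def lookup_mod3(packed, index):
    """Return the stored residue (0, 1, or 2) at a given entry index."""
    byte = packed[index // 5]
    return (byte // 3 ** (index % 5)) % 3

def recover_h(parent_h, stored_mod3):
    """A consistent heuristic changes by at most 1 along an edge, so the
    child's exact value is the unique candidate in {h-1, h, h+1} whose
    residue mod 3 matches the stored entry."""
    for h in (parent_h - 1, parent_h, parent_h + 1):
        if h % 3 == stored_mod3:
            return h
    raise ValueError("stored residue inconsistent with parent value")
```

Recovery requires knowing the heuristic of the parent node exactly, which IDA* has available along the current search path; the root's heuristic must be computed once by other means.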


Independent Additive Heuristics Reduce Search Multiplicatively

AAAI Conferences

This paper analyzes the performance of IDA* using additive heuristics. We show that the reduction in the number of nodes expanded using multiple independent additive heuristics is the product of the reductions achieved by the individual heuristics. First, we formally state and prove this result on unit-edge-cost undirected graphs with a uniform branching factor. Then, we empirically verify it on a model of the 4-peg Towers of Hanoi problem. We also run experiments on the multiple sequence alignment problem, showing more general applicability to non-unit-edge-cost directed graphs. Then, we extend an existing model for predicting the performance of IDA* with a single pattern database to independent additive disjoint pattern databases. This is the first analysis of the performance of independent additive heuristics.
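The multiplicative claim can be illustrated with a Korf-Reid-Edelkamp-style model, in which IDA* at cost threshold d expands roughly sum_i b^i * P(d - i) nodes, where b is the branching factor and P the cumulative distribution of heuristic values; for independent additive heuristics, the distribution of h1 + h2 is the convolution of the individual distributions. The branching factor, depth, and uniform heuristic distributions below are illustrative assumptions, not values from the paper.

```python
# Hedged model-based illustration: the node reduction from the sum of two
# independent heuristics approximately equals the product of their
# individual reductions. All parameters are toy assumptions.

def expansions(b, d, cdf):
    """Model node count at threshold d: b^i nodes at depth i, each
    surviving pruning when h <= d - i."""
    return sum((b ** i) * cdf(d - i) for i in range(d + 1))

def uniform_cdf(m):
    """CDF of a toy heuristic uniform on {0, ..., m}."""
    return lambda v: 0.0 if v < 0 else min(1.0, (v + 1) / (m + 1))

def sum_of_two_uniforms_cdf(m):
    """CDF of h1 + h2 for independent uniforms, via explicit convolution."""
    n = m + 1
    pmf = [0.0] * (2 * m + 1)
    for a in range(n):
        for c in range(n):
            pmf[a + c] += 1.0 / (n * n)
    def cdf(v):
        if v < 0:
            return 0.0
        return min(1.0, sum(pmf[: min(v, 2 * m) + 1]))
    return cdf

b, d, m = 4, 25, 5
n_blind = expansions(b, d, lambda v: 1.0)            # h = 0 everywhere
r_single = n_blind / expansions(b, d, uniform_cdf(m))
r_combined = n_blind / expansions(b, d, sum_of_two_uniforms_cdf(m))
# r_combined comes out (very nearly) r_single ** 2: reductions multiply.
```

In this model the relation is exact in the large-depth limit: the reduction factor is the reciprocal of the heuristic's probability generating function evaluated at 1/b, and generating functions of independent sums multiply.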


Member's Forum

AI Magazine

We have expanded review criteria for the technical program, effectively increasing the number of ways in which a submitted paper can qualify for acceptance. Most importantly, we have revised the review procedure to encourage acceptance of a larger number and broader range of papers, as discussed below.