Accelerating Monte Carlo Tree Search with Probability Tree State Abstraction

Neural Information Processing Systems

Monte Carlo Tree Search (MCTS) algorithms such as AlphaGo and MuZero have achieved superhuman performance on many challenging tasks. However, the computational complexity of MCTS-based algorithms is driven by the size of the search space. To address this issue, we propose a novel probability tree state abstraction (PTSA) algorithm to improve the search efficiency of MCTS. A general tree state abstraction with path transitivity is defined, and the probability tree state abstraction is proposed to reduce mistakes during the aggregation step. Furthermore, theoretical guarantees for transitivity and the aggregation error bound are provided. To evaluate the effectiveness of the PTSA algorithm, we integrate it with state-of-the-art MCTS-based algorithms such as Sampled MuZero and Gumbel MuZero. Experimental results on different tasks demonstrate that our method can accelerate the training of state-of-the-art algorithms with a 10%-45% reduction in search space.
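To illustrate the aggregation idea at a high level, here is a minimal, hypothetical sketch of grouping search-tree states whose predicted action distributions are close; the names (`tv_distance`, `aggregate_states`), the greedy clustering scheme, and the `eps` threshold are illustrative assumptions, not the paper's actual PTSA procedure.

```python
import numpy as np

def tv_distance(p, q):
    """Total variation distance between two action distributions."""
    return 0.5 * float(np.abs(np.asarray(p) - np.asarray(q)).sum())

def aggregate_states(policies, eps=0.1):
    """Greedily group tree states whose policy distributions differ by < eps.

    policies: list of action-probability vectors, one per tree state.
    Returns a list mapping each state index to a cluster id; states sharing
    a cluster would be treated as one abstract state during search.
    """
    clusters = []      # representative policy per cluster
    assignment = []
    for p in policies:
        for cid, rep in enumerate(clusters):
            if tv_distance(p, rep) < eps:
                assignment.append(cid)
                break
        else:  # no close cluster found: start a new one
            clusters.append(p)
            assignment.append(len(clusters) - 1)
    return assignment

policies = [[0.7, 0.3], [0.68, 0.32], [0.2, 0.8]]
print(aggregate_states(policies))  # -> [0, 0, 1]: first two states merge
```

Merging the first two states shrinks the effective branching of the tree, which is the mechanism by which an abstraction of this kind reduces the search space.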


We thank all reviewers for their time and useful feedback

Neural Information Processing Systems

We thank all reviewers for their time and useful feedback. We apologize for a typo in the figure: a late notation change was not reflected in the figure during edits. This is a great point. Different correction schemes are possible, but we wanted to keep the approach simple; we will add a note in the paper. IMPALA corrects the value targets with V-Trace.


Proper Value Equivalence

Neural Information Processing Systems

Value equivalence (VE) distinguishes models based on a set of policies and a set of functions: a model is said to be VE to the environment if the Bellman operators it induces for those policies yield the correct result when applied to those functions.
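The definition above can be checked concretely on a small tabular MDP. The following sketch is an illustrative assumption-laden implementation (the helper names, the fixed discount `GAMMA`, and the tolerance are mine, not the paper's): it builds the Bellman operator a model induces for a policy and tests whether model and environment agree on every function in a given set.

```python
import numpy as np

GAMMA = 0.9  # illustrative discount factor

def bellman_operator(pi, r, P, v):
    """(T^pi v)(s) = sum_a pi(a|s) * [r(s,a) + gamma * sum_s' P(s'|s,a) v(s')].

    Shapes: pi (S, A), r (S, A), P (S, A, S), v (S,).
    """
    return np.einsum('sa,sa->s', pi, r + GAMMA * (P @ v))

def is_value_equivalent(env, model, policies, functions, tol=1e-8):
    """True if the model's induced operators match the environment's
    on every (policy, function) pair."""
    r, P = env
    r_m, P_m = model
    return all(
        np.allclose(bellman_operator(pi, r, P, v),
                    bellman_operator(pi, r_m, P_m, v), atol=tol)
        for pi in policies for v in functions
    )
```

Note that a model only has to agree on the *given* policies and functions, so a model can be VE without matching the environment's rewards and transitions exactly; enlarging the two sets tightens the requirement.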