Threshold UCT: Cost-Constrained Monte Carlo Tree Search with Pareto Curves


Note

This publication is affiliated with the Faculty of Informatics, not the Institute of Computer Science. The official publication record is available on muni.cz.
Authors

KUREČKA Martin, NEVYHOŠTĚNÝ Václav, NOVOTNÝ Petr, UNČOVSKÝ Vít

Year of publication 2025
Type Article in Proceedings
Conference Proceedings of the Thirty-Ninth AAAI Conference on Artificial Intelligence and Thirty-Seventh Conference on Innovative Applications of Artificial Intelligence and Fifteenth Symposium on Educational Advances in Artificial Intelligence
MU Faculty or unit

Faculty of Informatics

Citation
DOI https://doi.org/10.1609/aaai.v39i25.34858
Keywords MCTS; cost constraints; Pareto curves; heuristic search
Description Constrained Markov decision processes (CMDPs), in which the agent optimizes expected payoffs while keeping the expected cost below a given threshold, are the leading framework for safe sequential decision making under stochastic uncertainty. Among algorithms for planning and learning in CMDPs, methods based on Monte Carlo tree search (MCTS) are of particular importance due to their efficiency and extensibility to more complex frameworks (such as partially observable settings and games). However, current MCTS-based methods for CMDPs either struggle to find safe (i.e., constraint-satisfying) policies, or are too conservative and fail to find valuable ones. We introduce Threshold UCT (T-UCT), an online MCTS-based algorithm for CMDP planning. Unlike previous MCTS-based CMDP planners, T-UCT explicitly estimates Pareto curves of cost-utility trade-offs throughout the search tree, using these together with novel action-selection and threshold-update rules to seek safe and valuable policies. Our experiments demonstrate that our approach significantly outperforms state-of-the-art methods from the literature.
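To make the core idea of cost-constrained action selection concrete, here is a minimal, simplified sketch. It is not the paper's T-UCT (which estimates full Pareto curves of cost-utility trade-offs at every node and updates the threshold along the way); it only illustrates the basic pattern of restricting UCB1 selection to actions whose running mean cost estimate stays within the cost threshold. All names (`node_stats`, `select_action`) and the fallback rule are hypothetical choices for this sketch.

```python
import math

def select_action(node_stats, cost_threshold, c=1.4):
    """UCB1 over utility, restricted to actions whose empirical mean
    cost is within the threshold; if no action looks safe, fall back
    to minimizing expected cost (a deliberately simple fallback)."""
    total_visits = sum(s["n"] for s in node_stats.values())
    safe = {a: s for a, s in node_stats.items()
            if s["cost_sum"] / s["n"] <= cost_threshold}
    if not safe:
        return min(node_stats,
                   key=lambda a: node_stats[a]["cost_sum"] / node_stats[a]["n"])
    def ucb(s):
        return (s["reward_sum"] / s["n"]
                + c * math.sqrt(math.log(total_visits) / s["n"]))
    return max(safe, key=lambda a: ucb(safe[a]))

# Two actions: "a" is cheap but less rewarding, "b" is costly but rewarding.
stats = {
    "a": {"n": 10, "reward_sum": 8.0, "cost_sum": 2.0},  # mean cost 0.2
    "b": {"n": 5,  "reward_sum": 6.0, "cost_sum": 4.0},  # mean cost 0.8
}
print(select_action(stats, cost_threshold=0.5))  # → a (only "a" fits the budget)
```

With a looser budget (e.g. `cost_threshold=1.0`) both actions are admissible and the UCB1 score decides; the point of the paper is precisely that such hard filtering on point estimates is either too conservative or unsafe, which Pareto-curve estimates address.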
