
LB-MCTS: Synergizing Large Language Models and Bayesian Optimization for Efficient CASH

January 18, 2026 · arXiv:2601.12355v1

Authors

Beicheng Xu, Weitong Qian, Lingching Tung, Yupeng Lu, Bin Cui

Abstract

To lower the expertise barrier in machine learning, the AutoML community has focused on the CASH (Combined Algorithm Selection and Hyperparameter optimization) problem, the fundamental challenge of jointly automating algorithm selection and hyperparameter tuning. While traditional methods like Bayesian Optimization (BO) struggle with cold-start issues, Large Language Models (LLMs) can mitigate them through semantic priors.

However, existing LLM-based optimizers generalize poorly to the high-dimensional, structured CASH space. We propose LB-MCTS, a framework that synergizes LLMs and BO within a Monte Carlo Tree Search (MCTS) structure.
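To make the structure concrete, here is a minimal sketch, not the paper's implementation, of how a CASH space can be organized as an MCTS tree: the root branches over candidate algorithms, and each child carries its own hyperparameter subspace and visit statistics. The algorithm names, hyperparameter ranges, and UCB1 scoring below are illustrative assumptions.

```python
import math
from dataclasses import dataclass, field


@dataclass
class Node:
    """One node of the search tree: an algorithm choice plus its statistics."""
    name: str
    hyperparam_space: dict                      # subspace a proposer samples from
    visits: int = 0
    total_reward: float = 0.0
    children: list = field(default_factory=list)

    def ucb(self, parent_visits: int, c: float = 1.4) -> float:
        """UCB1 score: a classic explicit exploration-exploitation trade-off."""
        if self.visits == 0:
            return float("inf")                 # ensure every branch is tried once
        exploit = self.total_reward / self.visits
        explore = c * math.sqrt(math.log(parent_visits) / self.visits)
        return exploit + explore


# Illustrative CASH tree: MCTS selection picks an algorithm via UCB, then a
# proposer (LLM- or BO-driven) samples a configuration from that node's subspace.
root = Node("root", {}, children=[
    Node("random_forest", {"n_estimators": (50, 500), "max_depth": (2, 20)}),
    Node("svm", {"C": (1e-3, 1e3), "gamma": (1e-4, 1.0)}),
])
selected = max(root.children, key=lambda ch: ch.ucb(max(root.visits, 1)))
```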

The framework maximizes the utility of LLM reasoning through a Selective Tuning Memory (STM) and an explicit exploration-exploitation trade-off. It combines the strengths of both paradigms by dynamically shifting from LLM-driven to BO-driven proposals as evaluation data accumulates.
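The dynamic hand-off can be pictured with a short, hedged sketch. It assumes a simple probability schedule that decays trust in the LLM as evaluations accumulate; the schedule, the `llm_propose`/`bo_propose` callables, and the parameter `k` are assumptions for illustration, not the paper's actual switching rule.

```python
import random


def propose(history, llm_propose, bo_propose, k: float = 5.0):
    """Pick the next configuration: LLM-driven early, BO-driven later.

    history      -- list of (config, score) evaluations gathered so far
    llm_propose  -- callable(history) -> config, backed by an LLM semantic prior
    bo_propose   -- callable(history) -> config, backed by a BO surrogate
    k            -- hypothetical knob for how fast trust shifts from LLM to BO
    """
    p_llm = k / (k + len(history))   # decays toward 0 as data accumulates
    if random.random() < p_llm:
        return llm_propose(history)  # few observations: lean on the LLM prior
    return bo_propose(history)       # enough data: let the BO surrogate lead
```

With an empty history the LLM is chosen with probability 1; after many evaluations the BO surrogate, which now has data to fit, dominates.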

Experiments on 104 datasets from the AutoML Benchmark (AMLB) demonstrate that LB-MCTS outperforms competitive baselines.

