Title: Large Language Models to Enhance Bayesian Optimization
Speakers: Tennison Liu (https://tennisonliu.github.io/) and Nicolas Astorga (https://scholar.google.com/citations?user=oLiBK8cAAAAJ&hl=es)
Paper: https://openreview.net/forum?id=OOxotBmGol
Abstract:
Bayesian optimization (BO) is a powerful approach for optimizing complex and expensive-to-evaluate black-box functions. In this talk, we present LLAMBO, a novel approach that integrates the capabilities of Large Language Models (LLMs) within BO. At a high level, we frame the BO problem in natural language, enabling LLMs to iteratively propose and evaluate promising solutions conditioned on historical evaluations. Our findings illustrate that LLAMBO is effective at zero-shot warmstarting, and enhances surrogate modeling and candidate sampling, especially in the early stages of search when observations are sparse. We empirically validate LLAMBO's efficacy on the problem of hyperparameter tuning, highlighting strong empirical performance across a range of diverse benchmarks, proprietary tasks, and synthetic tasks.
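To make the high-level idea concrete, below is a minimal, illustrative Python sketch of an LLM-in-the-loop optimization cycle: past (configuration, score) pairs are serialized into a natural-language prompt, the language model proposes a new candidate, and the candidate is evaluated and added to the history. This is not the authors' implementation; the `llm` callable, the prompt wording, and the JSON parsing are all assumptions made for illustration.

```python
import json
from typing import Callable, Dict, List, Tuple

Config = Dict[str, float]
History = List[Tuple[Config, float]]


def format_history(history: History) -> str:
    """Serialize past (configuration, score) pairs as natural language."""
    lines = [
        f"Configuration: {json.dumps(cfg)} -> validation score: {score:.4f}"
        for cfg, score in history
    ]
    return "\n".join(lines)


def propose_candidate(llm: Callable[[str], str],
                      history: History,
                      search_space: str) -> Config:
    """Ask the LLM for one promising configuration, conditioned on history."""
    prompt = (
        "You are assisting with hyperparameter tuning.\n"
        f"Search space: {search_space}\n"
        "Past evaluations (higher score is better):\n"
        f"{format_history(history) or 'None yet.'}\n"
        "Propose one new configuration as a JSON object."
    )
    reply = llm(prompt)
    return json.loads(reply)  # assumes the model replies with valid JSON


def llm_bo_loop(llm: Callable[[str], str],
                objective: Callable[[Config], float],
                search_space: str,
                n_iters: int = 10) -> History:
    """Minimal LLM-in-the-loop optimization: propose, evaluate, record."""
    history: History = []
    for _ in range(n_iters):
        candidate = propose_candidate(llm, history, search_space)
        score = objective(candidate)  # expensive black-box evaluation
        history.append((candidate, score))
    return history
```

In the first iterations the history is empty, so the model proposes configurations from its prior knowledge alone, which corresponds loosely to the zero-shot warmstarting behaviour discussed in the talk; LLAMBO itself goes further by also using the LLM for surrogate modeling and candidate sampling.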