Abstract

Differential Evolution (DE) Sampler

This implementation introduces a novel Differential Evolution (DE) sampler, tailored to optimize both numerical and categorical hyperparameters effectively. The DE sampler integrates a hybrid approach:
Differential Evolution for Numerical Parameters: Exploiting DE's strengths, the sampler efficiently explores numerical parameter spaces through mutation, crossover, and selection mechanisms.

Random Sampling for Categorical Parameters: For categorical variables, the sampler employs random sampling, ensuring comprehensive coverage of discrete spaces.

The sampler also supports dynamic search spaces, enabling seamless adaptation to varying parameter dimensions during optimization.
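The hybrid scheme above can be sketched as follows. This is an illustrative, standalone sketch of the DE/rand/1/bin proposal step and the random categorical fallback, not the sampler's actual implementation; the function names `de_propose` and `sample_categorical` and the unit-interval bounds are assumptions for the example.

```python
import random

def de_propose(population, f=0.5, cr=0.9, bounds=(0.0, 1.0), rng=random):
    """Propose a numerical trial vector via DE/rand/1/bin (illustrative sketch).

    population: list of parameter vectors (lists of floats).
    f: differential weight; cr: crossover probability.
    """
    # Pick three distinct vectors for mutation and one target for crossover.
    a, b, c = rng.sample(population, 3)
    target = rng.choice(population)
    # Mutation: donor = a + F * (b - c)
    donor = [ai + f * (bi - ci) for ai, bi, ci in zip(a, b, c)]
    # Binomial crossover: each dimension takes the donor gene with prob. cr;
    # index j_rand guarantees at least one donor gene survives.
    j_rand = rng.randrange(len(target))
    trial = [
        donor[j] if (rng.random() < cr or j == j_rand) else target[j]
        for j in range(len(target))
    ]
    # Clip back into the search bounds.
    lo, hi = bounds
    return [min(max(x, lo), hi) for x in trial]

def sample_categorical(choices, rng=random):
    # Categorical parameters fall back to uniform random sampling.
    return rng.choice(choices)
```

Selection (keeping the trial only if it improves on the target) would then be applied by the surrounding optimization loop.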
Abstract

Large Language Models to Enhance Bayesian Optimization (LLAMBO)

LLAMBO, by Liu et al., is a novel approach that integrates Large Language Models (LLMs) into the Bayesian Optimization (BO) framework to improve the optimization of complex, expensive-to-evaluate black-box functions. By leveraging the contextual understanding and few-shot learning capabilities of LLMs, LLAMBO enhances multiple facets of the BO pipeline:
Zero-Shot Warmstarting
LLAMBO frames the optimization problem in natural language, allowing the LLM to propose promising initial solutions.
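As a rough illustration of "framing the problem in natural language", the sketch below builds a warm-start prompt from a task description and a search-space specification. The function name `build_warmstart_prompt` and the exact prompt wording are assumptions for this example; the actual LLAMBO prompt format is defined by Liu et al.

```python
def build_warmstart_prompt(task_description, param_specs, n_suggestions=5):
    """Frame an HPO problem as a natural-language request for initial
    configurations (illustrative sketch, not the LLAMBO prompt)."""
    lines = [
        f"You are assisting with hyperparameter optimization for: {task_description}.",
        "The search space is:",
    ]
    # One line per hyperparameter, describing its type and range.
    for name, spec in param_specs.items():
        lines.append(f"- {name}: {spec}")
    lines.append(
        f"Propose {n_suggestions} promising initial configurations as JSON objects."
    )
    return "\n".join(lines)
```

The resulting text would be sent to an LLM, whose proposed configurations seed the BO loop in place of purely random initial points.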