# Abstract

CMA-ES is the gold standard for continuous black-box optimization, but it has diminishing returns: after convergence, additional CMA-ES trials provide little improvement. This sampler addresses that by splitting the trial budget into three phases:

1. Sobol QMC (8 trials) — quasi-random space-filling initialization
2. CMA-ES (132 trials) — covariance matrix adaptation for main optimization
3. Quasi-random Gaussian refinement (60 trials) — targeted local search around the best point using Sobol-based perturbation vectors with exponentially decaying scale

The refinement phase uses quasi-random Sobol sequences transformed via inverse CDF to generate Gaussian-distributed perturbation vectors.
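The refinement idea can be sketched as follows. This is an illustrative implementation of the general technique (Sobol points mapped through the inverse normal CDF, scaled by an exponentially decaying step size), not the sampler's actual internals; the function name and parameters (`initial_scale`, `decay`) are assumptions.

```python
import numpy as np
from scipy.stats import norm, qmc


def gaussian_perturbations(best_point, n_trials, initial_scale=0.1, decay=0.95, seed=0):
    """Sketch of quasi-random Gaussian refinement around `best_point`.

    Sobol uniforms are pushed through the inverse normal CDF to obtain
    Gaussian-distributed perturbation vectors, whose scale decays
    exponentially with the trial index. Illustrative only.
    """
    best_point = np.asarray(best_point, dtype=float)
    sobol = qmc.Sobol(d=best_point.size, scramble=True, seed=seed)
    u = sobol.random(n_trials)                # quasi-random uniforms in [0, 1)
    u = np.clip(u, 1e-10, 1 - 1e-10)          # keep ppf finite at the edges
    z = norm.ppf(u)                           # inverse CDF -> standard normal
    scales = initial_scale * decay ** np.arange(n_trials)
    return best_point + z * scales[:, None]   # decaying local perturbations
```

Each row is one candidate point; later rows perturb the incumbent less, narrowing the search as the budget runs out.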
# Abstract

The hill climbing algorithm is an optimization technique that iteratively improves a solution by evaluating neighboring solutions in search of a local maximum or minimum. Starting with an initial guess, the algorithm examines nearby “neighbor” solutions, moving to a better neighbor if one is found. This process continues until no improvement can be made locally, at which point the algorithm may restart from a new random position.
This implementation focuses on discrete optimization problems, supporting integer and categorical parameters only.
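The loop described above can be sketched in a few lines. This is a minimal, generic hill climber with random restarts, not this sampler's actual API; `score`, `random_start`, and `neighbors` are hypothetical callables supplied by the caller.

```python
import random


def hill_climb(score, random_start, neighbors, n_restarts=5):
    """Discrete hill climbing with random restarts (illustrative sketch).

    score(x)        -> objective value to maximize
    random_start()  -> a fresh random solution
    neighbors(x)    -> iterable of candidate moves from x
    """
    best = None
    for _ in range(n_restarts):
        current = random_start()
        while True:
            # Collect neighbors that strictly improve on the current point.
            better = [c for c in neighbors(current) if score(c) > score(current)]
            if not better:
                break  # local maximum reached; trigger a restart
            current = max(better, key=score)  # greedily take the best neighbor
        if best is None or score(current) > score(best):
            best = current
    return best
```

For a unimodal integer objective such as maximizing `-(x - 3) ** 2` with neighbors `x ± 1`, every restart climbs to the same optimum, `x = 3`.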
# Abstract

This implementation of the Nelder-Mead method employs the effective initialization method proposed by Takenaga et al. (2023).
# Class or Function Names

NelderMeadSampler

# Installation

```shell
pip install -r https://hub.optuna.org/samplers/nelder_mead/requirements.txt
```

# Example

```python
from __future__ import annotations

import optuna
from optuna.distributions import BaseDistribution
from optuna.distributions import FloatDistribution
import optuna.study.study

import optunahub


def objective(x: float, y: float) -> float:
    return x**2 + y**2


def optuna_objective(trial: optuna.trial.Trial) -> float:
    x = trial.suggest_float("x", -5, 5)
    y = trial.suggest_float("y", -5, 5)
    return objective(x, y)


if __name__ == "__main__":
    # You can specify the search space before optimization.
```