
BoTorch Sampler

Class or Function Names
BoTorchSampler

Installation
pip install optuna-integration botorch

Example
from optuna_integration import BoTorchSampler

sampler = BoTorchSampler()

Others
See the documentation for more details.
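The Example above only constructs the sampler. A minimal end-to-end sketch using standard Optuna calls (the objective and trial count are illustrative, not from the package documentation):

import optuna
from optuna_integration import BoTorchSampler

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    return x**2

sampler = BoTorchSampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=20)
print(study.best_params)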

Brute Force Search

Class or Function Names
BruteForceSampler

Example
import optuna
from optuna.samplers import BruteForceSampler

def objective(trial):
    # BruteForceSampler requires a finite search space, so a step is specified.
    x = trial.suggest_float("x", -5, 5, step=1)
    return x**2

sampler = BruteForceSampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=10)

Others
See the documentation for more details.

CatCMA Sampler

Abstract
The cutting-edge evolutionary computation algorithm CatCMA has been published on OptunaHub. CatCMA is an algorithm that excels in mixed search spaces with continuous and discrete variables. The figure is from https://arxiv.org/abs/2405.09962.
📝 Introduction to CatCMA in OptunaHub: Blog post by Hideaki Imamura.

Class or Function Names
CatCmaSampler

Installation
pip install -r https://hub.optuna.org/samplers/catcma/requirements.txt

Example
import numpy as np
import optuna
from optuna.distributions import CategoricalDistribution
from optuna.distributions import FloatDistribution
import optunahub

def objective(trial: optuna.Trial) -> float:
    ...
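The example above is cut off after the objective's signature. A minimal sketch of how the sampler might be used, assuming CatCmaSampler can be constructed without arguments and infers the mixed search space from the trials (both assumptions; see the package page for the full example):

import optuna
import optunahub

module = optunahub.load_module("samplers/catcma")

def objective(trial: optuna.Trial) -> float:
    # Mixed search space: continuous and categorical variables.
    x = trial.suggest_float("x", -1, 1)
    y = trial.suggest_float("y", -1, 1)
    c = trial.suggest_categorical("c", [0, 1, 2])
    return x**2 + y**2 + c

sampler = module.CatCmaSampler()  # assumed no-argument construction
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=50)
print(study.best_params)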

CMA-ES Sampler

Class or Function Names
CmaEsSampler

Installation
pip install cmaes

Example
import optuna
from optuna.samplers import CmaEsSampler

def objective(trial):
    x = trial.suggest_float("x", -1, 1)
    y = trial.suggest_int("y", -1, 1)
    return x**2 + y

sampler = CmaEsSampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=20)

Others
See the documentation for more details.

Contour Plot

Class or Function Names
plot_contour

Example
from optuna.visualization import plot_contour

plot_contour(study)

Others
See the documentation for more details.

Demo Sampler

Class or Function Names
DemoSampler

Example
module = optunahub.load_module("samplers/demo")
sampler = module.DemoSampler(seed=42)

See example.py for more details.

Differential Evolution with Hyperband (DEHB) Sampler

Class or Function Names
DEHBSampler
DEHBPruner

Installation
No additional installation is required for this sampler and pruner, but to run the example.py script you need the following package:
$ pip install scikit-learn

Example
sampler = DEHBSampler()
pruner = DEHBPruner(min_resource=1, max_resource=n_train_iter, reduction_factor=3)
study = optuna.create_study(sampler=sampler, pruner=pruner)

See example.py for a full example. The following figures are obtained from the analysis of the optimization.

Others
Reference
Awad, N. ...
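The Example above assumes an objective that reports intermediate values and an n_train_iter budget. A sketch of the surrounding code; the module path samplers/dehb and the toy objective are assumptions, the sampler and pruner arguments follow the snippet above:

import optuna
import optunahub

module = optunahub.load_module("samplers/dehb")  # assumed module path

n_train_iter = 100

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    value = x**2
    # Report intermediate values so the pruner can stop unpromising trials early.
    for step in range(n_train_iter):
        trial.report(value, step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return value

sampler = module.DEHBSampler()
pruner = module.DEHBPruner(min_resource=1, max_resource=n_train_iter, reduction_factor=3)
study = optuna.create_study(sampler=sampler, pruner=pruner)
study.optimize(objective, n_trials=30)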

Empirical Distribution Function Plot

Class or Function Names
plot_edf

Example
from optuna.visualization import plot_edf

plot_edf(study)

Others
See the documentation for more details.

Ensembled Sampler

Installation
No additional packages are required.

Abstract
This package provides a sampler that ensembles multiple samplers. You can specify the list of samplers to be ensembled.

Class or Function Names
EnsembledSampler

Example
import optuna
import optunahub

mod = optunahub.load_module("samplers/ensembled")
samplers = [
    optuna.samplers.RandomSampler(),
    optuna.samplers.TPESampler(),
    optuna.samplers.CmaEsSampler(),
]
sampler = mod.EnsembledSampler(samplers)

See example.py for more details.
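The ensembled sampler is then used like any other Optuna sampler. A minimal self-contained sketch (the objective and trial count are illustrative):

import optuna
import optunahub

mod = optunahub.load_module("samplers/ensembled")
sampler = mod.EnsembledSampler(
    [optuna.samplers.RandomSampler(), optuna.samplers.TPESampler()]
)

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    return x**2

study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=30)
print(study.best_params)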

Evolutionary LLM Merge Sampler

Class or Function Names
EvoMergeSampler
EvoMergeTrial

Installation
pip install git+https://github.com/arcee-ai/mergekit.git
pip install sentencepiece accelerate protobuf bitsandbytes langchain langchain-community datasets
pip install pandas cmaes
export HF_TOKEN=xxx

Example
sampler = EvoMergeSampler(base_config="path/to/config/yml/file")
study = optuna.create_study(sampler=sampler)
for _ in range(100):
    trial = study.ask()
    evo_merge_trial = EvoMergeTrial(study, trial._trial_id)
    model = evo_merge_trial.suggest_model()
    acc = try_model(model)
    study.tell(trial, acc)

print(study.trials_dataframe(attrs=("number", "value")))

See example.py for a full example. You need a GPU with 16 GB of VRAM to run this example. The following figures are obtained from the analysis of the optimization.

Gaussian Process-Based Sampler

Class or Function Names
GPSampler

Installation
pip install scipy torch

Example
import optuna
from optuna.samplers import GPSampler

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    return x**2

sampler = GPSampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=100)

Others
See the documentation for more details.

Gaussian-Process Probability of Improvement from Maximum of Sample Path Sampler

Class or Function Names
PIMSSampler

Installation
$ pip install -r https://hub.optuna.org/samplers/gp_pims/requirements.txt

Example
Please see example.py.

Others
Reference
Shion Takeno, Yu Inatsu, Masayuki Karasuyama, Ichiro Takeuchi, "Posterior Sampling-Based Bayesian Optimization with Tighter Bayesian Regret Bounds", Proceedings of the 41st International Conference on Machine Learning, PMLR 235:47510-47534, 2024.

Bibtex
@InProceedings{pmlr-v235-takeno24a,
  title = {Posterior Sampling-Based {B}ayesian Optimization with Tighter {B}ayesian Regret Bounds},
  author = {Takeno, Shion and Inatsu, Yu and Karasuyama, Masayuki and Takeuchi, Ichiro},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages = {47510--47534},
  year = {2024},
  editor = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume = {235},
  series = {Proceedings of Machine Learning Research},
  month = {21--27 Jul},
  publisher = {PMLR},
  pdf = {https://raw.

Grey Wolf Optimization (GWO) Sampler

Class or Function Names
GreyWolfOptimizationSampler

Example
from __future__ import annotations

import matplotlib.pyplot as plt
import optuna
import optunahub

GreyWolfOptimizationSampler = optunahub.load_module(
    "samplers/grey_wolf_optimization"
).GreyWolfOptimizationSampler

if __name__ == "__main__":

    def objective(trial: optuna.trial.Trial) -> float:
        x = trial.suggest_float("x", -10, 10)
        y = trial.suggest_float("y", -10, 10)
        return x**2 + y**2

    # Note: `n_trials` should match the `n_trials` passed to `study.optimize`.
    sampler = GreyWolfOptimizationSampler(n_trials=100)
    study = optuna.create_study(sampler=sampler)
    study.optimize(objective, n_trials=sampler.n_trials)
    optuna.visualization.matplotlib.plot_optimization_history(study)
    plt.show()

Others
Reference
Mirjalili, S., Mirjalili, S. ...

Grid Search

Class or Function Names
GridSampler

Example
import optuna
from optuna.samplers import GridSampler

def objective(trial):
    x = trial.suggest_float("x", -100, 100)
    y = trial.suggest_int("y", -100, 100)
    return x**2 + y**2

search_space = {"x": [-50, 0, 50], "y": [-99, 0, 99]}
sampler = GridSampler(search_space)
study = optuna.create_study(sampler=sampler)
study.optimize(objective)

Others
See the documentation for more details.

HEBO (Heteroscedastic and Evolutionary Bayesian Optimisation)

Class or Function Names
HEBOSampler

Installation
# Install the dependencies.
pip install optunahub hebo

# NOTE: Below is optional, but pymoo must be installed after NumPy for faster HEBOSampler,
# we run the following command to make sure that the compiled version is installed.
pip install --upgrade pymoo

APIs
HEBOSampler(search_space: dict[str, BaseDistribution] | None = None, *, seed: int | None = None, constant_liar: bool = False, independent_sampler: BaseSampler | None = None)

search_space: By specifying search_space, the sampling speed at each iteration becomes slightly quicker, but this argument is not necessary to run this sampler.
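A minimal usage sketch based on the signature above; the module path samplers/hebo and the toy objective are assumptions:

import optuna
import optunahub

module = optunahub.load_module("samplers/hebo")  # assumed module path

def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    y = trial.suggest_int("y", -10, 10)
    return x**2 + y**2

# search_space is optional per the signature above; omitting it lets the sampler
# infer the space at a small cost in per-iteration speed.
sampler = module.HEBOSampler(seed=42)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=50)
print(study.best_params)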

Hyperband Pruner

Class or Function Names
HyperbandPruner

Example
study = optuna.create_study(
    direction="maximize",
    pruner=optuna.pruners.HyperbandPruner(
        min_resource=1, max_resource=n_train_iter, reduction_factor=3
    ),
)
study.optimize(objective, n_trials=20)

See example.py for a full example.

Others
See the documentation for more details.
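The Example above assumes an objective that reports intermediate values and an n_train_iter budget. A minimal sketch of such an objective (purely illustrative; the real example.py trains a model instead):

import optuna

n_train_iter = 100

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    score = 0.0
    for step in range(n_train_iter):
        score += lr * (1.0 - score)  # stand-in for one training iteration
        trial.report(score, step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return score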

Hyperparameter Importances Plot

Class or Function Names
plot_param_importances

Example
from optuna.visualization import plot_param_importances

plot_param_importances(study)

Others
See the documentation for more details.

Hypervolume History Plot

Class or Function Names
plot_hypervolume_history

Example
from optuna.visualization import plot_hypervolume_history

plot_hypervolume_history(study, reference_point)

Others
See the documentation for more details.

Implicit Natural Gradient Sampler (INGO)

Class or Function Names
ImplicitNaturalGradientSampler

Example
import optuna
import optunahub

def objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", -100, 100)
    y = trial.suggest_float("y", -100, 100)
    return x**2 + y**2

def main() -> None:
    mod = optunahub.load_module("samplers/implicit_natural_gradient")
    sampler = mod.ImplicitNaturalGradientSampler()
    study = optuna.create_study(sampler=sampler)
    study.optimize(objective, n_trials=200)
    print(study.best_trial.value, study.best_trial.params)

if __name__ == "__main__":
    main()

Others
📝 A Natural Gradient-Based Optimization Algorithm Registered on OptunaHub: Blog post by Hiroki Takizawa. The post presents benchmark results.

Intermediate Values Plot

Class or Function Names
plot_intermediate_values

Example
from optuna.visualization import plot_intermediate_values

plot_intermediate_values(study)

Others
See the documentation for more details.

Median Pruner

Class or Function Names
MedianPruner

Example
import optuna
from optuna.pruners import MedianPruner

def objective(trial):
    s = 0
    for step in range(20):
        x = trial.suggest_float(f"x_{step}", -5, 5)
        s += x**2
        trial.report(s, step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return s

pruner = MedianPruner()
study = optuna.create_study(pruner=pruner)
study.optimize(objective, n_trials=20)

Others
See the documentation for more details.

NelderMead Sampler

Abstract
This Nelder-Mead method implementation employs the effective initialization method proposed by Takenaga et al., 2023.

Class or Function Names
NelderMeadSampler

Installation
pip install -r https://hub.optuna.org/samplers/nelder_mead/requirements.txt

Example
from __future__ import annotations

import optuna
from optuna.distributions import BaseDistribution
from optuna.distributions import FloatDistribution
import optuna.study.study
import optunahub

def objective(x: float, y: float) -> float:
    return x**2 + y**2

def optuna_objective(trial: optuna.trial.Trial) -> float:
    x = trial.suggest_float("x", -5, 5)
    y = trial.suggest_float("y", -5, 5)
    return objective(x, y)

if __name__ == "__main__":
    # You can specify the search space before optimization.
    ...
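The example stops at that comment. A sketch of one possible continuation of the __main__ block, assuming NelderMeadSampler accepts a search_space mapping of Optuna distributions (the constructor arguments are an assumption suggested by the imports above; see example.py for the actual code):

    search_space: dict[str, BaseDistribution] = {
        "x": FloatDistribution(-5, 5),
        "y": FloatDistribution(-5, 5),
    }
    module = optunahub.load_module("samplers/nelder_mead")
    sampler = module.NelderMeadSampler(search_space=search_space)  # assumed signature
    study = optuna.create_study(sampler=sampler)
    study.optimize(optuna_objective, n_trials=100)
    print(study.best_params)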

Nop Pruner

Class or Function Names
NopPruner

Example
study = optuna.create_study(direction="maximize", pruner=optuna.pruners.NopPruner())
study.optimize(objective, n_trials=20)

See example.py for a full example.

Others
See the documentation for more details.

NSGAII Search

Class or Function Names
NSGAIISampler

Example
import optuna
from optuna.samplers import NSGAIISampler

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    return x**2

sampler = NSGAIISampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=10)

Others
See the documentation for more details.

NSGAIII Search

Class or Function Names
NSGAIIISampler

Example
import optuna
from optuna.samplers import NSGAIIISampler

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    return x**2

sampler = NSGAIIISampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=10)

Others
See the documentation for more details.

Optimization History Plot

Class or Function Names
plot_optimization_history

Example
from optuna.visualization import plot_optimization_history

plot_optimization_history(study)

Others
See the documentation for more details.

Parallel Coordinate Plot

Class or Function Names
plot_parallel_coordinate

Example
from optuna.visualization import plot_parallel_coordinate

plot_parallel_coordinate(study)

Others
See the documentation for more details.

Pareto-front Plot

Class or Function Names
plot_pareto_front

Example
from optuna.visualization import plot_pareto_front

plot_pareto_front(study)

Others
See the documentation for more details.

Partial Fixed Sampler

Class or Function Names
PartialFixedSampler

Example
import optuna
from optuna.samplers import PartialFixedSampler

def objective(trial):
    x = trial.suggest_float("x", -1, 1)
    y = trial.suggest_int("y", -1, 1)
    return x**2 + y

tpe_sampler = optuna.samplers.TPESampler()
fixed_params = {"y": 0}
partial_sampler = PartialFixedSampler(fixed_params, tpe_sampler)
study = optuna.create_study(sampler=partial_sampler)
study.optimize(objective, n_trials=10)

Others
See the documentation for more details.

Patient Pruner

Class or Function Names
PatientPruner

Example
study = optuna.create_study(
    direction="maximize",
    pruner=optuna.pruners.PatientPruner(optuna.pruners.MedianPruner(), patience=1),
)
study.optimize(objective, n_trials=20)

See example.py for a full example.

Others
See the documentation for more details.

Percentile Pruner

Class or Function Names
PercentilePruner

Example
import optuna
from optuna.pruners import PercentilePruner

def objective(trial):
    s = 0
    for step in range(20):
        x = trial.suggest_float(f"x_{step}", -5, 5)
        s += x**2
        trial.report(s, step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return s

pruner = PercentilePruner(25.0)
study = optuna.create_study(pruner=pruner)
study.optimize(objective, n_trials=20)

Others
See the documentation for more details.

PFNs4BO sampler

Class or Function Names
PFNs4BOSampler

Installation
pip install -r https://hub.optuna.org/samplers/pfns4bo/requirements.txt

Example
from __future__ import annotations

import os

import optuna
import optunahub

module = optunahub.load_module("samplers/pfns4bo")
PFNs4BOSampler = module.PFNs4BOSampler

def objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2

if __name__ == "__main__":
    study = optuna.create_study(
        sampler=PFNs4BOSampler(),
    )
    study.optimize(objective, n_trials=100)
    print(study.best_params)
    print(study.best_value)

See example.py for a full example. You need a GPU to run this example. The following figures are experimental results of the comparison between PFNs4BO and random search.

PLMBO (Preference Learning Multi-Objective Bayesian Optimization)

Class or Function Names
PLMBOSampler

Installation
pip install -r https://hub.optuna.org/samplers/plmbo/requirements.txt

Example
from __future__ import annotations

import matplotlib.pyplot as plt
import numpy as np
import optuna
import optunahub
from optuna.distributions import FloatDistribution

PLMBOSampler = optunahub.load_module(  # type: ignore
    "samplers/plmbo",
).PLMBOSampler

if __name__ == "__main__":
    f_sigma = 0.01

    def obj_func1(x):
        return np.sin(x[0]) + x[1]

    def obj_func2(x):
        return -np.sin(x[0]) - x[1] + 0.1

    def obs_obj_func(x):
        return np.array(
            [
                obj_func1(x) + np.random.normal(0, f_sigma),
                obj_func2(x) + np.random.normal(0, f_sigma),
            ]
        )
    ...

Plot Hypervolume History for Multiple Studies

Class or Function Names
plot_optimization_history

Example
import optuna
import optunahub

def objective(trial: optuna.Trial) -> tuple[float, float]:
    x = trial.suggest_float("x", 0, 5)
    y = trial.suggest_float("y", 0, 3)
    v0 = 4 * x**2 + 4 * y**2
    v1 = (x - 5) ** 2 + (y - 5) ** 2
    return v0, v1

samplers = [
    optuna.samplers.RandomSampler(),
    optuna.samplers.TPESampler(),
    optuna.samplers.NSGAIISampler(),
]
studies = []
for sampler in samplers:
    study = optuna.create_study(
        sampler=sampler,
        study_name=f"{sampler.__class__.__name__}",
        directions=["minimize", "minimize"],
    )
    study.
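The loop body is cut off above. A sketch of how it likely continues (the trial count is an assumption), after which the package's plotting function is applied to the collected studies:

    study.optimize(objective, n_trials=100)
    studies.append(study)

# The hypervolume-history plot provided by this package is then drawn from
# `studies` and a reference point; see example.py for the exact import and call.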

Plot Pareto Front for Multiple Studies

Class or Function Names
plot_pareto_front

Example
import optuna
import optunahub

def objective(trial: optuna.Trial) -> tuple[float, float]:
    x = trial.suggest_float("x", 0, 5)
    y = trial.suggest_float("y", 0, 3)
    v0 = 4 * x**2 + 4 * y**2
    v1 = (x - 5) ** 2 + (y - 5) ** 2
    return v0, v1

samplers = [
    optuna.samplers.RandomSampler(),
    optuna.samplers.TPESampler(),
    optuna.samplers.NSGAIISampler(),
]
studies = []
for sampler in samplers:
    study = optuna.create_study(
        sampler=sampler,
        study_name=f"{sampler.__class__.__name__}",
        directions=["minimize", "minimize"],
    )
    study.
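The loop body is cut off above. A sketch of how it likely continues (the trial count is an assumption), after which the package's multi-study Pareto-front plot is applied to the collected studies:

    study.optimize(objective, n_trials=100)
    studies.append(study)

# The multi-study plot_pareto_front provided by this package is then called
# with `studies`; see example.py for the exact import and call.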

Plot the Sampling Speed Benchmark to Compare Multiple Samplers

Class or Function Names
plot_sampling_speed

Installation
This module requires the following dependencies:
matplotlib
scipy

Example
A minimal example would be the following:

from collections import defaultdict

import matplotlib.pyplot as plt
import optuna
import optunahub

def objective(trial) -> float:
    return trial.suggest_float("x", -5, 5) ** 2

studies = defaultdict(lambda: [])
for i in range(3):
    sampler = optuna.samplers.RandomSampler()
    study = optuna.create_study(sampler=sampler)
    study.optimize(objective, timeout=1.0)
    studies["Random"].append(study)

    sampler = optuna.samplers.TPESampler()
    study = optuna.create_study(sampler=sampler)
    study.optimize(objective, timeout=3.0)
    studies["TPE"].append(study)

plot_sampling_speed = optunahub.
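The last line is cut off after optunahub. above. A plausible completion follows, where the module path and the argument passed to the plot function are assumptions (see example.py for the exact call):

plot_sampling_speed = optunahub.load_module(
    "visualization/plot_sampling_speed"  # assumed module path
).plot_sampling_speed

# Assumed call: pass the mapping from sampler name to its list of studies.
plot_sampling_speed(dict(studies))
plt.show()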

PyCMA Sampler

Class or Function Names
PyCmaSampler

Installation
pip install optuna-integration cma

Example
from optuna_integration import PyCmaSampler

sampler = PyCmaSampler()

Others
See the documentation for more details.
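The Example above only constructs the sampler. A minimal end-to-end sketch using standard Optuna calls (the objective and trial count are illustrative):

import optuna
from optuna_integration import PyCmaSampler

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    y = trial.suggest_float("y", -5, 5)
    return x**2 + y**2

sampler = PyCmaSampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=30)
print(study.best_params)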

QMC Search

Class or Function Names
QMCSampler

Example
import optuna
from optuna.samplers import QMCSampler

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    return x**2

sampler = QMCSampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=10)

Others
See the documentation for more details.

Random Search

Class or Function Names
RandomSampler

Example
import optuna
from optuna.samplers import RandomSampler

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    return x**2

sampler = RandomSampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=10)

Others
See the documentation for more details.

Rank Plot

Class or Function Names
plot_rank

Example
from optuna.visualization import plot_rank

plot_rank(study)

Others
See the documentation for more details.

Sampler using Whale Optimization Algorithm

Class or Function Names
WhaleOptimizationSampler

Example
from __future__ import annotations

import matplotlib.pyplot as plt
import optuna
import optunahub

WhaleOptimizationSampler = optunahub.load_module(
    "samplers/whale_optimization"
).WhaleOptimizationSampler

if __name__ == "__main__":

    def objective(trial: optuna.trial.Trial) -> float:
        x = trial.suggest_float("x", -10, 10)
        y = trial.suggest_float("y", -10, 10)
        return x**2 + y**2

    sampler = WhaleOptimizationSampler()
    study = optuna.create_study(sampler=sampler)
    study.optimize(objective, n_trials=100)
    optuna.visualization.matplotlib.plot_optimization_history(study)
    plt.show()

Others
Reference
Mirjalili, Seyedali & Lewis, Andrew. (2016). The Whale Optimization Algorithm. Advances in Engineering Software.

Simple Sampler

SimpleBaseSampler has been moved to optunahub.samplers. Please use optunahub.samplers.SimpleBaseSampler instead of this package.

Class or Function Names
SimpleBaseSampler

Example
import optunahub

class UserDefinedSampler(optunahub.samplers.SimpleBaseSampler):
    ...

See example.py for more details.

Others
This package provides an easy sampler base class to implement custom samplers. You can make your own sampler easily by inheriting SimpleBaseSampler and implementing the necessary methods.
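As a concrete illustration, here is a minimal custom sampler that inherits optunahub.samplers.SimpleBaseSampler and implements sample_relative by drawing uniform random values for float parameters; the class name and the sampling logic are illustrative, not part of the package:

import numpy as np
import optuna
import optunahub
from optuna.distributions import FloatDistribution

class MyRandomSampler(optunahub.samplers.SimpleBaseSampler):
    def sample_relative(self, study, trial, search_space):
        # search_space maps parameter names to distributions inferred from
        # completed trials; return a concrete value for each parameter.
        params = {}
        for name, dist in search_space.items():
            if isinstance(dist, FloatDistribution):
                params[name] = float(np.random.uniform(dist.low, dist.high))
        return params

def objective(trial):
    x = trial.suggest_float("x", -5, 5)
    return x**2

study = optuna.create_study(sampler=MyRandomSampler())
study.optimize(objective, n_trials=20)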

Slice Plot

Class or Function Names
plot_slice

Example
from optuna.visualization import plot_slice

plot_slice(study)

Others
See the documentation for more details.

SMAC3

APIs
A sampler that uses SMAC3 v2.2.0. It is verified by unit tests, which can be run as follows:
$ pip install pytest optunahub smac
$ python -m pytest package/samplers/smac_sampler/tests/

Please check the API reference for more details: https://automl.github.io/SMAC3/main/5_api.html

SMACSampler(search_space: dict[str, BaseDistribution], n_trials: int = 100, seed: int | None = None, *, surrogate_model_type: str = "rf", acq_func_type: str = "ei_log", init_design_type: str = "sobol", surrogate_model_rf_num_trees: int = 10, surrogate_model_rf_ratio_features: float = 1. ...
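A usage sketch based on the signature above; the toy objective and trial counts are illustrative, and the module path follows the tests path shown above:

import optuna
import optunahub
from optuna.distributions import FloatDistribution

module = optunahub.load_module("samplers/smac_sampler")

def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return x**2

# search_space mirrors the distributions used in the objective;
# n_trials here should match the n_trials passed to study.optimize.
sampler = module.SMACSampler(
    search_space={"x": FloatDistribution(-10, 10)},
    n_trials=50,
)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=50)
print(study.best_params)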

Step Distribution Plot

Class or Function Names
plot_step_distribution

Installation
You should install plotly to use this visualization.
$ pip install plotly

Example
This plot shows how many steps (budget, epoch, iterations, etc.) were consumed before pruning occurred for each trial.

fig = plot_step_distribution(study)

See example.py for a full example. The following figures are obtained from the analysis of the optimization.

Successive Halving Pruner

Class or Function Names
SuccessiveHalvingPruner

Example
study = optuna.create_study(
    direction="maximize", pruner=optuna.pruners.SuccessiveHalvingPruner()
)
study.optimize(objective, n_trials=20)

See example.py for a full example.

Others
See the documentation for more details.

Terminator Improvement Plot

Class or Function Names
plot_terminator_improvement

Example
from optuna.visualization import plot_terminator_improvement

plot_terminator_improvement(study)

Others
See the documentation for more details.

Threshold Pruner

Class or Function Names
ThresholdPruner

Example
from optuna import create_study
from optuna.pruners import ThresholdPruner

study = create_study(pruner=ThresholdPruner(upper=1.0))
study.optimize(objective_for_upper, n_trials=10)

study = create_study(pruner=ThresholdPruner(lower=0.0))
study.optimize(objective_for_lower, n_trials=10)

See example.py for a full example.

Others
See the documentation for more details.

Timeline Plot

Class or Function Names
plot_timeline

Example
from optuna.visualization import plot_timeline

plot_timeline(study)

Others
See the documentation for more details.

TPE Sampler

Class or Function Names
TPESampler

Example
import optuna
from optuna.samplers import TPESampler

def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return x**2

sampler = TPESampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=10)

Others
See the documentation for more details.

Wilcoxon Pruner

Class or Function Names
WilcoxonPruner

Example
study = optuna.create_study(pruner=optuna.pruners.WilcoxonPruner(p_threshold=0.1))
study.optimize(objective, n_trials=100)

See example.py for a full example.

Others
See the documentation for more details.