4.0.0


CMA-ES with User Prior

Abstract

The Optuna CMA-ES sampler does not support any flexible way to initialize the parameters of the Gaussian distribution, so this package provides a workaround.

Class or Function Names

UserPriorCmaEsSampler

In principle, most arguments follow optuna.samplers.CmaEsSampler, but some parts are modified. For example, UserPriorCmaEsSampler does not support source_trials and use_separable_cma due to their incompatibility. Instead, x0 and sigma0 in CmaEsSampler are replaced with mu0 and cov0. In CmaEsSampler, x0 could be provided only as a dict and sigma0 only as a float.
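
As an aside on what such a user prior encodes: mu0 and cov0 describe a mean vector and a covariance matrix for the initial search distribution. The following pure-Python sketch (independent of Optuna; all names here are hypothetical) draws points from a user-specified Gaussian via a Cholesky factor, which is essentially what an initial CMA-ES distribution centered on prior knowledge looks like:

```python
import math
import random

def cholesky_2x2(cov):
    """Cholesky factor L (lower-triangular) of a 2x2 covariance: cov = L @ L.T."""
    a, b, c = cov[0][0], cov[0][1], cov[1][1]
    l11 = math.sqrt(a)
    l21 = b / l11
    l22 = math.sqrt(c - l21 * l21)
    return [[l11, 0.0], [l21, l22]]

def sample_prior(mu0, cov0, rng):
    """Draw one point x = mu0 + L @ z with z ~ N(0, I)."""
    L = cholesky_2x2(cov0)
    z = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
    return [mu0[i] + L[i][0] * z[0] + L[i][1] * z[1] for i in range(2)]

rng = random.Random(0)
mu0 = [3.0, -1.0]                    # believed-good center of the search
cov0 = [[0.5, 0.1], [0.1, 0.25]]     # uncertainty around that center
points = [sample_prior(mu0, cov0, rng) for _ in range(1000)]
mean = [sum(p[i] for p in points) / len(points) for i in range(2)]
print(mean)  # empirical mean, close to mu0
```

A tighter cov0 concentrates the first generations near mu0; a wider one recovers broader exploration.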

CMA-MAE Sampler

Abstract

This package provides a sampler using CMA-MAE as implemented in pyribs. CMA-MAE is a quality diversity algorithm that has demonstrated state-of-the-art performance in a variety of domains, and pyribs is a bare-bones Python library for quality diversity optimization algorithms. For a primer on CMA-MAE, quality diversity, and pyribs, we recommend the series of pyribs tutorials. For simplicity, this implementation provides a default instantiation of CMA-MAE with a GridArchive and an EvolutionStrategyEmitter with improvement ranking, all wrapped up in a Scheduler.
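
To make the grid-archive idea concrete: a quality diversity archive keeps, for each cell of a discretized measure (behavior) space, the best solution that has landed there. The toy class below is a hedged pure-Python illustration of that bookkeeping only, not pyribs' actual GridArchive API:

```python
import random

class GridArchive:
    """Toy quality-diversity archive: one elite (best objective) per grid cell."""
    def __init__(self, dims, lower, upper):
        self.dims, self.lower, self.upper = dims, lower, upper
        self.cells = {}  # cell index -> (objective, solution)

    def cell_index(self, measures):
        idx = []
        for m, lo, hi, d in zip(measures, self.lower, self.upper, self.dims):
            t = (m - lo) / (hi - lo)
            idx.append(min(d - 1, max(0, int(t * d))))  # clip to the grid
        return tuple(idx)

    def add(self, solution, objective, measures):
        """Keep the solution only if its cell is empty or it beats the incumbent."""
        key = self.cell_index(measures)
        if key not in self.cells or objective > self.cells[key][0]:
            self.cells[key] = (objective, solution)
            return True
        return False

rng = random.Random(1)
archive = GridArchive(dims=(10, 10), lower=(-1.0, -1.0), upper=(1.0, 1.0))
for _ in range(500):
    x = [rng.uniform(-1, 1) for _ in range(2)]
    objective = -(x[0] ** 2 + x[1] ** 2)   # quality: closeness to the origin
    archive.add(x, objective, measures=x)  # diversity: the position itself
print(len(archive.cells))  # number of occupied cells (at most 100)
```

The result is a collection of diverse, locally-best solutions rather than a single optimum, which is the output CMA-MAE refines.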

Hill Climb Local Search Sampler

Abstract

The hill climbing algorithm is an optimization technique that iteratively improves a solution by evaluating neighboring solutions in search of a local maximum or minimum. Starting from an initial guess, the algorithm examines nearby “neighbor” solutions, moving to a better neighbor whenever one is found. This process continues until no improvement is possible, resulting in a locally optimal solution. Hill climbing is efficient and easy to implement but can get stuck in local optima, making it suitable for simple optimization landscapes or applications with tight time constraints.
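
The procedure described above can be sketched in a few lines. This is a generic, hedged implementation using axis-aligned neighbors at a fixed step size (an assumption; the package may define neighborhoods differently):

```python
import random

def hill_climb(f, x0, step=0.1, max_iters=1000):
    """Greedy hill climbing (minimization): move to a better neighbor
    until no neighbor improves on the current point."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iters):
        improved = False
        # Examine axis-aligned neighbors at distance `step`.
        for i in range(len(x)):
            for delta in (-step, step):
                y = list(x)
                y[i] += delta
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            break  # local optimum: no neighbor is better
    return x, fx

# Minimize a convex bowl; here the local optimum is also global.
sphere = lambda v: sum(t * t for t in v)
x, fx = hill_climb(sphere, [2.0, -1.5], step=0.1)
print(x, fx)
```

On a multimodal function the same loop would stop at whichever basin contains the starting point, which is exactly the local-optimum limitation noted above.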

Mean Variance Analysis Scalarization Sampler

Class or Function Names

MeanVarianceAnalysisScalarizationSimulatorSampler

Installation

$ pip install scipy

Example

Please see example.ipynb.

Others

For example, you can add sections to introduce a corresponding paper.

Reference

Iwazaki, Shogo, Yu Inatsu, and Ichiro Takeuchi. “Mean-variance analysis in Bayesian optimization under uncertainty.” International Conference on Artificial Intelligence and Statistics. PMLR, 2021.

BibTeX

@inproceedings{iwazaki2021mean,
  title={Mean-variance analysis in Bayesian optimization under uncertainty},
  author={Iwazaki, Shogo and Inatsu, Yu and Takeuchi, Ichiro},
  booktitle={International Conference on Artificial Intelligence and Statistics},
  pages={973--981},
  year={2021},
  organization={PMLR}
}
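
As rough intuition for mean-variance scalarization (illustrative only; this is not the sampler's actual acquisition function): an objective evaluated under input uncertainty can be scored by trading its empirical mean off against its spread, so flat, robust optima are preferred over sharp, risky ones. The weight beta below is a hypothetical parameter:

```python
import random
import statistics

def mean_variance_score(f, x, noise, n_samples, beta, rng):
    """Score a point under input noise as mean - beta * std (risk-averse, maximization)."""
    values = [f(x + rng.gauss(0.0, noise)) for _ in range(n_samples)]
    return statistics.mean(values) - beta * statistics.stdev(values)

rng = random.Random(0)
f = lambda x: -(x - 1.0) ** 2  # flat, safe peak at x = 1
# Compare a candidate on the flat peak with one on a steep slope.
safe = mean_variance_score(f, 1.0, noise=0.1, n_samples=200, beta=2.0, rng=rng)
risky = mean_variance_score(f, 3.0, noise=0.1, n_samples=200, beta=2.0, rng=rng)
print(safe > risky)  # the flat peak wins under the risk-averse score
```

Larger beta makes the score more conservative; beta = 0 recovers plain mean optimization.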

MOEA/D Sampler

Abstract

Sampler using the MOEA/D algorithm. MOEA/D stands for “Multi-Objective Evolutionary Algorithm based on Decomposition.” This sampler is specialized for multi-objective optimization: the objective function is internally decomposed into multiple single-objective subproblems to perform optimization. It may not work well with multi-threading, so check results carefully.

Class or Function Names

MOEADSampler

Installation

pip install scipy

or

pip install -r https://hub.optuna.org/samplers/moead/requirements.txt

Example

import optuna
import optunahub

def objective(trial: optuna.Trial) -> tuple[float, float]:
    x = trial.
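
To illustrate the decomposition idea itself (a generic sketch, not this package's internals): MOEA/D assigns each weight vector its own scalarized subproblem, here using the Tchebycheff scalarization, and each subproblem selects the candidate that best serves its weights:

```python
def weight_vectors(n):
    """Evenly spaced weights for a bi-objective problem: (w, 1 - w)."""
    return [(i / (n - 1), 1 - i / (n - 1)) for i in range(n)]

def tchebycheff(fvals, weights, ideal):
    """Tchebycheff scalarization: max_i w_i * |f_i - z*_i| (to be minimized)."""
    return max(w * abs(f - z) for f, w, z in zip(fvals, weights, ideal))

# Candidate objective vectors (both objectives minimized).
candidates = [(0.0, 1.0), (0.25, 0.5), (0.5, 0.25), (1.0, 0.0)]
ideal = (0.0, 0.0)  # ideal point z*

# Each subproblem (one weight vector) picks its own best candidate,
# so different weights cover different parts of the Pareto front.
for w in weight_vectors(5):
    best = min(candidates, key=lambda f: tchebycheff(f, w, ideal))
    print(w, "->", best)
```

Extreme weights select the extremes of the front, while intermediate weights select trade-off solutions.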

Multi-objective CMA-ES (MO-CMA-ES) Sampler

Abstract

MoCmaSampler provides an implementation of the s-MO-CMA-ES algorithm. This algorithm extends (1+1)-CMA-ES to multi-objective optimization by introducing a selection strategy based on non-dominated sorting and contributing hypervolume (S-metric). It inherits important properties of CMA-ES: invariance under order-preserving transformations of the fitness function value and under rotation and translation of the search space.

Class or Function Names

MoCmaSampler(*, search_space: dict[str, BaseDistribution] | None = None, popsize: int | None = None, seed: int | None = None)

search_space: A dictionary containing the search space that defines the parameter space.
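
The two selection criteria mentioned above can be illustrated in two dimensions (a hedged sketch, not the package's implementation): Pareto dominance separates candidates into fronts, and within a front each point's contributing hypervolume measures how much of the dominated region would be lost if that point were removed:

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective, better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def hypervolume_2d(front, ref):
    """Hypervolume of a 2-D non-dominated front (minimization) w.r.t. `ref`."""
    pts = sorted(front)  # ascending f1 implies descending f2 on a front
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

def contributions(front, ref):
    """Contributing hypervolume of each point: HV(front) - HV(front \\ {point})."""
    total = hypervolume_2d(front, ref)
    return [total - hypervolume_2d(front[:i] + front[i + 1:], ref)
            for i in range(len(front))]

front = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
ref = (5.0, 5.0)
print(contributions(front, ref))  # the middle trade-off point contributes most
```

s-MO-CMA-ES uses exactly this kind of ranking, discarding the individual whose removal costs the least hypervolume.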

NSGAII Sampler with Initial Trials

Abstract

When Optuna's built-in NSGAII sampler continues a study whose trials were obtained with another sampler, those trials cannot be used as the first generation, and the optimization starts from scratch. This means that even if you already know good individuals, you cannot use them in the GA. In this implementation, the already-sampled results are included in the initial individuals of the GA to perform the optimization. Note, however, that as a consequence this implementation does not necessarily support multi-threading in the generation of the initial generation.
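
The warm-starting idea can be sketched as follows (a generic illustration, not this package's code): copy the already-evaluated individuals into the initial population and fill any remaining slots randomly:

```python
import random

def initial_population(known_individuals, pop_size, random_individual, rng):
    """Seed the first GA generation with already-evaluated individuals,
    filling the remainder with random ones (a sketch of the idea only)."""
    population = list(known_individuals[:pop_size])  # truncate if too many
    while len(population) < pop_size:
        population.append(random_individual(rng))
    return population

rng = random.Random(0)
known = [[0.1, 0.2], [0.3, 0.1]]  # e.g. good parameters found by another sampler
pop = initial_population(known, pop_size=6,
                         random_individual=lambda r: [r.random(), r.random()],
                         rng=rng)
print(len(pop), pop[0])
```

From the second generation onward, ordinary NSGA-II selection and variation proceed unchanged; only the starting point benefits from the prior trials.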

Pyribs Visualization Wrappers

Class or Function Names

plot_grid_archive_heatmap(study: optuna.Study, ax: plt.Axes, **kwargs)

study: Optuna study with a sampler that uses pyribs. This function will plot the result archive from the sampler’s scheduler.
ax: Axes on which to plot the heatmap. If None, the current axes are retrieved.
**kwargs: All remaining kwargs are passed to grid_archive_heatmap.

Installation

$ pip install ribs[visualize]

Example

A minimal example would be the following:

import matplotlib.pyplot as plt
import optuna
import optunahub

module = optunahub.

Sampler Using Multi-Armed Bandit Epsilon-Greedy Algorithm

Class or Function Names

MABEpsilonGreedySampler

Example

mod = optunahub.load_module("samplers/mab_epsilon_greedy")
sampler = mod.MABEpsilonGreedySampler()

See example.py for more details.

Others

This package provides a sampler based on the multi-armed bandit algorithm with epsilon-greedy arm selection.
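
For intuition, an epsilon-greedy bandit explores a uniformly random arm with probability epsilon and otherwise exploits the arm with the best empirical mean reward. A minimal self-contained sketch (generic; not this package's implementation):

```python
import random

def epsilon_greedy(n_arms, pull, epsilon=0.1, n_rounds=2000, rng=None):
    """Epsilon-greedy bandit: explore with probability epsilon,
    otherwise exploit the empirically best arm."""
    rng = rng or random.Random(0)
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for _ in range(n_rounds):
        if rng.random() < epsilon or 0 in counts:
            arm = rng.randrange(n_arms)       # explore (or try unpulled arms)
        else:
            means = [s / c for s, c in zip(sums, counts)]
            arm = means.index(max(means))     # exploit
        sums[arm] += pull(arm)
        counts[arm] += 1
    return counts

rng = random.Random(42)
true_means = [0.2, 0.5, 0.8]  # Bernoulli reward probability of each arm
pull = lambda arm: 1.0 if rng.random() < true_means[arm] else 0.0
counts = epsilon_greedy(3, pull, epsilon=0.1, n_rounds=2000, rng=rng)
print(counts)  # the best arm (index 2) accumulates the most pulls
```

Smaller epsilon concentrates pulls on the current best arm sooner but risks locking onto a suboptimal arm from noisy early estimates.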