Abstract As the Optuna CMA-ES sampler does not support any flexible way to initialize the parameters of the Gaussian distribution, I created a workaround to do so.
Class or Function Names UserPriorCmaEsSampler In principle, most arguments follow optuna.samplers.CmaEsSampler, but some parts are modified.
For example, UserPriorCmaEsSampler does not support source_trials and use_separable_cma because they are incompatible with this extension. Instead, x0 and sigma0 in CmaEsSampler are replaced with mu0 and cov0: CmaEsSampler accepts x0 only as a dict and sigma0 only as a float, whereas UserPriorCmaEsSampler takes the mean vector mu0 and the covariance matrix cov0 directly.
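The practical difference can be illustrated without running Optuna at all: a scalar sigma0 implicitly encodes an isotropic Gaussian, while cov0 lets you express different spreads and correlations between parameters. A minimal sketch in plain Python (the parameter values below are illustrative, not part of the package's API):

```python
# A scalar sigma0 (as in CmaEsSampler) implicitly means an isotropic
# covariance: cov = sigma0**2 * I. UserPriorCmaEsSampler instead takes
# the mean vector mu0 and a full covariance matrix cov0.

def isotropic_cov(sigma0: float, dim: int) -> list[list[float]]:
    """Covariance matrix implied by a scalar sigma0 (sigma0**2 * identity)."""
    return [[sigma0**2 if i == j else 0.0 for j in range(dim)] for i in range(dim)]

# Prior belief: one parameter near 0.5, another near -1.0, with different
# spreads and a positive correlation -- inexpressible with a single sigma0.
mu0 = [0.5, -1.0]
cov0 = [[0.04, 0.01],
        [0.01, 0.25]]

print(isotropic_cov(0.5, 2))  # [[0.25, 0.0], [0.0, 0.25]]
```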
Abstract This package provides a sampler using CMA-MAE as implemented in pyribs. CMA-MAE is a quality diversity algorithm that has demonstrated state-of-the-art performance in a variety of domains. Pyribs is a bare-bones Python library for quality diversity optimization algorithms. For a primer on CMA-MAE, quality diversity, and pyribs, we recommend referring to the series of pyribs tutorials.
For simplicity, this implementation provides a default instantiation of CMA-MAE with a GridArchive and EvolutionStrategyEmitter with improvement ranking, all wrapped up in a Scheduler.
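The mechanism that distinguishes CMA-MAE from its predecessor CMA-ME is the archive's soft acceptance threshold: each cell's threshold moves toward accepted objective values at a learning rate alpha instead of jumping straight to the best value seen. A minimal pure-Python sketch of that update rule (alpha and the values here are illustrative; in this package the archive itself is pyribs' GridArchive):

```python
def update_threshold(threshold: float, objective: float, alpha: float) -> float:
    """CMA-MAE threshold update for a solution that beats the cell's
    current threshold: t <- (1 - alpha) * t + alpha * f.
    alpha = 1.0 recovers CMA-ME behavior (threshold jumps to f);
    alpha = 0.0 keeps the threshold fixed."""
    return (1.0 - alpha) * threshold + alpha * objective

t = 0.0
for f in [1.0, 1.0, 1.0]:  # repeatedly rediscover the same quality in a cell
    if f > t:
        t = update_threshold(t, f, alpha=0.5)
print(t)  # 0.875 -- the threshold approaches 1.0 gradually: 0.5, 0.75, 0.875
```

This gradual update keeps cells "acceptable" longer, which smooths the optimization landscape the emitters see.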
Abstract The hill climbing algorithm is an optimization technique that iteratively improves a solution by evaluating neighboring solutions in search of a local maximum or minimum. Starting with an initial guess, the algorithm examines nearby “neighbor” solutions, moving to a better neighbor if one is found. This process continues until no improvement is possible, resulting in a locally optimal solution. Hill climbing is efficient and easy to implement but can get stuck in local optima, making it suitable for simple optimization landscapes or applications with limited time constraints.
Abstract The hill climbing algorithm is an optimization technique that iteratively improves a solution by evaluating neighboring solutions in search of a local maximum or minimum. Starting with an initial guess, the algorithm examines nearby “neighbor” solutions, moving to a better neighbor if one is found. This process continues until no improvement can be made locally, at which point the algorithm may restart from a new random position.
This implementation focuses on discrete optimization problems, supporting integer and categorical parameters only.
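As a concrete illustration of the loop described above, here is a minimal, self-contained hill climber for a single integer parameter (independent of this package's actual internals; the function names are illustrative):

```python
import random

def hill_climb(objective, start: int, low: int, high: int, seed: int = 0) -> int:
    """Maximize `objective` over integers in [low, high] by repeatedly
    moving to a better neighbor (current - 1 or current + 1)."""
    rng = random.Random(seed)
    current = start
    while True:
        neighbors = [n for n in (current - 1, current + 1) if low <= n <= high]
        rng.shuffle(neighbors)  # break ties between equally good moves
        better = [n for n in neighbors if objective(n) > objective(current)]
        if not better:
            return current  # no improving neighbor: a local optimum
        current = max(better, key=objective)

# A unimodal objective, so the climber reaches the true maximum at x = 7.
best = hill_climb(lambda x: -(x - 7) ** 2, start=0, low=-20, high=20)
print(best)  # 7
```

On a multimodal objective the same loop stops at whichever local optimum is reachable from `start`, which is exactly why the restart variant re-seeds from a new random position.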
Class or Function Names MeanVarianceAnalysisScalarizationSimulatorSampler Installation $ pip install scipy Example Please see example.ipynb
Reference Iwazaki, Shogo, Yu Inatsu, and Ichiro Takeuchi. “Mean-variance analysis in Bayesian optimization under uncertainty.” International Conference on Artificial Intelligence and Statistics. PMLR, 2021.
Bibtex @inproceedings{iwazaki2021mean, title={Mean-variance analysis in Bayesian optimization under uncertainty}, author={Iwazaki, Shogo and Inatsu, Yu and Takeuchi, Ichiro}, booktitle={International Conference on Artificial Intelligence and Statistics}, pages={973--981}, year={2021}, organization={PMLR} }
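The core idea the referenced paper builds on can be illustrated with a simple mean-variance scalarization: rather than optimizing noisy observations directly, one optimizes a trade-off between the mean and the variance of the outputs. A hedged sketch (the weight beta and the function names are illustrative, not this package's API):

```python
from statistics import mean, pvariance

def mean_variance_score(samples: list[float], beta: float = 1.0) -> float:
    """Scalarize repeated noisy evaluations of one candidate:
    reward a high mean, penalize high variance (a risk-averse
    objective to be maximized)."""
    return mean(samples) - beta * pvariance(samples)

stable = [1.0, 1.1, 0.9, 1.0]    # mean 1.0, low variance
volatile = [2.0, 0.0, 2.0, 0.0]  # mean 1.0, high variance
print(mean_variance_score(stable) > mean_variance_score(volatile))  # True
```

Under this scalarization the stable candidate wins even though both have the same mean, which is the risk-averse behavior mean-variance analysis formalizes.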
Abstract Sampler using the MOEA/D algorithm. MOEA/D stands for “Multi-Objective Evolutionary Algorithm based on Decomposition.”
This sampler is specialized for multi-objective optimization. The objective function is internally decomposed into multiple single-objective subproblems to perform optimization.
It may not work well with multi-threading. Check results carefully.
APIs MOEADSampler(*, population_size = 100, n_neighbors = None, scalar_aggregation_func = "tchebycheff", mutation = None, mutation_prob = None, crossover = None, crossover_prob = 0.9, seed = None) n_neighbors: The number of weight vectors in the neighborhood of each weight vector.
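The default scalar_aggregation_func, the Tchebycheff approach, turns the multi-objective problem into subproblems of the form g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|, where w is a weight vector and z* the ideal point. A minimal pure-Python sketch (to be minimized per subproblem; not this sampler's internals):

```python
def tchebycheff(objectives: list[float], weights: list[float], ideal: list[float]) -> float:
    """Tchebycheff aggregation g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|.
    Each weight vector w defines one single-objective subproblem."""
    return max(w * abs(f - z) for f, w, z in zip(objectives, weights, ideal))

# Two candidates for a bi-objective minimization problem:
ideal = [0.0, 0.0]
weights = [0.5, 0.5]  # the weight vector of one subproblem
print(tchebycheff([1.0, 3.0], weights, ideal))  # 1.5
print(tchebycheff([2.0, 2.0], weights, ideal))  # 1.0 -> preferred by this subproblem
```

Different weight vectors prefer different trade-offs, and neighboring subproblems (controlled by n_neighbors) exchange solutions during the search.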
Abstract MoCmaSampler provides an implementation of the s-MO-CMA-ES algorithm. This algorithm extends (1+1)-CMA-ES to multi-objective optimization by introducing a selection strategy based on non-dominated sorting and contributing hypervolume (the S-metric). It inherits important invariance properties of CMA-ES: invariance against order-preserving transformations of the fitness function values, and against rotation and translation of the search space.
Class or Function Names MoCmaSampler(*, search_space: dict[str, BaseDistribution] | None = None, popsize: int | None = None, seed: int | None = None) search_space: A dictionary containing the search space that defines the parameter space.
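The selection strategy mentioned above rests on Pareto dominance. As a reference point, here is a minimal sketch of extracting the non-dominated front from a set of bi-objective (minimization) points (illustrative only, not the sampler's internals):

```python
def dominates(a: list[float], b: list[float]) -> bool:
    """a dominates b (minimization): a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points: list[list[float]]) -> list[list[float]]:
    """First front of non-dominated sorting: points dominated by nobody."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]]
print(non_dominated(pts))  # [[1.0, 4.0], [2.0, 2.0], [4.0, 1.0]]
```

In s-MO-CMA-ES, ties within a front are then broken by each point's hypervolume contribution, so individuals that cover otherwise-empty regions of the front are preferred.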
Class or Function Names plot_grid_archive_heatmap(study: optuna.Study, ax: plt.Axes, **kwargs)
study: Optuna study with a sampler that uses pyribs. This function will plot the result archive from the sampler’s scheduler. ax: Axes on which to plot the heatmap. If None, we retrieve the current axes. **kwargs: All remaining kwargs will be passed to grid_archive_heatmap. Installation $ pip install ribs[visualize] Example A minimal example would be the following:
import matplotlib.pyplot as plt
import optuna
import optunahub

module = optunahub.
Class or Function Names MABEpsilonGreedySampler Example mod = optunahub.load_module("samplers/mab_epsilon_greedy") sampler = mod.MABEpsilonGreedySampler() See example.py for more details.
Others This package provides a sampler based on the multi-armed bandit algorithm with epsilon-greedy arm selection.
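For reference, the epsilon-greedy rule underlying such a sampler: with probability epsilon pick a random arm (exploration), otherwise pick the arm with the best observed average reward (exploitation). A self-contained sketch (function and arm names are illustrative, not the package's implementation):

```python
import random

def epsilon_greedy_pick(rewards: dict[str, list[float]], epsilon: float,
                        rng: random.Random) -> str:
    """Choose an arm: explore uniformly with probability epsilon,
    otherwise exploit the arm with the highest empirical mean reward."""
    arms = list(rewards)
    if rng.random() < epsilon or all(not r for r in rewards.values()):
        return rng.choice(arms)  # explore (or no data collected yet)
    return max(arms, key=lambda a: sum(rewards[a]) / len(rewards[a]))

rng = random.Random(42)
rewards = {"arm_a": [0.1, 0.2], "arm_b": [0.8, 0.9]}
picks = [epsilon_greedy_pick(rewards, epsilon=0.1, rng=rng) for _ in range(100)]
print(picks.count("arm_b") > picks.count("arm_a"))  # True
```

Most picks exploit the better arm, while the occasional random pick keeps estimating the others.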