Differential evolution
smac.acquisition.maximizer.differential_evolution
DifferentialEvolution
DifferentialEvolution(
configspace: ConfigurationSpace,
acquisition_function: AbstractAcquisitionFunction | None = None,
max_iter: int = 1000,
challengers: int = 50000,
strategy: str = "best1bin",
polish: bool = True,
mutation: tuple[float, float] = (0.5, 1.0),
recombination: float = 0.7,
seed: int = 0,
)
Bases: AbstractAcquisitionMaximizer
Get candidate solutions via scipy's DifferentialEvolutionSolver.
According to the scipy 1.9.2 documentation:
'Finds the global minimum of a multivariate function. Differential Evolution is stochastic in nature (does not use gradient methods) to find the minimum, and can search large areas of candidate space, but often requires larger numbers of function evaluations than conventional gradient-based techniques. The algorithm is due to Storn and Price [1].'
[1] Storn, R. and Price, K., Differential Evolution - a Simple and Efficient Heuristic for Global Optimization over Continuous Spaces, Journal of Global Optimization, 1997, 11, 341-359.
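For orientation, the sketch below calls scipy's public differential_evolution function directly with the same defaults this class exposes (strategy, maxiter, mutation, recombination, polish). The objective function and bounds are illustrative assumptions standing in for the acquisition landscape, not part of SMAC.

from scipy.optimize import differential_evolution

def sphere(x):
    # Black-box objective: DE only evaluates function values, never gradients.
    return float((x ** 2).sum())

result = differential_evolution(
    sphere,
    bounds=[(-5.0, 5.0)] * 3,  # one (min, max) pair per dimension
    strategy="best1bin",
    maxiter=1000,
    mutation=(0.5, 1.0),
    recombination=0.7,
    polish=True,               # refine the best member with L-BFGS-B
    seed=0,
)
print(result.x, result.fun)    # best point found and its objective value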
Parameters
configspace : ConfigurationSpace
acquisition_function : AbstractAcquisitionFunction
challengers : int, defaults to 50000
    Number of challengers.
max_iter : int, defaults to 1000
    Maximum number of iterations that the DE will perform.
strategy : str, defaults to "best1bin"
    The strategy to use for the DE.
polish : bool, defaults to True
    Whether to polish the final solution using L-BFGS-B.
mutation : tuple[float, float], defaults to (0.5, 1.0)
    The mutation constant.
recombination : float, defaults to 0.7
    The recombination constant.
seed : int, defaults to 0
Source code in smac/acquisition/maximizer/differential_evolution.py
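A hedged construction sketch: the values mirror the defaults listed above, the one-float configuration space is purely illustrative, and EI stands in for any AbstractAcquisitionFunction (it must be updated with a fitted model before maximization).

from ConfigSpace import ConfigurationSpace
from smac.acquisition.function import EI
from smac.acquisition.maximizer.differential_evolution import DifferentialEvolution

cs = ConfigurationSpace({"x": (-5.0, 5.0)})  # illustrative one-float space

maximizer = DifferentialEvolution(
    configspace=cs,
    acquisition_function=EI(),  # assumed: updated with a fitted model before use
    max_iter=1000,
    challengers=50000,
    strategy="best1bin",
    polish=True,
    mutation=(0.5, 1.0),
    recombination=0.7,
    seed=0,
)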
acquisition_function (property, writable)
acquisition_function: AbstractAcquisitionFunction | None
The acquisition function used for maximization.
maximize
maximize(
previous_configs: list[Configuration],
n_points: int | None = None,
random_design: AbstractRandomDesign | None = None,
) -> Iterator[Configuration]
Maximize the acquisition function using _maximize, which is implemented by a subclass.
Parameters
previous_configs : list[Configuration]
    Previously evaluated configurations.
n_points : int | None, defaults to None
    Number of points to be sampled and number of configurations to be returned. If n_points
    is not specified, self._challengers is used. Semantics depend on the concrete implementation.
random_design : AbstractRandomDesign, defaults to None
    Part of the returned ChallengerList, so that random configurations can be interleaved
    according to a scheme defined by the random design. The method random_design.next_iteration()
    is called at the end of this function.
Returns
challengers : Iterator[Configuration]
    An iterable consisting of configurations.
Source code in smac/acquisition/maximizer/abstract_acquisition_maximizer.py
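A hedged usage sketch, assuming maximizer was built as above and its acquisition function has already been updated with a fitted model; the returned challenger list is evaluated lazily, so consume only as many configurations as needed.

challengers = maximizer.maximize(previous_configs=[], n_points=10)
for config in challengers:
    print(config)  # inspect the first suggested configuration
    break          # the iterator is lazy; stop once enough challengers are taken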
check_kwarg

check_kwarg(cls: type, kwarg_name: str) -> bool
Checks if a given class accepts a specific keyword argument in its __init__ method.
Parameters
cls : type
    The class to inspect.
kwarg_name : str
    The name of the keyword argument to check.
Returns
bool
    True if the class's __init__ method accepts the keyword argument, otherwise False.
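A minimal sketch of the check described above, assuming standard-library introspection via inspect.signature; the actual implementation in the source file may differ in details (for example, how it treats an __init__ that takes **kwargs).

import inspect

def check_kwarg(cls: type, kwarg_name: str) -> bool:
    # Look up the keyword argument by name in the __init__ signature.
    sig = inspect.signature(cls.__init__)
    return kwarg_name in sig.parameters

# Illustrative class, not part of SMAC.
class Example:
    def __init__(self, seed: int = 0) -> None:
        self.seed = seed

assert check_kwarg(Example, "seed") is True
assert check_kwarg(Example, "rng") is False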