Differential evolution

smac.acquisition.maximizer.differential_evolution #

DifferentialEvolution #

DifferentialEvolution(
    configspace: ConfigurationSpace,
    acquisition_function: (
        AbstractAcquisitionFunction | None
    ) = None,
    max_iter: int = 1000,
    challengers: int = 50000,
    strategy: str = "best1bin",
    polish: bool = True,
    mutation: tuple[float, float] = (0.5, 1.0),
    recombination: float = 0.7,
    seed: int = 0,
)

Bases: AbstractAcquisitionMaximizer

Get candidate solutions via scipy's DifferentialEvolutionSolver.

According to scipy 1.9.2 documentation:

'Finds the global minimum of a multivariate function. Differential Evolution is stochastic in nature (does not use gradient methods) to find the minimum, and can search large areas of candidate space, but often requires larger numbers of function evaluations than conventional gradient-based techniques. The algorithm is due to Storn and Price [1].'

[1] Storn, R and Price, K, Differential Evolution - a Simple and Efficient Heuristic for Global Optimization over Continuous Spaces, Journal of Global Optimization, 1997, 11, 341 - 359.
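For orientation, the sketch below calls scipy.optimize.differential_evolution directly with the same defaults this maximizer exposes (strategy, maxiter, mutation, recombination, polish). The sphere objective is purely illustrative and not part of SMAC.

from scipy.optimize import differential_evolution

def sphere(x):
    # Simple convex test function with its global minimum at the origin.
    return sum(xi**2 for xi in x)

result = differential_evolution(
    sphere,
    bounds=[(-5.0, 5.0), (-5.0, 5.0)],
    strategy="best1bin",
    maxiter=1000,
    mutation=(0.5, 1.0),
    recombination=0.7,
    polish=True,
    seed=0,
)
print(result.x, result.fun)  # approximately [0. 0.] and 0.0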

Parameters#

configspace : ConfigurationSpace
acquisition_function : AbstractAcquisitionFunction, defaults to None
challengers : int, defaults to 50000
    Number of challengers.
max_iter : int, defaults to 1000
    Maximum number of iterations that the DE will perform.
strategy : str, defaults to "best1bin"
    The strategy to use for the DE.
polish : bool, defaults to True
    Whether to polish the final solution using L-BFGS-B.
mutation : tuple[float, float], defaults to (0.5, 1.0)
    The mutation constant.
recombination : float, defaults to 0.7
    The recombination constant.
seed : int, defaults to 0

Source code in smac/acquisition/maximizer/differential_evolution.py
def __init__(
    self,
    configspace: ConfigurationSpace,
    acquisition_function: AbstractAcquisitionFunction | None = None,
    max_iter: int = 1000,
    challengers: int = 50000,
    strategy: str = "best1bin",
    polish: bool = True,
    mutation: tuple[float, float] = (0.5, 1.0),
    recombination: float = 0.7,
    seed: int = 0,
):
    super().__init__(configspace, acquisition_function, challengers, seed)
    self.max_iter = max_iter
    self.strategy = strategy
    self.polish = polish
    self.mutation = mutation
    self.recombination = recombination
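A minimal construction sketch follows; the hyperparameter name and bounds are illustrative, and the acquisition function is left unset here since it can be attached later through the writable acquisition_function property documented below.

from ConfigSpace import ConfigurationSpace
from ConfigSpace.hyperparameters import UniformFloatHyperparameter

from smac.acquisition.maximizer.differential_evolution import DifferentialEvolution

# Illustrative one-dimensional search space.
cs = ConfigurationSpace(seed=0)
cs.add_hyperparameter(UniformFloatHyperparameter("x", lower=-5.0, upper=5.0))

maximizer = DifferentialEvolution(
    configspace=cs,
    max_iter=500,
    challengers=10000,
    strategy="best1bin",
    polish=True,
    mutation=(0.5, 1.0),
    recombination=0.7,
    seed=0,
)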

acquisition_function property writable #

acquisition_function: AbstractAcquisitionFunction | None

The acquisition function used for maximization.

meta property #

meta: dict[str, Any]

Return the meta-data of the created object.

maximize #

maximize(
    previous_configs: list[Configuration],
    n_points: int | None = None,
    random_design: AbstractRandomDesign | None = None,
) -> Iterator[Configuration]

Maximize acquisition function using _maximize, implemented by a subclass.

Parameters#

previous_configs : list[Configuration]
    Previously evaluated configurations.
n_points : int, defaults to None
    Number of points to be sampled & number of configurations to be returned. If n_points is not
    specified, self._challengers is used. Semantics depend on the concrete implementation.
random_design : AbstractRandomDesign, defaults to None
    Part of the returned ChallengerList such that we can interleave random configurations by a
    scheme defined by the random design. The method random_design.next_iteration() is called at
    the end of this function.

Returns#

challengers : Iterator[Configuration]
    An iterable consisting of configurations.

Source code in smac/acquisition/maximizer/abstract_acquisition_maximizer.py
def maximize(
    self,
    previous_configs: list[Configuration],
    n_points: int | None = None,
    random_design: AbstractRandomDesign | None = None,
) -> Iterator[Configuration]:
    """Maximize acquisition function using `_maximize`, implemented by a subclass.

    Parameters
    ----------
    previous_configs: list[Configuration]
        Previously evaluated configurations.
    n_points: int, defaults to None
        Number of points to be sampled & number of configurations to be returned. If `n_points` is not specified,
        `self._challengers` is used. Semantics depend on concrete implementation.
    random_design: AbstractRandomDesign, defaults to None
        Part of the returned ChallengerList such that we can interleave random configurations
        by a scheme defined by the random design. The method `random_design.next_iteration()`
        is called at the end of this function.

    Returns
    -------
    challengers : Iterator[Configuration]
        An iterable consisting of configurations.
    """
    if n_points is None:
        n_points = self._challengers

    def next_configs_by_acquisition_value() -> list[Configuration]:
        assert n_points is not None
        # _maximize returns a list of (acquisition value, configuration) tuples;
        # we only need the configuration, so we return the second element
        # of each tuple in the list
        return [t[1] for t in self._maximize(previous_configs, n_points)]

    challengers = ChallengerList(
        self._configspace,
        next_configs_by_acquisition_value,
        random_design,
    )

    if random_design is not None:
        random_design.next_iteration()

    return challengers
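As a usage sketch (hypothetical setup: maximizer is an acquisition maximizer whose acquisition function has already been set and updated with a fitted surrogate model, which this page does not cover), the returned ChallengerList can be consumed lazily:

from itertools import islice

# Hypothetical: `maximizer` is a ready-to-use acquisition maximizer instance.
challengers = maximizer.maximize(previous_configs=[], n_points=1000)

# The result is an iterator over configurations; pull only the first few.
for config in islice(challengers, 5):
    print(config)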

check_kwarg #

check_kwarg(cls: type, kwarg_name: str) -> bool

Checks if a given class accepts a specific keyword argument in its __init__ method.

Parameters#

cls (type): The class to inspect.
kwarg_name (str): The name of the keyword argument to check.

Returns#

bool: True if the class's __init__ method declares the keyword argument
      with a default value, otherwise False.
Source code in smac/acquisition/maximizer/differential_evolution.py
def check_kwarg(cls: type, kwarg_name: str) -> bool:
    """
    Checks if a given class accepts a specific keyword argument in its __init__ method.

    Parameters
    ----------
        cls (type): The class to inspect.
        kwarg_name (str): The name of the keyword argument to check.

    Returns
    -------
        bool: True if the class's __init__ method declares the keyword argument
              with a default value, otherwise False.
    """
    # Get the signature of the class's __init__ method
    init_signature = inspect.signature(cls.__init__)  # type: ignore[misc]

    # Check if kwarg_name is present in the signature as a parameter
    # that also carries a default value
    for param in init_signature.parameters.values():
        if param.name == kwarg_name and param.default != inspect.Parameter.empty:
            return True  # The kwarg exists and has a default value
    return False  # No parameter of that name with a default value was found
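A self-contained usage sketch, assuming check_kwarg as defined above is in scope; the Example class is illustrative and shows that the check passes only for parameters declared with a default value.

import inspect  # required by check_kwarg itself

class Example:
    def __init__(self, a, b=2):
        self.a, self.b = a, b

print(check_kwarg(Example, "b"))  # True: 'b' is declared with a default
print(check_kwarg(Example, "a"))  # False: 'a' has no default value
print(check_kwarg(Example, "c"))  # False: 'c' is not a parameter at all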