smac.optimizer.acquisition module

class smac.optimizer.acquisition.AbstractAcquisitionFunction(model: smac.epm.base_epm.AbstractEPM)[source]

Bases: object

Abstract base class for acquisition functions

Attributes
  • model
  • logger

Constructor

Parameters

model (AbstractEPM) – Models the objective function.

abstract _compute(X: numpy.ndarray) → numpy.ndarray[source]

Computes the acquisition value for a given point X. This function has to be overridden in a derived class.

Parameters

X (np.ndarray) – The input points where the acquisition function should be evaluated. The dimensionality of X is (N, D), with N as the number of points to evaluate at and D is the number of dimensions of one X.

Returns

Acquisition function values with respect to X

Return type

np.ndarray(N,1)

update(**kwargs: Any) → None[source]

Update the acquisition function attributes required for calculation.

This method will be called after fitting the model, but before maximizing the acquisition function. For example, EI uses it to update the current fmin.

The default implementation only updates the attributes of the acquisition function which are already present.

Parameters

kwargs – Keyword arguments mapping attribute names to new values; only attributes already present on the acquisition function are updated.
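
As an illustration of the subclassing pattern described above, the following sketch implements a minimal custom acquisition function by overriding _compute. It is a sketch only: it assumes the wrapped model exposes predict_marginalized_over_instances(X) returning predicted means and variances, and the class and variable names are illustrative rather than part of this API.

    import numpy as np

    from smac.optimizer.acquisition import AbstractAcquisitionFunction


    class NegativeMean(AbstractAcquisitionFunction):
        """Toy acquisition function that prefers points with a low predicted mean."""

        def _compute(self, X: np.ndarray) -> np.ndarray:
            # self.model is stored by the base class constructor.
            means, _ = self.model.predict_marginalized_over_instances(X)
            # The acquisition value is maximized, so negate the predicted mean
            # and return an array of shape (N, 1).
            return -means.reshape((-1, 1))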

class smac.optimizer.acquisition.EI(model: smac.epm.base_epm.AbstractEPM, par: float = 0.0)[source]

Bases: smac.optimizer.acquisition.AbstractAcquisitionFunction

Computes for a given x the expected improvement as acquisition value.

\(EI(X) := \mathbb{E}\left[ \max\{0, f(\mathbf{X^+}) - f_{t+1}(\mathbf{X}) - \xi \} \right]\), with \(f(X^+)\) as the best location.

Constructor

Parameters
  • model (AbstractEPM) –

    A model that implements at least
    • predict_marginalized_over_instances(X)

  • par (float, default=0.0) – Controls the balance between exploration and exploitation of the acquisition function.

_compute(X: numpy.ndarray) → numpy.ndarray[source]

Computes the EI value.

Parameters

X (np.ndarray(N, D)) – The input points where the acquisition function should be evaluated. The dimensionality of X is (N, D), with N as the number of points to evaluate at and D as the number of dimensions of one X.

Returns

Expected Improvement of X

Return type

np.ndarray(N,1)
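
A minimal usage sketch for EI. The stub model below only stands in for a fitted AbstractEPM so that the example is self-contained; the keyword name eta for passing the current best observed cost follows the fmin update mentioned above but is an assumption here, as is calling the documented _compute directly instead of going through an acquisition function maximizer.

    import numpy as np

    from smac.optimizer.acquisition import EI


    class _StubModel:
        """Stand-in for a fitted AbstractEPM; provides only the method EI needs."""

        def predict_marginalized_over_instances(self, X):
            means = np.sum(X, axis=1, keepdims=True)      # shape (N, 1)
            variances = np.full((X.shape[0], 1), 0.25)    # shape (N, 1)
            return means, variances


    model = _StubModel()
    ei = EI(model=model, par=0.0)
    ei.update(model=model, eta=1.0)    # eta: current fmin (assumed keyword name)

    X = np.random.rand(10, 3)          # 10 candidate points, 3 dimensions
    acq_values = ei._compute(X)        # shape (10, 1); larger is better
    best_candidate = X[np.argmax(acq_values)]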

class smac.optimizer.acquisition.EIPS(model: smac.epm.base_epm.AbstractEPM, par: float = 0.0)[source]

Bases: smac.optimizer.acquisition.EI

Computes for a given x the expected improvement per second as acquisition value. \(EIPS(X) := \frac{\mathbb{E}\left[\max\{0, f(\mathbf{X^+}) - f_{t+1}(\mathbf{X}) - \xi\}\right]}{\log(r(\mathbf{X}))}\), with \(f(X^+)\) as the best location and \(r(X)\) as the runtime.

Parameters
  • model (AbstractEPM) –

    A model that implements at least
    • predict_marginalized_over_instances(X), returning a tuple of predicted cost and running time

  • par (float, default=0.0) – Controls the balance between exploration and exploitation of the acquisition function.

_compute(X: numpy.ndarray) → numpy.ndarray[source]

Computes the EIPS value.

Parameters

X (np.ndarray(N, D)) – The input points where the acquisition function should be evaluated. The dimensionality of X is (N, D), with N as the number of points to evaluate at and D as the number of dimensions of one X.

Returns

Expected Improvement per Second of X

Return type

np.ndarray(N,1)
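
A worked sketch of the formula above, assuming per-point predictions with column 0 holding cost and column 1 holding runtime. This is a hand-rolled illustration of the EIPS idea (closed-form EI on the cost predictions divided by the log runtime term), not the SMAC implementation.

    import numpy as np
    from scipy.stats import norm

    m = np.array([[0.4, 2.0],
                  [0.7, 8.0]])      # predicted means: column 0 = cost, column 1 = runtime
    s = np.array([[0.2, 0.5],
                  [0.2, 0.5]])      # predicted standard deviations
    f_best = 0.5                    # best observed cost f(X^+)
    xi = 0.0                        # exploration parameter (par)

    z = (f_best - m[:, 0] - xi) / s[:, 0]
    ei = (f_best - m[:, 0] - xi) * norm.cdf(z) + s[:, 0] * norm.pdf(z)
    eips = ei / np.log(m[:, 1])     # expected improvement per second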

class smac.optimizer.acquisition.IntegratedAcquisitionFunction(model: smac.epm.base_epm.AbstractEPM, acquisition_function: smac.optimizer.acquisition.AbstractAcquisitionFunction, **kwargs: Any)[source]

Bases: smac.optimizer.acquisition.AbstractAcquisitionFunction

Marginalizes over model hyperparameters to compute the integrated acquisition function.

See “Practical Bayesian Optimization of Machine Learning Algorithms” by Jasper Snoek et al. (https://papers.nips.cc/paper/4522-practical-bayesian-optimization-of-machine-learning-algorithms.pdf) for further details.

Constructor

Parameters
  • model (AbstractEPM) – The model needs to implement an additional attribute models which contains the different models to integrate over.

  • acquisition_function (AbstractAcquisitionFunction) – The acquisition function to be integrated, i.e. it is computed for each of the models and the results are combined.

  • kwargs – Additional keyword arguments

_compute(X: numpy.ndarray) → numpy.ndarray[source]

Computes the integrated acquisition value.

Parameters

X (np.ndarray(N, D)) – The input points where the acquisition function should be evaluated. The dimensionality of X is (N, D), with N as the number of points to evaluate at and D as the number of dimensions of one X.

Returns

Integrated acquisition values of X

Return type

np.ndarray(N,1)

update(**kwargs: Any) → None[source]

Update the acquisition function values.

This method will be called if the model is updated. For example, entropy search uses it to update its approximation of P(x=x_min), and EI uses it to update the current fmin.

This implementation creates an acquisition function object for each model to integrate over and sets the respective attributes for each acquisition function object.

Parameters
  • model (AbstractEPM) – The model needs to implement an additional attribute models which contains the different models to integrate over.

  • kwargs
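
A usage sketch for the integrated acquisition function, assuming a marginalized surrogate (for example a Gaussian process with MCMC-sampled hyperparameters) whose models attribute holds the individual sub-models. The stub classes and the eta keyword are assumptions made only to keep the example self-contained.

    import numpy as np

    from smac.optimizer.acquisition import EI, IntegratedAcquisitionFunction


    class _StubComponent:
        """One hyperparameter sample of the marginalized model (stand-in)."""

        def __init__(self, offset):
            self._offset = offset

        def predict_marginalized_over_instances(self, X):
            means = np.sum(X, axis=1, keepdims=True) + self._offset
            variances = np.full((X.shape[0], 1), 0.25)
            return means, variances


    class _StubMarginalizedModel(_StubComponent):
        """Stand-in for a model exposing the required `models` attribute."""

        def __init__(self):
            super().__init__(offset=0.0)
            self.models = [_StubComponent(-0.1), _StubComponent(0.0), _StubComponent(0.1)]


    model = _StubMarginalizedModel()
    acq = IntegratedAcquisitionFunction(model=model, acquisition_function=EI(model))
    acq.update(model=model, eta=1.0)   # eta: current fmin (assumed keyword name)

    values = acq._compute(np.random.rand(5, 3))   # one value per point, combined across sub-models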

class smac.optimizer.acquisition.LCB(model: smac.epm.base_epm.AbstractEPM, par: float = 1.0)[source]

Bases: smac.optimizer.acquisition.AbstractAcquisitionFunction

Computes the lower confidence bound for a given x as acquisition value.

\(LCB(X) = \mu(\mathbf{X}) - \sqrt{\beta_t}\,\sigma(\mathbf{X})\)

Returns -LCB(X) because the acquisition function optimizer maximizes the acquisition value.

Parameters
  • model (AbstractEPM) –

    A model that implements at least
    • predict_marginalized_over_instances(X)

  • par (float, default=1.0) – Controls the balance between exploration and exploitation of the acquisition function.

_compute(X: numpy.ndarray) → numpy.ndarray[source]

Computes the LCB value.

Parameters

X (np.ndarray(N, D)) – Points at which to evaluate the LCB. N is the number of points and D the dimension of the points.

Returns

Negative lower confidence bound of X

Return type

np.ndarray(N,1)
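
A worked sketch of the formula above, hand-rolled with an assumed value for \(\beta_t\) (SMAC derives it internally); the value handed to the optimizer is the negated bound, as noted above.

    import numpy as np

    mu = np.array([0.4, 0.7, 0.9])      # predicted means
    sigma = np.array([0.3, 0.1, 0.4])   # predicted standard deviations
    beta_t = 2.0                        # confidence multiplier (assumed value)

    lcb = mu - np.sqrt(beta_t) * sigma
    acq_value = -lcb                    # negated so that maximizing the acquisition minimizes the bound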

class smac.optimizer.acquisition.LogEI(model: smac.epm.base_epm.AbstractEPM, par: float = 0.0)[source]

Bases: smac.optimizer.acquisition.AbstractAcquisitionFunction

Computes for a given x the logarithmic expected improvement as acquisition value.

Parameters
  • model (AbstractEPM) –

    A model that implements at least
    • predict_marginalized_over_instances(X)

  • par (float, default=0.0) – Controls the balance between exploration and exploitation of the acquisition function.

_compute(X: numpy.ndarray) → numpy.ndarray[source]

Computes the LogEI value.

Parameters

X (np.ndarray(N, D)) – The input points where the acquisition function should be evaluated. The dimensionality of X is (N, D), with N as the number of points to evaluate at and D as the number of dimensions of one X.

Returns

Logarithmic Expected Improvement of X

Return type

np.ndarray(N,1)
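
The page gives no closed form for LogEI; the sketch below shows one common formulation as an assumption, namely expected improvement computed from a model that predicts the mean and variance of the log cost (a log-normal predictive distribution). It is an illustration, not quoted from SMAC.

    import numpy as np
    from scipy.stats import norm

    mu_log = np.array([-0.2, 0.1])      # predicted means of the log cost
    sigma_log = np.array([0.3, 0.3])    # predicted standard deviations of the log cost
    log_f_best = np.log(0.9)            # log of the best observed cost f(X^+)

    v = (log_f_best - mu_log) / sigma_log
    acq_value = (np.exp(log_f_best) * norm.cdf(v)
                 - np.exp(mu_log + sigma_log ** 2 / 2) * norm.cdf(v - sigma_log))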

class smac.optimizer.acquisition.PI(model: smac.epm.base_epm.AbstractEPM, par: float = 0.0)[source]

Bases: smac.optimizer.acquisition.AbstractAcquisitionFunction

Computes the probability of improvement for a given x over the best so far value as acquisition value.

\(PI(X) := P(f_{t+1}(\mathbf{X}) \leq f(\mathbf{X^+})) = \Phi\left(\frac{f(\mathbf{X^+}) - \mu(\mathbf{X})}{\sigma(\mathbf{X})}\right)\), with \(f(X^+)\) as the best location and \(\Phi\) the CDF of the standard normal distribution.

Parameters
  • model (AbstractEPM) –

    A model that implements at least
    • predict_marginalized_over_instances(X)

  • par (float, default=0.0) – Controls the balance between exploration and exploitation of the acquisition function.

_compute(X: numpy.ndarray) → numpy.ndarray[source]

Computes the PI value.

Parameters

X (np.ndarray(N, D)) – Points at which to evaluate PI. N is the number of points and D the dimension of the points.

Returns

Probability of Improvement of X

Return type

np.ndarray(N,1)
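
A worked sketch of the formula above (a hand-rolled illustration, not the SMAC implementation): for candidate means and standard deviations, the probability of improving on the best observed value follows directly from the standard normal CDF.

    import numpy as np
    from scipy.stats import norm

    mu = np.array([0.4, 0.7])            # predicted means
    sigma = np.array([0.3, 0.1])         # predicted standard deviations
    f_best = 0.5                         # best observed value f(X^+)

    pi = norm.cdf((f_best - mu) / sigma)  # probability that each point improves on f_best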