Ei

Module: neps.optimizers.bayesian_optimization.acquisition_functions.ei

ComprehensiveExpectedImprovement
```python
ComprehensiveExpectedImprovement(
    augmented_ei: bool = False,
    xi: float = 0.0,
    in_fill: str = "best",
    log_ei: bool = False,
    optimize_on_max_fidelity: bool = True,
)
```
Bases: BaseAcquisition
- The input x2 is a networkx graph rather than a vectorial input.
- The search space (a collection of x1_graphs) is discrete, so gradient-based optimisation is not possible. Instead, the EI is computed at all candidate points and the best position is selected empirically during optimisation.
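Because the candidate set is discrete, acquisition optimisation reduces to evaluating EI at every candidate and taking the argmax. The following is a minimal, self-contained sketch of that loop using the standard closed-form EI for minimisation; the function name and the posterior mean/std values are illustrative, not part of the NePS API:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, mu_star, xi=0.0):
    """Closed-form EI (minimisation form): expected improvement of the
    GP posterior at each candidate over the incumbent value mu_star."""
    sigma = np.maximum(sigma, 1e-9)  # guard against zero posterior variance
    z = (mu_star - mu - xi) / sigma
    return (mu_star - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Illustrative posterior mean/std at three discrete candidates
mu = np.array([0.5, 0.2, 0.8])
sigma = np.array([0.1, 0.3, 0.05])

# Discrete "optimisation": evaluate EI everywhere, pick the best index
best = int(expected_improvement(mu, sigma, mu_star=0.4).argmax())
```

Here candidate 1 wins: its mean already improves on the incumbent (0.2 < 0.4) and its large uncertainty adds further expected gain.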
| PARAMETER | DESCRIPTION |
| --- | --- |
| `augmented_ei` | Whether to use the Augmented EI heuristic modification of the standard expected improvement algorithm, following Huang (2006). TYPE: `bool` DEFAULT: `False` |
| `xi` | Manual exploration-exploitation trade-off parameter. TYPE: `float` DEFAULT: `0.0` |
| `in_fill` | Criterion used for in-fill when determining mu_star: `'best'` means the empirical best observation so far (but can be susceptible to noise), while `'posterior'` means the best posterior GP mean encountered so far and is recommended for optimising noisier functions. TYPE: `str` DEFAULT: `"best"` |
| `log_ei` | If true, use log-EI; otherwise the usual EI. TYPE: `bool` DEFAULT: `False` |
Source code in neps/optimizers/bayesian_optimization/acquisition_functions/ei.py
eval

Return the negative expected improvement at the query point x2.
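Returning the *negative* EI turns the acquisition maximisation into a minimisation problem, so the value can be fed directly to a generic minimiser or a simple argmin over candidates. A hedged sketch of this convention, including the `log_ei` variant (the function name and signature are illustrative, not the NePS implementation):

```python
import numpy as np
from scipy.stats import norm

def neg_ei(mu, sigma, mu_star, xi=0.0, log_ei=False):
    """Negative (log-)EI at a single query point.

    Minimising this value is equivalent to maximising EI. The log
    variant here is a naive log of EI, clipped to avoid log(0);
    production implementations use a numerically stable reformulation.
    """
    sigma = max(sigma, 1e-9)
    z = (mu_star - mu - xi) / sigma
    ei = (mu_star - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)
    if log_ei:
        return -float(np.log(max(ei, 1e-300)))
    return -float(ei)
```

A point whose posterior promises improvement yields a negative value (good for a minimiser); with `log_ei=True` the same point yields `-log(EI)`, which compresses the dynamic range when improvements become vanishingly small late in the search.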