smac.optimizer.configuration_chooser.boing_chooser¶
Functions
subspace_extraction — Extract a subspace that contains at least num_min points but no more than num_max points
Classes
BOinGChooser — Interface to train the EPM and generate next configurations with both global and local models.
- class smac.optimizer.configuration_chooser.boing_chooser.BOinGChooser(scenario, stats, runhistory, runhistory2epm, model, acq_optimizer, acquisition_func, rng, restore_incumbent=None, random_configuration_chooser=<smac.optimizer.configuration_chooser.random_chooser.ChooserNoCoolDown object>, predict_x_best=True, min_samples_model=1, model_local=<class 'smac.epm.gaussian_process.augmented.GloballyAugmentedLocalGaussianProcess'>, acquisition_func_local=<class 'smac.optimizer.acquisition.EI'>, model_local_kwargs=None, acquisition_func_local_kwargs=None, acq_optimizer_local=None, acq_optimizer_local_kwargs=None, max_configs_local_fracs=0.5, min_configs_local=None, do_switching=False, turbo_kwargs=None)[source]¶
Bases:
smac.optimizer.configuration_chooser.epm_chooser.EPMChooser
Interface to train the EPM and generate next configurations with both global and local models.
- Parameters
runhistory2epm (RunHistory2EPM4CostWithRaw,) – a transformer that converts a runhistory into vectors. Unlike the rh2epm used in the vanilla EPMChooser, this rh2epm object also needs to provide the raw cost values for the optimizer in its different stages
model (smac.epm.rf_with_instances.RandomForestWithInstances) – empirical performance model (right now, we support only RandomForestWithInstances) as a global model
acq_optimizer (smac.optimizer.ei_optimization.AcquisitionFunctionMaximizer) – Optimizer of acquisition function of global models
model_local (BaseEPM,) – local empirical performance model, used in subspace
model_local_kwargs (Optional[Dict] = None,) – parameters for initializing a local model
acquisition_func_local (AbstractAcquisitionFunction,) – local acquisition function, used in subspace
acquisition_func_local_kwargs (Optional[Dict] = None,) – parameters for initializing the local acquisition function
acq_optimizer_local (Optional[AcquisitionFunctionMaximizer] = None,) – Optimizer of acquisition function of local models
acq_optimizer_local_kwargs (Optional[Dict] = None,) – parameters for the optimizer of the acquisition function of local models
max_configs_local_fracs (float) – the maximal fraction of samples to be included in the subspace. If the number of samples in the subspace is greater than both this fraction of all samples and min_configs_local, the subspace will be cropped to fit the requirement
min_configs_local (int,) – Minimum number of samples included in the inner loop model
do_switching (bool) – whether to switch between TuRBO and BOinG, or to run a pure BOinG search
turbo_kwargs (Optional[Dict] = None) – parameters for building the TuRBO optimizer
- choose_next(incumbent_value=None)[source]¶
- Choose the next candidate solution with Bayesian optimization, using either the TuRBO optimizer or BOinG to suggest the next configuration.
If we switch the local model between TuRBO and BOinG, we gradually increase the probability of switching to the other optimizer whenever we fail to make further progress. (If TuRBO finds a new incumbent, we switch to BOinG for further exploitation.)
- Parameters
incumbent_value (float) – Cost value of the incumbent configuration (required for the acquisition function). If not given, it will be inferred from the runhistory or predicted; if the runhistory is empty, a ValueError is raised.
- Return type
Iterator
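The switching rule described above can be illustrated with a minimal, self-contained sketch. The function name, the increment value, and the reset-to-zero behaviour below are illustrative assumptions, not SMAC's actual implementation: the point is only that the switch probability grows while no progress is made and resets once an optimizer finds a new incumbent.

```python
def update_switch_probability(p_switch, improved, increment=0.1):
    """Illustrative rule (hypothetical, not SMAC's code): grow the
    probability of switching optimizers after each stagnating
    iteration; reset it once a new incumbent is found."""
    if improved:
        return 0.0  # progress made: stay with the current optimizer
    return min(1.0, p_switch + increment)

# Hypothetical run: five stagnating iterations, then an improvement.
p = 0.0
for improved in [False, False, False, False, False, True]:
    p = update_switch_probability(p, improved)
print(p)  # 0.0 (reset by the improving iteration)
```

The cap at 1.0 guarantees that a long stagnation phase eventually forces a switch with certainty.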
- restart_TuRBOinG(X, Y, Y_raw, train_model=False)[source]¶
Restart a new TuRBO optimizer. The bounds of the TuRBO optimizer are determined by a random forest: we randomly sample 20 points, extract the subspaces that contain at least self.min_configs_local points, and select the subspace with the largest volume to construct the TuRBO optimizer.
- Parameters
X (np.ndarray (N, D)) – previously evaluated configurations
Y (np.ndarray (N,)) – performances of previously evaluated configurations (transformed by the rh2epm transformer)
Y_raw (np.ndarray (N,)) – performances of previously evaluated configurations (raw values, not transformed)
train_model (bool) – whether to retrain the model with the given X and Y
- Return type
None
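The "select the subspace with the largest volume" step can be sketched independently of SMAC. The helper below is a hypothetical illustration: given candidate subspaces as per-dimension (lower, upper) bound arrays, it returns the index of the largest one by volume.

```python
import numpy as np

def largest_volume_subspace(subspace_bounds):
    """Illustrative helper (not SMAC's code): pick the candidate
    subspace with the largest volume, where each candidate is an
    np.ndarray of shape (D, 2) holding per-dimension bounds."""
    volumes = [np.prod(b[:, 1] - b[:, 0]) for b in subspace_bounds]
    return int(np.argmax(volumes))

candidates = [
    np.array([[0.0, 0.5], [0.0, 0.5]]),  # volume 0.25
    np.array([[0.2, 0.9], [0.1, 0.8]]),  # volume ~0.49
]
print(largest_volume_subspace(candidates))  # 1
```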
- smac.optimizer.configuration_chooser.boing_chooser.subspace_extraction(X, challenger, model, num_min, num_max, bounds, cat_dims, cont_dims)[source]¶
Extract a subspace that contains at least num_min points but no more than num_max points
- Parameters
X (np.ndarray (N, D)) – points used to train the model
challenger (np.ndarray (1, D)) – the challenger where the subspace would grow
model (RandomForestWithInstances) – a random forest model
num_min (int) – minimal number of points to be included in the subspace
num_max (int) – maximal number of points to be included in the subspace
bounds (np.ndarray(D, 2)) – bounds of the entire space, D = D_cat + D_cont
cat_dims (np.ndarray (D_cat)) – categorical dimensions
cont_dims (np.ndarray (D_cont)) – continuous dimensions
- Return type
Tuple[np.ndarray, List[Tuple], np.ndarray]
- Returns
union_bounds_cont (np.ndarray(D_cont, 2),) – the continuous bounds of the subregion
union_bounds_cat (List[Tuple]) – the categorical bounds of the subregion
in_ss_dims (np.ndarray) – indices of the points that lie inside the subregion
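To make the contract of subspace_extraction concrete, here is a simplified numpy sketch. It is not the random-forest-based routine documented above: instead of growing the box along tree splits, it simply spans a box around the num_min points nearest to the challenger (all names below are illustrative), which reproduces the essential behaviour of returning bounds plus the indices of the points falling inside them.

```python
import numpy as np

def grow_subspace(X, challenger, num_min, bounds):
    """Illustrative sketch (not the RF-based subspace_extraction):
    span a hyper-box around the num_min points closest to the
    challenger, clipped to the bounds of the entire space."""
    # Distance of every point to the challenger (infinity norm)
    dist = np.max(np.abs(X - challenger), axis=1)
    chosen = X[np.argsort(dist)[:num_min]]
    # Box spanned by the chosen points, clipped to the global bounds
    lower = np.maximum(chosen.min(axis=0), bounds[:, 0])
    upper = np.minimum(chosen.max(axis=0), bounds[:, 1])
    # Indices of all points lying inside the box (may exceed num_min)
    in_ss = np.where((X >= lower).all(axis=1) & (X <= upper).all(axis=1))[0]
    return np.stack([lower, upper], axis=1), in_ss

rng = np.random.default_rng(0)
X = rng.random((50, 2))
sub_bounds, in_ss = grow_subspace(
    X, np.array([0.5, 0.5]), num_min=5,
    bounds=np.array([[0.0, 1.0], [0.0, 1.0]]),
)
```

Note that a box spanned by the nearest points can contain additional points, which is why the real routine also enforces the num_max cap by cropping the subspace.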