smac.facade.multi_fidelity_facade

Classes

MultiFidelityFacade(scenario, target_function, *)

This facade configures SMAC in a multi-fidelity setting.

Interfaces

class smac.facade.multi_fidelity_facade.MultiFidelityFacade(scenario, target_function, *, model=None, acquisition_function=None, acquisition_maximizer=None, initial_design=None, random_design=None, intensifier=None, multi_objective_algorithm=None, runhistory_encoder=None, logging_level=None, callbacks=[], overwrite=False)[source]

Bases: HyperparameterOptimizationFacade

This facade configures SMAC in a multi-fidelity setting. It combines the components as follows, exploiting fidelity information along the way:

  1. The initial design is a RandomInitialDesign.

  2. The intensification is Hyperband. The configurations from the initial design are presented as challengers and executed in the Hyperband fashion.

  3. The model is a RandomForest surrogate model. The data to train it is collected by SMBO._collect_data. Notably, the method searches through the runhistory and collects the data from the highest fidelity level that supports at least SMBO._min_samples_model configurations.

  4. The acquisition function is EI, presenting the value of a candidate configuration.

  5. The acquisition optimizer is LocalAndSortedRandomSearch. It optimizes the acquisition function to present the best configuration as challenger to the intensifier.

From step 2 on, the process works as follows: the intensifier runs the challenger in a Hyperband fashion against the existing configurations and their observed performances until the challenger does not survive a fidelity level. The intensifier can inquire about a known configuration on a yet unseen fidelity if necessary.

The loop over steps 2-5 continues until a termination criterion is reached.
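The Hyperband-style elimination that the intensifier applies to challengers can be sketched in plain Python. This is a simplified illustration of one successive-halving bracket under assumed names (successive_halving, evaluate), not SMAC's actual intensifier:

```python
# Simplified successive-halving sketch (illustration only, not SMAC's
# implementation): challengers are evaluated on increasing budgets and
# only the best 1/eta fraction survives to the next fidelity level.
def successive_halving(configs, evaluate, min_budget, max_budget, eta=3):
    """`evaluate(config, budget)` returns a cost; lower is better."""
    budget = min_budget
    survivors = list(configs)
    while survivors and budget <= max_budget:
        costs = {c: evaluate(c, budget) for c in survivors}
        n_keep = max(1, len(survivors) // eta)  # keep best 1/eta fraction
        survivors = sorted(survivors, key=costs.__getitem__)[:n_keep]
        budget *= eta  # promote survivors to the next fidelity level
    return survivors

# Toy run: nine integer "configurations", cost is the distance to 5.
best = successive_halving(
    [0, 2, 4, 5, 7, 9, 11, 13, 15], lambda c, b: abs(c - 5), 1, 9
)
# best == [5]
```

A full Hyperband run launches several such brackets with different trade-offs between the number of starting configurations and the minimum budget.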

Note

For intensification, the data acquisition and aggregation strategy in step 2 is changed: incumbents are updated by the mean performance over the intersection of instances that the challenger and incumbent have in common (abstract_intensifier._compare_configs). The model in step 3 is trained on all available instance performance values. The data points for a hyperparameter configuration are disambiguated by the instance features, or by an index as a replacement if no instance features are available.
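The incumbent update described in this note can be sketched as follows; compare_configs and its arguments are hypothetical names chosen for illustration, not SMAC's internal API:

```python
# Sketch of comparing a challenger against the incumbent by mean cost
# over the instances both have been evaluated on (hypothetical names,
# not SMAC's internal API).
def compare_configs(incumbent_costs, challenger_costs):
    """Each argument maps instance -> observed cost (lower is better)."""
    shared = incumbent_costs.keys() & challenger_costs.keys()
    if not shared:
        return False  # no common instances, so no decision is possible
    inc_mean = sum(incumbent_costs[i] for i in shared) / len(shared)
    cha_mean = sum(challenger_costs[i] for i in shared) / len(shared)
    return cha_mean < inc_mean

incumbent = {"inst1": 0.8, "inst2": 0.6, "inst3": 0.7}
challenger = {"inst1": 0.5, "inst2": 0.4}  # evaluated on fewer instances
challenger_wins = compare_configs(incumbent, challenger)  # -> True
```

Only the shared instances {"inst1", "inst2"} enter the comparison (means 0.7 vs. 0.45), so the challenger's missing evaluation on "inst3" does not distort the decision.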

static get_initial_design(scenario, *, n_configs=None, n_configs_per_hyperparamter=10, max_ratio=0.1, additional_configs=[])[source]

Returns a random initial design.

Parameters:
  • scenario (Scenario) –

  • n_configs (int | None, defaults to None) – Number of initial configurations (disables the argument n_configs_per_hyperparameter).

  • n_configs_per_hyperparameter (int, defaults to 10) – Number of initial configurations per hyperparameter. For example, if the configuration space covers five hyperparameters and n_configs_per_hyperparameter is set to 10, then 50 initial configurations will be sampled.

  • max_ratio (float, defaults to 0.1) – Use at most scenario.n_trials * max_ratio number of configurations in the initial design. Additional configurations are not affected by this parameter.

  • additional_configs (list[Configuration], defaults to []) – Adds additional configurations to the initial design.

Return type:

RandomInitialDesign
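How these parameters interact can be sketched numerically; n_initial_configs is a hypothetical helper that mirrors the parameter descriptions above, not SMAC's own sizing code:

```python
# Hedged sketch of the initial-design sizing rules described above
# (hypothetical helper, not SMAC's own code).
def n_initial_configs(n_hyperparameters, n_trials, *, n_configs=None,
                      n_configs_per_hyperparameter=10, max_ratio=0.1):
    if n_configs is not None:
        return n_configs  # an explicit count disables the per-hp argument
    n = n_configs_per_hyperparameter * n_hyperparameters
    cap = int(n_trials * max_ratio)  # at most n_trials * max_ratio configs
    return min(n, cap)

# Five hyperparameters request 5 * 10 = 50 configurations,
# but the design is capped at 10% of 200 trials.
n_initial_configs(5, 200)  # -> 20
```

Configurations passed via additional_configs are not counted against the cap, matching the note on max_ratio above.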

static get_intensifier(scenario, *, eta=3, min_challenger=1, n_seeds=1)[source]

Returns a Hyperband intensifier instance, which means that budgets are supported.

Parameters:
  • scenario (Scenario) –

  • eta (float, defaults to 3) – The “halving” factor after each iteration in a Successive Halving run.

  • min_challenger (int, defaults to 1) – Minimal number of challengers to be considered (even if time_bound is exhausted earlier).

  • n_seeds (int, defaults to 1) – The number of seeds to use if the target function is non-deterministic.

Return type:

Hyperband
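The role of eta can be illustrated by the geometric budget schedule it implies; budget_schedule is a hypothetical helper written for illustration, not part of SMAC's API:

```python
# Sketch of the budget schedule implied by eta: each stage of a
# Successive Halving run multiplies the budget by eta (hypothetical
# helper, not part of SMAC's API).
def budget_schedule(min_budget, max_budget, eta=3):
    budgets = []
    budget = max_budget
    while budget >= min_budget:  # walk down from the maximum budget
        budgets.append(budget)
        budget /= eta
    return sorted(budgets)

budget_schedule(1, 27)  # -> [1.0, 3.0, 9.0, 27]
```

With eta=3, each fidelity level triples the budget while (in the intensifier) roughly two thirds of the configurations are discarded.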