Synthetic Function

An example of applying SMAC to optimize a synthetic function (2D Rosenbrock function).

We use the black-box facade because it is designed for black-box function optimization. The black-box facade uses a Gaussian Process as its surrogate model. The facade works best on a purely numerical hyperparameter configuration space and is intended for small evaluation budgets (up to about 1000 evaluations), since fitting a Gaussian Process becomes expensive as the number of evaluations grows.

[INFO][abstract_initial_design.py:133] Using 20 initial design and 0 additional configurations.
[INFO][intensifier.py:275] No incumbent provided in the first run. Sampling a new challenger...
[INFO][intensifier.py:446] First run and no incumbent provided. Challenger is assumed to be the incumbent.
[INFO][intensifier.py:566] Updated estimated cost of incumbent on 1 trials: 1102.7878
[INFO][intensifier.py:566] Updated estimated cost of incumbent on 2 trials: 1102.7878
[INFO][intensifier.py:566] Updated estimated cost of incumbent on 3 trials: 1102.7878
[INFO][abstract_intensifier.py:340] Challenger (5.4656) is better than incumbent (1102.7878) on 3 trials.
[INFO][abstract_intensifier.py:364] Changes in incumbent:
[INFO][abstract_intensifier.py:367] --- x0: -0.9968221839517355 -> 0.03135360777378082
[INFO][abstract_intensifier.py:367] --- x1: 4.30847043171525 -> -0.21179260686039925
[INFO][abstract_intensifier.py:340] Challenger (4.9344) is better than incumbent (5.4656) on 3 trials.
[INFO][abstract_intensifier.py:364] Changes in incumbent:
[INFO][abstract_intensifier.py:367] --- x0: 0.03135360777378082 -> 3.1521441831066923
[INFO][abstract_intensifier.py:367] --- x1: -0.21179260686039925 -> 9.991027821936083
[INFO][abstract_intensifier.py:340] Challenger (3.1692) is better than incumbent (4.9344) on 3 trials.
[INFO][abstract_intensifier.py:364] Changes in incumbent:
[INFO][abstract_intensifier.py:367] --- x0: 3.1521441831066923 -> 1.4106696064172155
[INFO][abstract_intensifier.py:367] --- x1: 9.991027821936083 -> 1.8167666169669126
[INFO][abstract_intensifier.py:340] Challenger (2.144) is better than incumbent (3.1692) on 3 trials.
[INFO][abstract_intensifier.py:364] Changes in incumbent:
[INFO][abstract_intensifier.py:367] --- x0: 1.4106696064172155 -> 1.403706808083208
[INFO][abstract_intensifier.py:367] --- x1: 1.8167666169669126 -> 1.82964389327294
[INFO][abstract_intensifier.py:340] Challenger (1.5901) is better than incumbent (2.144) on 3 trials.
[INFO][abstract_intensifier.py:364] Changes in incumbent:
[INFO][abstract_intensifier.py:367] --- x0: 1.403706808083208 -> 1.395035485965658
[INFO][abstract_intensifier.py:367] --- x1: 1.82964389327294 -> 1.8263728258094574
[INFO][abstract_intensifier.py:340] Challenger (0.8351) is better than incumbent (1.5901) on 3 trials.
[INFO][abstract_intensifier.py:364] Changes in incumbent:
[INFO][abstract_intensifier.py:367] --- x0: 1.395035485965658 -> 1.7031544591929872
[INFO][abstract_intensifier.py:367] --- x1: 1.8263728258094574 -> 2.842364243857716
[INFO][base_smbo.py:260] Configuration budget is exhausted.
[INFO][abstract_facade.py:325] Final Incumbent: {'x0': 1.7031544591929872, 'x1': 2.842364243857716}
[INFO][abstract_facade.py:326] Estimated cost: 0.8351420167215506
Default cost: 16916.0
Incumbent cost: 0.8351420167215506
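The logged default cost of 16916.0 can be verified by hand: it is simply the Rosenbrock function evaluated at the default configuration (x0=-3, x1=-4), while the global minimum at (1, 1) has a cost of zero. A minimal standalone sketch (the helper name rosenbrock_2d is ours, not part of the example):

```python
def rosenbrock_2d(x0: float, x1: float) -> float:
    """2D Rosenbrock function, matching the cost computed in ``train`` below."""
    return 100.0 * (x1 - x0**2.0) ** 2.0 + (1 - x0) ** 2.0

# Default configuration: 100 * (-4 - 9)^2 + (1 - (-3))^2 = 16900 + 16
print(rosenbrock_2d(-3, -4))  # 16916.0 -- the default cost from the log
# Global minimum at x_i = 1:
print(rosenbrock_2d(1, 1))    # 0.0
```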

from ConfigSpace import Configuration, ConfigurationSpace, Float

from smac import BlackBoxFacade, Scenario

__copyright__ = "Copyright 2021, AutoML.org Freiburg-Hannover"
__license__ = "3-clause BSD"


class Rosenbrock2D:
    @property
    def configspace(self) -> ConfigurationSpace:
        cs = ConfigurationSpace(seed=0)
        x0 = Float("x0", (-5, 10), default=-3)
        x1 = Float("x1", (-5, 10), default=-4)
        cs.add_hyperparameters([x0, x1])

        return cs

    def train(self, config: Configuration, seed: int = 0) -> float:
        """The 2-dimensional Rosenbrock function as a toy model.
        The Rosenbrock function is well-known in the optimization community and
        often serves as a toy problem. It can be defined for arbitrary
        dimensions. The minimum is always at x_i = 1 with a function value of
        zero. All input parameters are continuous. The search domain for
        all x's is the interval [-5, 10].
        """
        x1 = config["x0"]
        x2 = config["x1"]

        cost = 100.0 * (x2 - x1**2.0) ** 2.0 + (1 - x1) ** 2.0
        return cost


if __name__ == "__main__":
    model = Rosenbrock2D()

    # Scenario object specifying the optimization "environment"
    scenario = Scenario(model.configspace, name="synthetic_function", n_trials=300)

    # Now we use SMAC to find the best hyperparameters
    smac = BlackBoxFacade(
        scenario,
        model.train,  # We pass the target function here
        overwrite=True,  # Overwrites any previous results that are inconsistent with the meta-data.
    )

    incumbent = smac.optimize()

    # Get cost of default configuration
    default_cost = smac.validate(model.configspace.get_default_configuration())
    print(f"Default cost: {default_cost}")

    # Let's calculate the cost of the incumbent
    incumbent_cost = smac.validate(incumbent)
    print(f"Incumbent cost: {incumbent_cost}")

Total running time of the script: ( 0 minutes 3.632 seconds)