Ask-and-Tell

This example shows how to use the Ask-and-Tell interface: you ask SMAC for the next configuration to evaluate, run the evaluation yourself, and tell SMAC the observed result.
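A minimal sketch of that loop, using the same smac facade, model, and TrialValue class as the full script below:

info = smac.ask()                                # TrialInfo: the next configuration (and seed) to evaluate
cost = model.train(info.config, seed=info.seed)  # run the evaluation ourselves
smac.tell(info, TrialValue(cost=cost))           # report the observed cost back to the optimizer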

Out:

[INFO][abstract_initial_design.py:134] Using 20 initial design configurations and 0 additional configurations.
[INFO][abstract_intensifier.py:493] Added config 98ccec as new incumbent because there are no incumbents yet.
[INFO][abstract_intensifier.py:565] Added config 94dccd and rejected config 98ccec as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: -0.9968221839517355 -> 0.03135360777378082
[INFO][configspace.py:175] --- x1: 4.30847043171525 -> -0.21179260686039925
[INFO][abstract_intensifier.py:565] Added config 94b2fe and rejected config 94dccd as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: 0.03135360777378082 -> 1.8422549832482282
[INFO][configspace.py:175] --- x1: -0.21179260686039925 -> 3.526509233029728
[INFO][abstract_intensifier.py:565] Added config 026b73 and rejected config 94b2fe as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: 1.8422549832482282 -> 0.10203668300079549
[INFO][configspace.py:175] --- x1: 3.526509233029728 -> 0.1151862278387501
[INFO][abstract_intensifier.py:565] Added config 5db1de and rejected config 026b73 as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: 0.10203668300079549 -> 0.10138875906166067
[INFO][configspace.py:175] --- x1: 0.1151862278387501 -> 0.10325090860241648
[INFO][abstract_intensifier.py:565] Added config c89070 and rejected config 5db1de as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: 0.10138875906166067 -> 0.08653325483166885
[INFO][configspace.py:175] --- x1: 0.10325090860241648 -> 0.08549815086526724
[INFO][smbo.py:298] Finished 50 trials.
[INFO][abstract_intensifier.py:565] Added config f3174e and rejected config c89070 as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: 0.08653325483166885 -> 0.05367820946404134
[INFO][configspace.py:175] --- x1: 0.08549815086526724 -> 0.07009655254860192
[INFO][abstract_intensifier.py:565] Added config 458538 and rejected config f3174e as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: 0.05367820946404134 -> 0.08292029094844988
[INFO][configspace.py:175] --- x1: 0.07009655254860192 -> 0.06495875484764024
[INFO][abstract_intensifier.py:565] Added config d95598 and rejected config 458538 as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: 0.08292029094844988 -> 0.08393076270018529
[INFO][configspace.py:175] --- x1: 0.06495875484764024 -> 0.034451435520161944
[INFO][abstract_intensifier.py:565] Added config 2b7962 and rejected config d95598 as incumbent because it is not better than the incumbents on 1 instances:
[INFO][configspace.py:175] --- x0: 0.08393076270018529 -> 0.07805927091208886
[INFO][configspace.py:175] --- x1: 0.034451435520161944 -> 0.0059357711510132916
[INFO][smbo.py:298] Finished 100 trials.
[INFO][smbo.py:306] Configuration budget is exhausted:
[INFO][smbo.py:307] --- Remaining wallclock time: inf
[INFO][smbo.py:308] --- Remaining cpu time: inf
[INFO][smbo.py:309] --- Remaining trials: 0
Default cost: 16916.0
Incumbent cost: 0.8499771879028607

The full example script:

from ConfigSpace import Configuration, ConfigurationSpace, Float

from smac import HyperparameterOptimizationFacade, Scenario
from smac.runhistory.dataclasses import TrialValue

__copyright__ = "Copyright 2021, AutoML.org Freiburg-Hannover"
__license__ = "3-clause BSD"


class Rosenbrock2D:
    @property
    def configspace(self) -> ConfigurationSpace:
        cs = ConfigurationSpace(seed=0)
        x0 = Float("x0", (-5, 10), default=-3)
        x1 = Float("x1", (-5, 10), default=-4)
        cs.add_hyperparameters([x0, x1])

        return cs

    def train(self, config: Configuration, seed: int = 0) -> float:
        """The 2-dimensional Rosenbrock function as a toy model.
        The Rosenbrock function is well known in the optimization community and
        often serves as a toy problem. It can be defined for arbitrary
        dimensions. The minimum is always at x_i = 1 with a function value of
        zero. All input parameters are continuous. The search domain for
        all x's is the interval [-5, 10].
        """
        x0 = config["x0"]
        x1 = config["x1"]

        cost = 100.0 * (x1 - x0**2.0) ** 2.0 + (1 - x0) ** 2.0
        return cost


if __name__ == "__main__":
    model = Rosenbrock2D()

    # Scenario object: 100 trials on a non-deterministic target function, so SMAC assigns a seed to each trial
    scenario = Scenario(model.configspace, deterministic=False, n_trials=100)

    intensifier = HyperparameterOptimizationFacade.get_intensifier(
        scenario,
        max_config_calls=1,  # We basically use one seed per config only
    )

    # Now we use SMAC to find the best hyperparameters
    smac = HyperparameterOptimizationFacade(
        scenario,
        model.train,
        intensifier=intensifier,
        overwrite=True,
    )

    # We can ask SMAC which trials should be evaluated next
    for _ in range(10):
        info = smac.ask()
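        # Since the scenario is non-deterministic, SMAC attaches a seed to every suggested trial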
        assert info.seed is not None

        cost = model.train(info.config, seed=info.seed)
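        # Wrap the observed cost (and a dummy runtime of 0.5s) in a TrialValue before reporting it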
        value = TrialValue(cost=cost, time=0.5)

        smac.tell(info, value)

    # After the manual ask-and-tell loop, we can still hand the remaining budget back to SMAC
    # Note: SMAC will run the next 90 trials because 10 of the 100 trials have already been evaluated
    incumbent = smac.optimize()

    # Get cost of default configuration
    default_cost = smac.validate(model.configspace.get_default_configuration())
    print(f"Default cost: {default_cost}")

    # Let's calculate the cost of the incumbent
    incumbent_cost = smac.validate(incumbent)
    print(f"Incumbent cost: {incumbent_cost}")

Total running time of the script: (0 minutes 3.657 seconds)