Synthetic Function with Few Hyperparameters

An example of applying SMAC to optimize a synthetic function (the 2D Rosenbrock function).

We use the SMAC4BB facade because it is designed for black-box function optimization. As its surrogate model, SMAC4BB uses either a single Gaussian Process or a set of Gaussian Processes whose hyperparameters are integrated over by Markov chain Monte Carlo. SMAC4BB works best on numerical hyperparameter configuration spaces and should not be applied to problems with large evaluation budgets (it is intended for up to about 1000 evaluations).
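To illustrate what such a facade does conceptually, here is a minimal, self-contained sketch of Bayesian optimization with a Gaussian Process surrogate and the Expected Improvement (EI) acquisition function on a hypothetical 1-D toy objective. This is not SMAC's implementation (SMAC additionally handles configuration spaces and, in the `gp_mcmc` variant, integrates the GP hyperparameters via MCMC); all names and constants below are made up for illustration:

```python
import math
import numpy as np

def toy_objective(x):
    # Hypothetical 1-D objective to minimize (stand-in for the black box).
    return (x - 2.0) ** 2

def rbf_kernel(a, b, length_scale=2.0):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    # Closed-form GP posterior (zero-mean prior on centered targets).
    y_mean = y_train.mean()
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ (y_train - y_mean) + y_mean
    var = 1.0 - np.sum(K_s * (K_inv @ K_s), axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for minimization: E[max(best - f, 0)] under the GP posterior.
    z = (best - mu) / sigma
    cdf = np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2.0))) for v in z])
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (best - mu) * cdf + sigma * pdf

rng = np.random.default_rng(42)
x_train = rng.uniform(-5.0, 10.0, size=3)   # initial design
y_train = toy_objective(x_train)
candidates = np.linspace(-5.0, 10.0, 301)

# SMBO loop: fit surrogate, maximize acquisition, evaluate, repeat.
for _ in range(10):
    mu, sigma = gp_posterior(x_train, y_train, candidates)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y_train.min()))]
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, toy_objective(x_next))

print("best value found:", y_train.min())
```

The loop structure (initial design, surrogate fit, acquisition maximization, intensification of the best candidate) mirrors what the log output below shows SMAC doing.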

Out:

INFO:smac.utils.io.cmd_reader.CMDReader:Output to smac3-output_2022-02-17_16:02:50_399797
Default Value: 16916.00
Optimizing! Depending on your machine, this might take a few minutes.
INFO:smac.facade.smac_bb_facade.SMAC4BB:Optimizing a deterministic scenario for quality without a tuner timeout - will make SMAC deterministic and only evaluate one configuration per iteration!
INFO:smac.initial_design.sobol_design.SobolDesign:Running initial design for 2 configurations
INFO:smac.facade.smac_bb_facade.SMAC4BB:<class 'smac.facade.smac_bb_facade.SMAC4BB'>
INFO:smac.optimizer.smbo.SMBO:Running initial design
INFO:smac.intensification.intensification.Intensifier:First run, no incumbent provided; challenger is assumed to be the incumbent
INFO:smac.intensification.intensification.Intensifier:First run, no incumbent provided; challenger is assumed to be the incumbent
INFO:smac.intensification.intensification.Intensifier:Updated estimated cost of incumbent on 1 runs: 5311.5619
INFO:smac.intensification.intensification.Intensifier:Challenger (4744.681) is better than incumbent (5311.5619) on 1 runs.
INFO:smac.intensification.intensification.Intensifier:Changes in incumbent:
INFO:smac.intensification.intensification.Intensifier:  x0 : -0.21980144549161196 -> 0.666692873028599
INFO:smac.intensification.intensification.Intensifier:  x1 : 7.335338136181235 -> 7.332563252406933
INFO:smac.intensification.intensification.Intensifier:Challenger (939.3255) is better than incumbent (4744.681) on 1 runs.
INFO:smac.intensification.intensification.Intensifier:Changes in incumbent:
INFO:smac.intensification.intensification.Intensifier:  x0 : 0.666692873028599 -> 2.263660637291685
INFO:smac.intensification.intensification.Intensifier:  x1 : 7.332563252406933 -> 8.186395068270798
INFO:smac.intensification.intensification.Intensifier:Challenger (920.0862) is better than incumbent (939.3255) on 1 runs.
INFO:smac.intensification.intensification.Intensifier:Changes in incumbent:
INFO:smac.intensification.intensification.Intensifier:  x0 : 2.263660637291685 -> 2.2708123182821804
INFO:smac.intensification.intensification.Intensifier:  x1 : 8.186395068270798 -> 8.187217627538942
INFO:smac.stats.stats.Stats:---------------------STATISTICS---------------------
INFO:smac.stats.stats.Stats:Incumbent changed: 3
INFO:smac.stats.stats.Stats:Submitted target algorithm runs: 10 / 10.0
INFO:smac.stats.stats.Stats:Finished target algorithm runs: 10 / 10.0
INFO:smac.stats.stats.Stats:Configurations: 10
INFO:smac.stats.stats.Stats:Used wallclock time: 2.59 / inf sec
INFO:smac.stats.stats.Stats:Used target algorithm runtime: 0.00 / inf sec
INFO:smac.stats.stats.Stats:----------------------------------------------------
INFO:smac.facade.smac_bb_facade.SMAC4BB:Final Incumbent: Configuration(values={
  'x0': 2.2708123182821804,
  'x1': 8.187217627538942,
})

INFO:smac.facade.smac_bb_facade.SMAC4BB:Estimated cost of incumbent: 920.0862
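As a quick sanity check (ours, not part of the example output), the logged final cost agrees with the Rosenbrock function evaluated at the logged incumbent values:

```python
# Rosenbrock value at the incumbent configuration logged above.
x1, x2 = 2.2708123182821804, 8.187217627538942
cost = 100.0 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2
print(round(cost, 4))  # ≈ 920.0862, matching "Estimated cost of incumbent"
```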

import logging
logging.basicConfig(level=logging.INFO)

import numpy as np
from ConfigSpace.hyperparameters import UniformFloatHyperparameter

# Import ConfigSpace and different types of parameters
from smac.configspace import ConfigurationSpace
from smac.facade.smac_bb_facade import SMAC4BB
from smac.optimizer.acquisition import EI

# Import SMAC-utilities
from smac.scenario.scenario import Scenario

__copyright__ = "Copyright 2021, AutoML.org Freiburg-Hannover"
__license__ = "3-clause BSD"


def rosenbrock_2d(x):
    """The 2-dimensional Rosenbrock function as a toy model.

    The Rosenbrock function is well known in the optimization community and
    often serves as a toy problem. It can be defined for arbitrary
    dimensions. The minimum is always at x_i = 1 with a function value of
    zero. All input parameters are continuous. The search domain for
    all x's is the interval [-5, 10].
    """

    x1 = x["x0"]
    x2 = x["x1"]

    val = 100. * (x2 - x1 ** 2.) ** 2. + (1 - x1) ** 2.
    return val


if __name__ == "__main__":
    # Build Configuration Space which defines all parameters and their ranges
    cs = ConfigurationSpace()
    x0 = UniformFloatHyperparameter("x0", -5, 10, default_value=-3)
    x1 = UniformFloatHyperparameter("x1", -5, 10, default_value=-4)
    cs.add_hyperparameters([x0, x1])

    # Scenario object
    scenario = Scenario({"run_obj": "quality",  # we optimize quality (alternatively runtime)
                         "runcount-limit": 10,  # max. number of function evaluations
                         "cs": cs,  # configuration space
                         "deterministic": True
                         })

    # Use 'gp' or 'gp_mcmc' here
    model_type = 'gp'

    # Example call of the target function with the default configuration
    def_value = rosenbrock_2d(cs.get_default_configuration())
    print("Default Value: %.2f" % def_value)

    # Optimize, using a SMAC-object
    print("Optimizing! Depending on your machine, this might take a few minutes.")
    smac = SMAC4BB(scenario=scenario,
                   model_type=model_type,
                   rng=np.random.RandomState(42),
                   acquisition_function=EI,  # or others like PI, LCB as acquisition functions
                   tae_runner=rosenbrock_2d)

    smac.optimize()
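Two quick standalone checks (independent of SMAC, assuming the same objective as in the script) confirm the numbers printed in the output above: the default configuration evaluates to 16916.00, and the true optimum lies at x0 = x1 = 1 with value 0, which the short 10-evaluation budget does not reach:

```python
def rosenbrock_2d(x):
    # Same 2-D Rosenbrock function as in the example script.
    x1, x2 = x["x0"], x["x1"]
    return 100.0 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2

# Value at the default configuration (x0=-3, x1=-4), as printed above.
print(rosenbrock_2d({"x0": -3.0, "x1": -4.0}))  # 16916.0

# Global minimum at x0 = x1 = 1.
print(rosenbrock_2d({"x0": 1.0, "x1": 1.0}))  # 0.0
```

In the real script, the incumbent configuration returned by `smac.optimize()` can be passed to `rosenbrock_2d` in the same dictionary-like way to re-evaluate its cost.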

Total running time of the script: ( 0 minutes 2.601 seconds)

Gallery generated by Sphinx-Gallery