Bayesian Optimization. A black-box optimization algorithm that trades off exploration and exploitation to find the minimum of its objective.
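The loop behind this definition can be sketched in a few lines. This is a toy illustration, not any library's implementation: it stands in for the usual Gaussian-process surrogate with a 1-nearest-neighbour predictor whose distance to the closest observation acts as a crude uncertainty estimate, and the acquisition balances exploitation (low predicted cost) against exploration (large distance).

```python
import random

def bayes_opt(f, bounds, n_init=5, n_iter=25, kappa=2.0, seed=0):
    """Toy Bayesian-optimization loop for minimization (1-NN surrogate)."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [rng.uniform(lo, hi) for _ in range(n_init)]  # initial design
    Y = [f(x) for x in X]
    for _ in range(n_iter):
        def acquisition(x):
            # Surrogate: predict with the nearest observed point; its
            # distance serves as a stand-in for predictive uncertainty.
            d, y = min((abs(x - xi), yi) for xi, yi in zip(X, Y))
            return y - kappa * d  # low mean (exploit) or far away (explore)
        candidates = [rng.uniform(lo, hi) for _ in range(200)]
        x_next = min(candidates, key=acquisition)  # optimize the acquisition
        X.append(x_next)
        Y.append(f(x_next))  # evaluate the expensive objective
    y_best, x_best = min(zip(Y, X))
    return x_best, y_best

x_best, y_best = bayes_opt(lambda x: (x - 0.3) ** 2, (0.0, 1.0))
```

Real implementations replace both the surrogate and the candidate search with far stronger components; the control flow, however, is exactly this fit-propose-evaluate cycle.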
Sequential Model-Based Algorithm Configuration.
The general framework combining Bayesian optimization with an aggressive racing (intensification) mechanism to configure algorithms.
Random Online Adaptive Racing. A simple model-free instantiation of the general SMBO framework. It selects configurations uniformly at random and iteratively compares them against the current incumbent using the intensification mechanism. See chapter 3.2 of the extended SMAC paper for details.
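A minimal sketch of this idea, with the racing step heavily simplified to a fixed number of runs per configuration (real intensification adds runs incrementally). The `tae` and `sample_config` callables are hypothetical stand-ins, not a real API:

```python
import random

def roar(tae, sample_config, n_challengers=30, runs_per_config=3, seed=0):
    """Minimal ROAR loop: random challengers raced against the incumbent."""
    rng = random.Random(seed)
    incumbent = sample_config(rng)
    inc_cost = sum(tae(incumbent, rng) for _ in range(runs_per_config)) / runs_per_config
    for _ in range(n_challengers):
        challenger = sample_config(rng)  # uniform random, no surrogate model
        cost = sum(tae(challenger, rng) for _ in range(runs_per_config)) / runs_per_config
        if cost < inc_cost:  # challenger replaces the incumbent
            incumbent, inc_cost = challenger, cost
    return incumbent, inc_cost

# Hypothetical noisy target: best configuration is near 0.5.
tae = lambda cfg, rng: (cfg - 0.5) ** 2 + rng.gauss(0, 0.01)
incumbent, inc_cost = roar(tae, lambda rng: rng.uniform(0.0, 1.0))
```

Because configurations are drawn uniformly at random, ROAR serves as the model-free baseline against which model-based SMBO variants are compared.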
Black-Box. Refers to an algorithm being optimized whose internals are hidden: only its inputs and outputs are observable.
Multi-Fidelity. Refers to running an algorithm on multiple budgets (such as the number of epochs or subsets of the data), thereby estimating its performance early, before a full evaluation.
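Successive halving is one common way to exploit multiple fidelities. The following is a generic sketch, not a specific library routine: many configurations are evaluated on a small budget, the worse half is discarded, and the budget grows for the survivors.

```python
def successive_halving(configs, evaluate, min_budget=1, eta=2, rounds=3):
    """Evaluate many configs cheaply, keep the best 1/eta, grow the budget."""
    budget = min_budget
    survivors = list(configs)
    for _ in range(rounds):
        # Rank survivors by their cost at the current (partial) budget.
        scored = sorted(survivors, key=lambda c: evaluate(c, budget))
        survivors = scored[: max(1, len(scored) // eta)]
        budget *= eta  # spend more on the remaining candidates
    return survivors[0]

# Hypothetical objective: best config is near 0.42, and more budget
# reduces the evaluation error term.
evaluate = lambda c, b: abs(c - 0.42) + 1.0 / b
best = successive_halving([i / 10 for i in range(11)], evaluate)
```

The key assumption is that rankings at low fidelity are informative about rankings at full fidelity; when that holds, most of the budget is spent only on promising configurations.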
Target Algorithm Evaluator. Your model or algorithm, which returns a cost for a given configuration, budget, and instance.
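In code, a TAE is just a function with that signature. The example below is a hypothetical toy, not a real training routine: the returned "validation loss" depends on a configuration value, shrinks with budget, and carries bounded per-instance noise.

```python
import random

def target_algorithm(config, budget, instance, seed=0):
    """Hypothetical TAE: maps (config, budget, instance) to a cost."""
    rng = random.Random(hash((instance, seed)))  # instance-dependent noise
    lr = config["learning_rate"]
    # Pretend validation loss: sensitive to the learning rate,
    # shrinking with budget, with small bounded noise per instance.
    return abs(lr - 0.01) * 10 + 1.0 / budget + rng.uniform(0, 0.01)

cost = target_algorithm({"learning_rate": 0.01}, budget=10, instance="fold-0")
```

The optimizer treats this function as a black box: it only ever sees the returned cost, never the internals that produced it.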
Gaussian Process with Markov-Chain Monte-Carlo. A Gaussian process surrogate whose hyperparameters are marginalized via MCMC sampling rather than point-estimated.
An objective is a metric used to evaluate the quality or performance of an algorithm.
Budget is another word for fidelity. Examples are the number of training epochs or the size of the data subset the algorithm is trained on.
A ConfigurationSpace can be written to and read from a PCS (Parameter Configuration Space) file.
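For illustration, a small space in the newer PCS syntax might look like the fragment below (parameter names are made up; consult the PCS format reference for the exact grammar):

```
learning_rate real [0.0001, 0.1] [0.01] log
num_layers integer [1, 8] [4]
optimizer categorical {adam, sgd} [adam]
```

Each line declares a parameter's type, its range or choices, and its default; the `log` flag marks a parameter sampled on a logarithmic scale.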
Empirical Performance Models. Empirical performance models are regression models that characterize a given algorithm’s performance across problem instances and/or parameter settings. These models can predict the performance of algorithms on previously unseen input, including previously unseen problem instances and/or previously untested parameter settings. They are useful for analyzing how an algorithm performs under different conditions, selecting promising configurations for a new problem instance, or building surrogate benchmarks.
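As a minimal sketch of the regression idea, assuming nothing about any particular library: a tiny k-nearest-neighbour model over concatenated configuration and instance features, trained on observed performances. Real EPMs typically use random forests or Gaussian processes instead.

```python
def predict_performance(history, config_features, instance_features, k=3):
    """Tiny EPM: k-NN regression over (config, instance) feature vectors.

    `history` is a list of (feature_vector, observed_performance) pairs.
    """
    query = config_features + instance_features
    def dist(row):
        feats, _ = row
        return sum((a - b) ** 2 for a, b in zip(feats, query))
    nearest = sorted(history, key=dist)[:k]  # k most similar past runs
    return sum(perf for _, perf in nearest) / len(nearest)

# Hypothetical training data: performance = config_feature + instance_feature.
history = [([x, y], x + y) for x in (0, 1, 2) for y in (0, 1, 2)]
pred = predict_performance(history, [1], [1], k=1)
```

Given such a model, one can rank untried configurations on a new instance by predicted performance, which is exactly how surrogate-based configurators prioritize their evaluations.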
Intensification. A mechanism that governs how many evaluations to perform with each configuration and when to trust a configuration enough to make it the new current best known configuration (the incumbent).
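A heavily simplified sketch of the mechanism, under the assumption of a deterministic per-instance `tae(config, instance)` cost function (real intensifiers also handle seeds, capped runs, and noisy costs): the challenger is compared to the incumbent on a growing set of instances and rejected as soon as it falls behind.

```python
def intensify(tae, incumbent, challenger, instances, min_runs=1):
    """Race a challenger against the incumbent on more and more instances."""
    n = min_runs
    while n <= len(instances):
        subset = instances[:n]
        inc_cost = sum(tae(incumbent, i) for i in subset) / n
        chal_cost = sum(tae(challenger, i) for i in subset) / n
        if chal_cost > inc_cost:
            return incumbent  # challenger rejected early, saving evaluations
        n *= 2  # challenger keeps up: grant it twice as many instances
    return challenger  # challenger matched or beat the incumbent everywhere

# Hypothetical cost: depends only on the configuration value itself.
tae = lambda cfg, inst: cfg
winner = intensify(tae, incumbent=0.5, challenger=0.3, instances=list(range(8)))
```

The early rejection is what makes racing cheap: poor challengers are discarded after a handful of runs, while only genuinely competitive ones receive the full evaluation budget.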