Base optimizer
neps.optimizers.base_optimizer
BaseOptimizer
BaseOptimizer(
    pipeline_space: SearchSpace,
    patience: int = 50,
    logger: Logger | None = None,
    budget: int | float | None = None,
    loss_value_on_error: float | None = None,
    cost_value_on_error: float | None = None,
    learning_curve_on_error: float | list[float] | None = None,
    ignore_errors: bool = False,
)
Base sampler class. Implements all the low-level work.
Source code in neps/optimizers/base_optimizer.py
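A minimal sketch of how a concrete optimizer might subclass this base. The `SearchSpace` and `BaseOptimizer` classes below are simplified stand-ins written for this example, not the real neps implementations, and the `RandomSearch` subclass is purely illustrative:

```python
from abc import ABC, abstractmethod
import random

# Stand-ins for illustration only; the real SearchSpace and BaseOptimizer
# live in neps and carry more machinery than shown here.
class SearchSpace(dict):
    """Hypothetical stand-in: a configuration is a dict of parameter values."""

class BaseOptimizer(ABC):
    def __init__(self, pipeline_space, patience=50, budget=None,
                 loss_value_on_error=None, cost_value_on_error=None,
                 ignore_errors=False):
        self.pipeline_space = pipeline_space
        self.patience = patience
        self.budget = budget
        self.loss_value_on_error = loss_value_on_error
        self.cost_value_on_error = cost_value_on_error
        self.ignore_errors = ignore_errors

    @abstractmethod
    def get_config_and_ids(self):
        """Sample a new configuration (see the method docs below)."""

class RandomSearch(BaseOptimizer):
    """Toy subclass: samples each parameter uniformly at random."""

    def __init__(self, pipeline_space, **kwargs):
        super().__init__(pipeline_space, **kwargs)
        self._counter = 0

    def get_config_and_ids(self):
        # Draw one value per parameter from the (toy) search space.
        config = SearchSpace(
            {name: random.choice(choices)
             for name, choices in self.pipeline_space.items()}
        )
        self._counter += 1
        # No previous config: this sample is drawn from scratch.
        return config, str(self._counter), None

opt = RandomSearch({"lr": [1e-3, 1e-2], "depth": [2, 4, 8]}, patience=10)
config, config_id, previous_id = opt.get_config_and_ids()
```

The only abstract member is `get_config_and_ids`; everything else is inherited bookkeeping.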
get_config_and_ids (abstractmethod)
get_config_and_ids() -> tuple[SearchSpace, str, str | None]
Sample a new configuration.
Returns:
    config (SearchSpace): serializable object representing the configuration
    config_id (str): unique identifier for the configuration
    previous_config_id (str | None): if provided, the id of a previous configuration on which this configuration is based
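The three return values form a simple contract. The sketch below only illustrates that contract; the id scheme shown ("1", "2", ...) is an assumption for the example, not the library's actual format:

```python
# Illustrative only: a sampler returning a configuration that is
# explicitly "based on" an earlier one via previous_config_id.
def get_config_and_ids():
    config = {"lr": 0.01, "epochs": 20}  # serializable configuration
    config_id = "2"                      # unique identifier for this config
    previous_config_id = "1"             # this config builds on config "1"
    return config, config_id, previous_config_id

cfg, cid, prev = get_config_and_ids()
```

When a sample is drawn from scratch, `previous_config_id` is simply `None`.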
get_cost
Calls result.utils.get_cost() and passes the error handling through. Please use self.get_cost() instead of get_cost() in all optimizer classes.
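A hypothetical sketch of what "passes the error handling through" could mean here, built from the constructor parameters above (`cost_value_on_error`, `ignore_errors`); the real logic lives in the neps result utilities and may differ:

```python
# Assumed behaviour, for illustration: errored results either get skipped,
# replaced by the configured fallback cost, or raise.
def get_cost(result, cost_value_on_error=None, ignore_errors=False):
    if result == "error":  # stand-in error marker, an assumption
        if ignore_errors:
            return None  # errored trials are simply skipped
        if cost_value_on_error is not None:
            return cost_value_on_error  # substitute the configured fallback
        raise ValueError("trial errored and no cost_value_on_error was set")
    return result["cost"]  # normal case: read the reported cost
```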
get_learning_curve
Calls result.utils.get_learning_curve() and passes the error handling through. Please use self.get_learning_curve() instead of get_learning_curve() in all optimizer classes.
get_loss
Calls result.utils.get_loss() and passes the error handling through. Please use self.get_loss() instead of get_loss() in all optimizer classes.
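The loss variant follows the same fallback pattern; again a sketch under assumed internals (`loss_value_on_error` is the constructor parameter above, the `"error"` marker is hypothetical):

```python
# Assumed behaviour, for illustration only.
def get_loss(result, loss_value_on_error=None, ignore_errors=False):
    if result == "error":  # stand-in error marker, an assumption
        if ignore_errors:
            return None
        if loss_value_on_error is not None:
            return loss_value_on_error  # substitute the configured fallback
        raise ValueError("trial errored and no loss_value_on_error was set")
    return result["loss"]

# The configured fallback keeps the optimizer's history numeric even
# when some evaluations crash:
losses = [get_loss(r, loss_value_on_error=1.0)
          for r in [{"loss": 0.3}, "error", {"loss": 0.1}]]
```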