Initializing the Pipeline Space#

In NePS, you need to define a pipeline_space. This space can be structured through various approaches: a Python dictionary, a YAML file, or ConfigSpace. Each method lets you specify a set of parameter types, ranging from Float and Categorical to specialized architecture parameters, and serves simply as the container within which those parameters are defined and organized. This section guides you through setting up your pipeline_space with each of these methods and shows how to incorporate the various parameter types so that NePS can use them during optimization.

Parameters#

NePS currently features 4 primary hyperparameter types:

  • neps.Float: a continuous value within a bounded range, optionally on a log scale.
  • neps.Integer: a discrete integer value within a bounded range, optionally on a log scale.
  • neps.Categorical: an unordered set of choices.
  • neps.Constant: a fixed value that is not optimized.

Using these types, you can define the parameters that NePS will optimize during the search process. The most basic way to pass these parameters is through a Python dictionary, where each key-value pair represents a parameter name and its respective type. For example, the following Python dictionary defines a pipeline_space with four parameters for optimizing a deep learning model:

pipeline_space = {
    "learning_rate": neps.Float(0.00001, 0.1, log=True),
    "num_epochs": neps.Integer(3, 30, is_fidelity=True),
    "optimizer": neps.Categorical(["adam", "sgd", "rmsprop"]),
    "dropout_rate": neps.Constant(0.5),
}

neps.run(..., pipeline_space=pipeline_space)
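
Inside your evaluation function, NePS passes each of these parameters as a keyword argument matching its dictionary key. A minimal sketch (the function body and the train_and_evaluate helper are purely illustrative):

def run_pipeline(learning_rate, num_epochs, optimizer, dropout_rate):
    # train_and_evaluate is a hypothetical stand-in for your own training
    # code; NePS minimizes the value returned by the evaluation function.
    validation_error = train_and_evaluate(
        lr=learning_rate,
        epochs=num_epochs,
        opt=optimizer,
        dropout=dropout_rate,
    )
    return validation_error
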
Quick Parameter Reference

neps.search_spaces.hyperparameters.categorical.Categorical #

Categorical(
    choices: Iterable[float | int | str],
    *,
    default: float | int | str | None = None,
    default_confidence: Literal[
        "low", "medium", "high"
    ] = "low"
)

Bases: ParameterWithPrior[CategoricalTypes, CategoricalTypes]

A list of unordered choices for a parameter.

This kind of Parameter is used to represent hyperparameters that can take on a discrete set of unordered values. For example, the optimizer hyperparameter in a neural network search space can be a Categorical with choices like ["adam", "sgd", "rmsprop"].

import neps

optimizer_choice = neps.Categorical(
    ["adam", "sgd", "rmsprop"],
    default="adam"
)
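
The constructor also accepts a confidence for that default (see the parameter table below). For illustration:

optimizer_choice = neps.Categorical(
    ["adam", "sgd", "rmsprop"],
    default="adam",
    default_confidence="high",  # bias sampling strongly towards the default
)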

Please see Parameter and ParameterWithPrior for more details on the methods available for this class.

PARAMETER DESCRIPTION
choices

choices for the hyperparameter.

TYPE: Iterable[float | int | str]

default

default value for the hyperparameter, must be in choices= if provided.

TYPE: float | int | str | None DEFAULT: None

default_confidence

confidence score for the default value, used when considering prior-based optimization.

TYPE: Literal['low', 'medium', 'high'] DEFAULT: 'low'

Source code in neps/search_spaces/hyperparameters/categorical.py
def __init__(
    self,
    choices: Iterable[float | int | str],
    *,
    default: float | int | str | None = None,
    default_confidence: Literal["low", "medium", "high"] = "low",
):
    """Create a new `Categorical`.

    Args:
        choices: choices for the hyperparameter.
        default: default value for the hyperparameter, must be in `choices=`
            if provided.
        default_confidence: confidence score for the default value, used when
            considering prior-based optimization.
    """
    choices = list(choices)
    if len(choices) <= 1:
        raise ValueError("Categorical choices must have more than one value.")

    super().__init__(value=None, is_fidelity=False, default=default)

    for choice in choices:
        if not isinstance(choice, float | int | str):
            raise TypeError(
                f'Choice "{choice}" is not of a valid type (float, int, str)'
            )

    if not all_unique(choices):
        raise ValueError(f"Choices must be unique but got duplicates.\n{choices}")

    if default is not None and default not in choices:
        raise ValueError(
            f"Default value {default} is not in the provided choices {choices}"
        )

    self.choices = list(choices)

    # NOTE(eddiebergman): If there's ever a very large categorical,
    # then it would be beneficial to have a lookup table for indices as
    # currently we do a list.index() operation which is O(n).
    # However for small sized categoricals this is likely faster than
    # a lookup table.
    # For now we can just cache the index of the value and default.
    self._value_index: int | None = None

    self.default_confidence_choice = default_confidence
    self.default_confidence_score = self.DEFAULT_CONFIDENCE_SCORES[default_confidence]
    self.has_prior = self.default is not None
    self._default_index: int | None = (
        self.choices.index(default) if default is not None else None
    )
    self.domain = Domain.indices(len(self.choices))
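
As the source above shows, the constructor validates its input. Each of the following lines, run on its own, raises an error (an illustrative sketch):

import neps

neps.Categorical(["adam"])                            # ValueError: needs more than one choice
neps.Categorical(["adam", "adam", "sgd"])             # ValueError: duplicate choices
neps.Categorical(["adam", "sgd"], default="rmsprop")  # ValueError: default not in choices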

value property #

value: ValueT | None

Get the value of the hyperparameter, or None if not set.

load_from #

load_from(value: Any) -> None

Load a serialized value into the hyperparameter's value.

PARAMETER DESCRIPTION
value

value to load.

TYPE: Any

Source code in neps/search_spaces/parameter.py
def load_from(self, value: Any) -> None:
    """Load a serialized value into the hyperparameter's value.

    Args:
        value: value to load.
    """
    self.set_value(value)

sample #

sample(*, user_priors: bool = False) -> Self

Sample a new version of this Parameter with a random value.

Similar to Parameter.sample(), but a ParameterWithPrior can use the confidence score by setting user_priors=True.

PARAMETER DESCRIPTION
user_priors

whether to use the confidence score when sampling a value.

TYPE: bool DEFAULT: False

RETURNS DESCRIPTION
Self

A new Parameter with a sampled value.

Source code in neps/search_spaces/parameter.py
def sample(self, *, user_priors: bool = False) -> Self:
    """Sample a new version of this `Parameter` with a random value.

    Similar to
    [`Parameter.sample()`][neps.search_spaces.Parameter.sample],
    but a `ParameterWithPrior` can use the confidence score by setting
    `user_priors=True`.

    Args:
        user_priors: whether to use the confidence score
            when sampling a value.

    Returns:
        A new `Parameter` with a sampled value.
    """
    value = self.sample_value(user_priors=user_priors)
    copy_self = self.clone()
    copy_self.set_value(value)
    return copy_self
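
A short usage sketch of the distinction (the printed values are random):

import neps

opt = neps.Categorical(
    ["adam", "sgd", "rmsprop"], default="adam", default_confidence="high"
)

uniform = opt.sample()                 # ignores the prior entirely
biased = opt.sample(user_priors=True)  # sampling is biased towards "adam"
print(uniform.value, biased.value)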

neps.search_spaces.hyperparameters.float.Float #

Float(
    lower: Number,
    upper: Number,
    *,
    log: bool = False,
    is_fidelity: bool = False,
    default: Number | None = None,
    default_confidence: Literal[
        "low", "medium", "high"
    ] = "low"
)

Bases: Numerical[float]

A float value for a parameter.

This kind of Parameter is used to represent hyperparameters with continuous float values, optionally specifying if they exist on a log scale. For example, l2_norm could be a value in (0, 1), while the learning_rate hyperparameter in a neural network search space can be a Float with a range of (0.0001, 0.1) but on a log scale.

import neps

l2_norm = neps.Float(0, 1)
learning_rate = neps.Float(1e-4, 1e-1, log=True)

Please see the Numerical class for more details on the methods available for this class.
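
The constructor also supports fidelity and prior arguments (see the parameter table below). An illustrative sketch:

# Illustrative: a prior on the learning rate, and a fractional fidelity
# such as the fraction of the training data used per evaluation.
learning_rate = neps.Float(1e-4, 1e-1, log=True, default=1e-2, default_confidence="medium")
data_fraction = neps.Float(0.1, 1.0, is_fidelity=True)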

PARAMETER DESCRIPTION
lower

lower bound for the hyperparameter.

TYPE: Number

upper

upper bound for the hyperparameter.

TYPE: Number

log

whether the hyperparameter is on a log scale.

TYPE: bool DEFAULT: False

is_fidelity

whether the hyperparameter is fidelity.

TYPE: bool DEFAULT: False

default

default value for the hyperparameter.

TYPE: Number | None DEFAULT: None

default_confidence

confidence score for the default value, used when considering prior-based optimization.

TYPE: Literal['low', 'medium', 'high'] DEFAULT: 'low'

Source code in neps/search_spaces/hyperparameters/float.py
def __init__(
    self,
    lower: Number,
    upper: Number,
    *,
    log: bool = False,
    is_fidelity: bool = False,
    default: Number | None = None,
    default_confidence: Literal["low", "medium", "high"] = "low",
):
    """Create a new `Float`.

    Args:
        lower: lower bound for the hyperparameter.
        upper: upper bound for the hyperparameter.
        log: whether the hyperparameter is on a log scale.
        is_fidelity: whether the hyperparameter is fidelity.
        default: default value for the hyperparameter.
        default_confidence: confidence score for the default value, used when
            considering prior-based optimization.
    """
    super().__init__(
        lower=float(lower),
        upper=float(upper),
        log=log,
        default=float(default) if default is not None else None,
        default_confidence=default_confidence,
        is_fidelity=is_fidelity,
        domain=Domain.floating(lower, upper, log=log),
    )

value property #

value: ValueT | None

Get the value of the hyperparameter, or None if not set.

load_from #

load_from(value: Any) -> None

Load a serialized value into the hyperparameter's value.

PARAMETER DESCRIPTION
value

value to load.

TYPE: Any

Source code in neps/search_spaces/parameter.py
def load_from(self, value: Any) -> None:
    """Load a serialized value into the hyperparameter's value.

    Args:
        value: value to load.
    """
    self.set_value(value)

sample #

sample(*, user_priors: bool = False) -> Self

Sample a new version of this Parameter with a random value.

Similar to Parameter.sample(), but a ParameterWithPrior can use the confidence score by setting user_priors=True.

PARAMETER DESCRIPTION
user_priors

whether to use the confidence score when sampling a value.

TYPE: bool DEFAULT: False

RETURNS DESCRIPTION
Self

A new Parameter with a sampled value.

Source code in neps/search_spaces/parameter.py
def sample(self, *, user_priors: bool = False) -> Self:
    """Sample a new version of this `Parameter` with a random value.

    Similar to
    [`Parameter.sample()`][neps.search_spaces.Parameter.sample],
    but a `ParameterWithPrior` can use the confidence score by setting
    `user_priors=True`.

    Args:
        user_priors: whether to use the confidence score
            when sampling a value.

    Returns:
        A new `Parameter` with a sampled value.
    """
    value = self.sample_value(user_priors=user_priors)
    copy_self = self.clone()
    copy_self.set_value(value)
    return copy_self

neps.search_spaces.hyperparameters.integer.Integer #

Integer(
    lower: Number,
    upper: Number,
    *,
    log: bool = False,
    is_fidelity: bool = False,
    default: Number | None = None,
    default_confidence: Literal[
        "low", "medium", "high"
    ] = "low"
)

Bases: Numerical[int]

An integer value for a parameter.

This kind of Parameter is used to represent hyperparameters with discrete integer values, optionally specifying if they exist on a log scale. For example, batch_size could be a value in (32, 128), while the num_layers hyperparameter in a neural network search space can be an Integer with a range of (1, 1000) but on a log scale.

import neps

batch_size = neps.Integer(32, 128)
num_layers = neps.Integer(1, 1000, log=True)
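
As with Float, an Integer can be marked as a fidelity, for example the number of training epochs:

num_epochs = neps.Integer(3, 30, is_fidelity=True)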
Please see the Numerical class for more details on the methods available for this class.

PARAMETER DESCRIPTION
lower

lower bound for the hyperparameter.

TYPE: Number

upper

upper bound for the hyperparameter.

TYPE: Number

log

whether the hyperparameter is on a log scale.

TYPE: bool DEFAULT: False

is_fidelity

whether the hyperparameter is fidelity.

TYPE: bool DEFAULT: False

default

default value for the hyperparameter.

TYPE: Number | None DEFAULT: None

default_confidence

confidence score for the default value, used when considering prior-based optimization.

TYPE: Literal['low', 'medium', 'high'] DEFAULT: 'low'

Source code in neps/search_spaces/hyperparameters/integer.py
def __init__(
    self,
    lower: Number,
    upper: Number,
    *,
    log: bool = False,
    is_fidelity: bool = False,
    default: Number | None = None,
    default_confidence: Literal["low", "medium", "high"] = "low",
):
    """Create a new `Integer`.

    Args:
        lower: lower bound for the hyperparameter.
        upper: upper bound for the hyperparameter.
        log: whether the hyperparameter is on a log scale.
        is_fidelity: whether the hyperparameter is fidelity.
        default: default value for the hyperparameter.
        default_confidence: confidence score for the default value, used when
            considering prior-based optimization.
    """
    lower = int(np.rint(lower))
    upper = int(np.rint(upper))
    _size = upper - lower + 1
    if _size <= 1:
        raise ValueError(
            f"Integer: expected at least 2 possible values in the range,"
            f" got upper={upper}, lower={lower}."
        )

    super().__init__(
        lower=int(np.rint(lower)),
        upper=int(np.rint(upper)),
        log=log,
        is_fidelity=is_fidelity,
        default=int(np.rint(default)) if default is not None else None,
        default_confidence=default_confidence,
        domain=Domain.integer(lower, upper, log=log),
    )

    # We subtract/add 0.499999 from lower/upper bounds respectively, such that
    # sampling in the float space gives equal probability for all integer values,
    # i.e. [x - 0.499999, x + 0.499999]
    self.float_hp = Float(
        lower=self.lower - 0.499999,
        upper=self.upper + 0.499999,
        log=self.log,
        is_fidelity=is_fidelity,
        default=default,
        default_confidence=default_confidence,
    )
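
The widened bounds on float_hp can be sanity-checked by hand: rounding a uniform draw from approximately [lower - 0.5, upper + 0.5] gives every integer an equally wide pre-image. A standalone sketch of that argument (NumPy only, not NePS code):

import numpy as np

lower, upper = 1, 3
samples = np.random.uniform(lower - 0.499999, upper + 0.499999, size=100_000)
values, counts = np.unique(np.rint(samples).astype(int), return_counts=True)

# Each of 1, 2, 3 should appear roughly a third of the time.
print(dict(zip(values.tolist(), counts.tolist())))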

value property #

value: ValueT | None

Get the value of the hyperparameter, or None if not set.

sample #

sample(*, user_priors: bool = False) -> Self

Sample a new version of this Parameter with a random value.

Similar to Parameter.sample(), but a ParameterWithPrior can use the confidence score by setting user_priors=True.

PARAMETER DESCRIPTION
user_priors

whether to use the confidence score when sampling a value.

TYPE: bool DEFAULT: False

RETURNS DESCRIPTION
Self

A new Parameter with a sampled value.

Source code in neps/search_spaces/parameter.py
def sample(self, *, user_priors: bool = False) -> Self:
    """Sample a new version of this `Parameter` with a random value.

    Similar to
    [`Parameter.sample()`][neps.search_spaces.Parameter.sample],
    but a `ParameterWithPrior` can use the confidence score by setting
    `user_priors=True`.

    Args:
        user_priors: whether to use the confidence score
            when sampling a value.

    Returns:
        A new `Parameter` with a sampled value.
    """
    value = self.sample_value(user_priors=user_priors)
    copy_self = self.clone()
    copy_self.set_value(value)
    return copy_self

neps.search_spaces.hyperparameters.constant.Constant #

Constant(value: T)

Bases: Parameter[T, T]

A constant value for a parameter.

This kind of Parameter is used to represent hyperparameters with values that should not change during optimization. For example, the batch_size hyperparameter in a neural network search space can be a Constant with a value of 32.

import neps

batch_size = neps.Constant(32)

Note

As the name suggests, a Constant can only take on one value, and so its .default and .value should always be the same.

This also implies that the .default can never be None.

Please use .set_constant_value() if you need to change the value of the constant parameter.

PARAMETER DESCRIPTION
value

value for the hyperparameter.

TYPE: T

Source code in neps/search_spaces/hyperparameters/constant.py
def __init__(self, value: T):
    """Create a new `Constant`.

    Args:
        value: value for the hyperparameter.
    """
    super().__init__(value=value, default=value, is_fidelity=False)  # type: ignore
    self._value: T = value  # type: ignore

value property #

value: T

Get the value of the constant parameter.

load_from #

load_from(value: Any) -> None

Load a serialized value into the hyperparameter's value.

PARAMETER DESCRIPTION
value

value to load.

TYPE: Any

Source code in neps/search_spaces/parameter.py
def load_from(self, value: Any) -> None:
    """Load a serialized value into the hyperparameter's value.

    Args:
        value: value to load.
    """
    self.set_value(value)

sample #

sample() -> Self

Sample a new version of this Parameter with a random value.

Will set the .value to the sampled value.

RETURNS DESCRIPTION
Self

A new Parameter with a sampled value.

Source code in neps/search_spaces/parameter.py
def sample(self) -> Self:
    """Sample a new version of this `Parameter` with a random value.

    Will set the [`.value`][neps.search_spaces.Parameter.value] to the
    sampled value.

    Returns:
        A new `Parameter` with a sampled value.
    """
    value = self.sample_value()
    copy_self = self.clone()
    copy_self.set_value(value)
    return copy_self

set_value #

set_value(value: T | None) -> None

Set the value of the constant parameter.

Note

This method is a no-op but will raise a ValueError if the value is different from the current value.

Please see .set_constant_value(), which can be used to set both the .value and the .default at once.

PARAMETER DESCRIPTION
value

value to set the parameter to.

TYPE: T | None

RAISES DESCRIPTION
ValueError

if the value is different from the current value.

Source code in neps/search_spaces/hyperparameters/constant.py
@override
def set_value(self, value: T | None) -> None:
    """Set the value of the constant parameter.

    !!! note

        This method is a no-op but will raise a `ValueError` if the value
        is different from the current value.

        Please see
        [`.set_constant_value()`][neps.search_spaces.hyperparameters.constant.Constant.set_constant_value]
        which can be used to set both the
        [`.value`][neps.search_spaces.parameter.Parameter.value]
        and the [`.default`][neps.search_spaces.parameter.Parameter.default] at once

    Args:
        value: value to set the parameter to.

    Raises:
        ValueError: if the value is different from the current value.
    """
    if value != self._value:
        raise ValueError(
            f"Constant does not allow chaning the set value. "
            f"Tried to set value to {value}, but it is already {self.value}"
        )
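
A short sketch of the behaviour documented above:

import neps

batch_size = neps.Constant(32)
batch_size.set_value(32)  # no-op: matches the current value

try:
    batch_size.set_value(64)
except ValueError as err:
    print(err)  # a Constant does not allow changing the set value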

Using your knowledge, providing a Prior#

When optimizing, you can inject your own knowledge using the parameter's default=. By indicating a default=, you declare your user prior: your knowledge about where a good value for this parameter lies.

You can also specify a default_confidence= to indicate how strongly you want NePS to focus on the default, one of "low", "medium", or "high".

Currently, the two major algorithms in NePS that exploit this are PriorBand (prior-based HyperBand) and PiBO (a version of Bayesian optimization that uses priors).

import neps

neps.run(
    ...,
    pipeline_space={
        "learning_rate": neps.Float(1e-4, 1e-1, log=True, default=1e-2, default_confidence="medium"),
        "num_epochs": neps.Integer(3, 30, is_fidelity=True),
        "optimizer": neps.Categorical(["adam", "sgd", "rmsprop"], default="adam", default_confidence="low"),
        "dropout_rate": neps.Constant(0.5),
    }
)

Must set default= for all parameters, if any

If you specify default= for one parameter, you must do so for all your parameters. This will be improved in future versions.

Interaction with is_fidelity

If you specify is_fidelity=True for one parameter, the default= and default_confidence= are ignored. This will be disallowed in future versions.

Defining a pipeline space using YAML#

Create a YAML file (e.g., ./pipeline_space.yaml) with the parameter definitions following this structure.

learning_rate:
  type: float
  lower: 2e-3
  upper: 0.1
  log: true

num_epochs:
  type: int
  lower: 3
  upper: 30
  is_fidelity: true

optimizer:
  type: categorical
  choices: ["adam", "sgd", "rmsprop"]

dropout_rate: 0.5

neps.run(..., pipeline_space="./pipeline_space.yaml")

When defining the pipeline_space using a YAML file, if the type argument is not specified, NePS will automatically infer the data type based on the value provided.

  • If lower and upper are provided, then if they are both integers, the type will be inferred as int, otherwise as float. You can provide scientific notation for floating-point numbers as well.
  • If choices are provided, the type will be inferred as categorical.
  • If just a numeric or string is provided, the type will be inferred as constant.

If none of these hold, an error will be raised. An example relying purely on inference is shown below.
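
For example, the following YAML (an illustrative sketch using only the inference rules above) defines parameters without any explicit type keys:

learning_rate:
  lower: 1e-4   # float bounds -> float
  upper: 0.1
  log: true

num_layers:
  lower: 1      # both bounds integers -> int
  upper: 16

optimizer:
  choices: ["adam", "sgd", "rmsprop"]   # choices -> categorical

dropout_rate: 0.5   # bare value -> constant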

Using ConfigSpace#

Users familiar with the ConfigSpace library can also define the pipeline_space through a ConfigurationSpace() object:

from ConfigSpace import ConfigurationSpace, Float

configspace = ConfigurationSpace(
    {
        "learning_rate": Float("learning_rate", bounds=(1e-4, 1e-1), log=True)
        "optimizer": ["adam", "sgd", "rmsprop"],
        "dropout_rate": 0.5,
    }
)
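
The resulting space is then handed to NePS like the other formats (a sketch, with the remaining neps.run arguments elided):

neps.run(..., pipeline_space=configspace)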

Warning

Parameters you wish to use as a fidelity are not supported through ConfigSpace at this time.

For additional information on ConfigSpace and its features, please visit the ConfigSpace documentation.

Supported Architecture Parameter Types#

A comprehensive documentation for the Architecture parameter is not available at this point.

If you are interested in exploring architecture parameters, you can find detailed examples and usage in the following resources:

  • Basic Usage Examples - Basic usage examples that can help you understand the fundamentals of Architecture parameters.
  • Experimental Examples - For more advanced and experimental use cases, including Hierarchical parameters, check out this collection of examples.

Warning

The configuration of pipeline_space from a YAML file does not currently support architecture parameter types.