## Initializing the Pipeline Space

In NePS, you first need to define a `pipeline_space`.
This space can be structured through various approaches: a Python dictionary, a YAML file, or ConfigSpace.
Each of these methods lets you specify a set of parameter types, ranging from `Float` and `Categorical` to specialized architecture parameters.
Whichever method you choose, it serves as a container within which these parameters are defined and organized.
This section guides you through setting up your `pipeline_space` using these methods and shows how to incorporate the various parameter types, ensuring that NePS can use them during optimization.
## Parameters

NePS currently features four primary hyperparameter types:

- `neps.Float`
- `neps.Integer`
- `neps.Categorical`
- `neps.Constant`

Using these types, you can define the parameters that NePS will optimize during the search process.
The most basic way to pass these parameters is through a Python dictionary, where each key-value pair maps a parameter name to its definition.
For example, the following Python dictionary defines a `pipeline_space` with four parameters for optimizing a deep learning model:
```python
import neps

pipeline_space = {
    "learning_rate": neps.Float(0.00001, 0.1, log=True),
    "num_epochs": neps.Integer(3, 30, is_fidelity=True),
    "optimizer": neps.Categorical(["adam", "sgd", "rmsprop"]),
    "dropout_rate": neps.Constant(0.5),
}

neps.run(..., pipeline_space=pipeline_space)
```
Quick Parameter Reference

### neps.search_spaces.hyperparameters.categorical.Categorical

```python
Categorical(
    choices: Iterable[float | int | str],
    *,
    default: float | int | str | None = None,
    default_confidence: Literal["low", "medium", "high"] = "low",
)
```
Bases: `ParameterWithPrior[CategoricalTypes, CategoricalTypes]`

A list of unordered choices for a parameter.

This kind of `Parameter` is used to represent hyperparameters that can take on a discrete set of unordered values. For example, the `optimizer` hyperparameter in a neural network search space can be a `Categorical` with choices like `["adam", "sgd", "rmsprop"]`.

Please see `Parameter` and `ParameterWithPrior` for more details on the methods available for this class.
| PARAMETER | DESCRIPTION |
|---|---|
| `choices` | The choices for the hyperparameter. |
| `default` | The default value for the hyperparameter; must be one of `choices`. |
| `default_confidence` | The confidence score for the default value, used when considering prior-based optimization. |
Source code in neps/search_spaces/hyperparameters/categorical.py
#### load_from

```python
load_from(value: Any) -> None
```

Load a serialized value into the hyperparameter's value.

| PARAMETER | DESCRIPTION |
|---|---|
| `value` | The value to load. |
#### sample

```python
sample(*, user_priors: bool = False) -> Self
```

Sample a new version of this `Parameter` with a random value.

Similar to `Parameter.sample()`, but a `ParameterWithPrior` can use the confidence score by setting `user_priors=True`.

| PARAMETER | DESCRIPTION |
|---|---|
| `user_priors` | Whether to use the confidence score when sampling a value. |

| RETURNS | DESCRIPTION |
|---|---|
| `Self` | A new `Parameter` with a sampled value. |
Source code in neps/search_spaces/parameter.py
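The effect of `user_priors=True` can be pictured as weighting choices toward the default according to its confidence. Below is a minimal, standalone sketch of that idea; it is not NePS's actual implementation, and the weight values are invented purely for illustration:

```python
import random

# Hypothetical confidence-to-weight mapping, for illustration only.
CONFIDENCE_WEIGHT = {"low": 2.0, "medium": 4.0, "high": 8.0}

def sample_categorical(choices, default=None, confidence="low", user_priors=False):
    """Sample one choice; with user_priors, bias sampling toward the default."""
    if not user_priors or default is None:
        # No prior information used: all choices are equally likely.
        return random.choice(choices)
    # Give the default a larger weight, everything else weight 1.
    weights = [CONFIDENCE_WEIGHT[confidence] if c == default else 1.0
               for c in choices]
    return random.choices(choices, weights=weights, k=1)[0]

print(sample_categorical(["adam", "sgd", "rmsprop"],
                         default="adam", confidence="high", user_priors=True))
```

With `user_priors=False`, the default and its confidence play no role at all, matching the plain `Parameter.sample()` behavior.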
### neps.search_spaces.hyperparameters.float.Float

```python
Float(
    lower: Number,
    upper: Number,
    *,
    log: bool = False,
    is_fidelity: bool = False,
    default: Number | None = None,
    default_confidence: Literal["low", "medium", "high"] = "low",
)
```

A float value for a parameter.

This kind of `Parameter` is used to represent hyperparameters with continuous float values, optionally specifying whether the parameter exists on a log scale. For example, `l2_norm` could be a value in `(0, 1)`, while the `learning_rate` hyperparameter in a neural network search space can be a `Float` with a range of `(0.0001, 0.1)` but on a log scale.

Please see the `Numerical` class for more details on the methods available for this class.
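Setting `log=True` means values are drawn uniformly on a log scale rather than linearly, so each order of magnitude gets roughly equal coverage. A standalone sketch of the idea (an illustration of the concept, not NePS's implementation):

```python
import math
import random

def sample_log_uniform(lower: float, upper: float) -> float:
    """Sample uniformly in log-space: pick a point uniformly between
    log(lower) and log(upper), then exponentiate back."""
    return math.exp(random.uniform(math.log(lower), math.log(upper)))

# Roughly a third of samples land in each decade of (1e-4, 1e-1),
# whereas linear sampling would almost always land near 1e-1.
samples = [sample_log_uniform(1e-4, 1e-1) for _ in range(5)]
print(samples)
```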
| PARAMETER | DESCRIPTION |
|---|---|
| `lower` | The lower bound for the hyperparameter. |
| `upper` | The upper bound for the hyperparameter. |
| `log` | Whether the hyperparameter is on a log scale. |
| `is_fidelity` | Whether the hyperparameter is a fidelity parameter. |
| `default` | The default value for the hyperparameter. |
| `default_confidence` | The confidence score for the default value, used when considering prior-based optimization. |
Source code in neps/search_spaces/hyperparameters/float.py
#### load_from

```python
load_from(value: Any) -> None
```

Load a serialized value into the hyperparameter's value.

| PARAMETER | DESCRIPTION |
|---|---|
| `value` | The value to load. |
#### sample

```python
sample(*, user_priors: bool = False) -> Self
```

Sample a new version of this `Parameter` with a random value.

Similar to `Parameter.sample()`, but a `ParameterWithPrior` can use the confidence score by setting `user_priors=True`.

| PARAMETER | DESCRIPTION |
|---|---|
| `user_priors` | Whether to use the confidence score when sampling a value. |

| RETURNS | DESCRIPTION |
|---|---|
| `Self` | A new `Parameter` with a sampled value. |
Source code in neps/search_spaces/parameter.py
### neps.search_spaces.hyperparameters.integer.Integer

```python
Integer(
    lower: Number,
    upper: Number,
    *,
    log: bool = False,
    is_fidelity: bool = False,
    default: Number | None = None,
    default_confidence: Literal["low", "medium", "high"] = "low",
)
```

An integer value for a parameter.

This kind of `Parameter` is used to represent hyperparameters with integer values, optionally specifying whether the parameter exists on a log scale. For example, `batch_size` could be a value in `(32, 128)`, while the `num_layers` hyperparameter in a neural network search space can be an `Integer` with a range of `(1, 1000)` but on a log scale.
| PARAMETER | DESCRIPTION |
|---|---|
| `lower` | The lower bound for the hyperparameter. |
| `upper` | The upper bound for the hyperparameter. |
| `log` | Whether the hyperparameter is on a log scale. |
| `is_fidelity` | Whether the hyperparameter is a fidelity parameter. |
| `default` | The default value for the hyperparameter. |
| `default_confidence` | The confidence score for the default value, used when considering prior-based optimization. |
Source code in neps/search_spaces/hyperparameters/integer.py
#### sample

```python
sample(*, user_priors: bool = False) -> Self
```

Sample a new version of this `Parameter` with a random value.

Similar to `Parameter.sample()`, but a `ParameterWithPrior` can use the confidence score by setting `user_priors=True`.

| PARAMETER | DESCRIPTION |
|---|---|
| `user_priors` | Whether to use the confidence score when sampling a value. |

| RETURNS | DESCRIPTION |
|---|---|
| `Self` | A new `Parameter` with a sampled value. |
Source code in neps/search_spaces/parameter.py
### neps.search_spaces.hyperparameters.constant.Constant

Bases: `Parameter[T, T]`

A constant value for a parameter.

This kind of `Parameter` is used to represent hyperparameters with values that should not change during optimization. For example, the `batch_size` hyperparameter in a neural network search space can be a `Constant` with a value of `32`.

Note

As the name suggests, a `Constant` only has one value, so its `.default` and `.value` should always be the same. This also implies that the `.default` can never be `None`. Please use `.set_constant_value()` if you need to change the value of the constant parameter.
| PARAMETER | DESCRIPTION |
|---|---|
| `value` | The value for the hyperparameter. |
Source code in neps/search_spaces/hyperparameters/constant.py
#### load_from

```python
load_from(value: Any) -> None
```

Load a serialized value into the hyperparameter's value.

| PARAMETER | DESCRIPTION |
|---|---|
| `value` | The value to load. |
#### sample

Sample a new version of this `Parameter` with a random value.

Will set the `.value` to the sampled value.

| RETURNS | DESCRIPTION |
|---|---|
| `Self` | A new `Parameter` with a sampled value. |
Source code in neps/search_spaces/parameter.py
#### set_value

Set the value of the constant parameter.

Note

This method is a no-op, but it will raise a `ValueError` if the given value differs from the current value. Please see `.set_constant_value()`, which can be used to set both the `.value` and the `.default` at once.

| PARAMETER | DESCRIPTION |
|---|---|
| `value` | The value to set the parameter to. |

| RAISES | DESCRIPTION |
|---|---|
| `ValueError` | If the value is different from the current value. |
Source code in neps/search_spaces/hyperparameters/constant.py
## Using your knowledge, providing a Prior

When optimizing, you can encode your own knowledge using the `default=` argument.
By indicating a `default=`, we take this to be your user prior: your knowledge about where a good value for this parameter lies.
You can also specify a `default_confidence=` to indicate how strongly you want NePS to focus on this value, one of either `"low"`, `"medium"`, or `"high"`.

Currently, the two major algorithms in NePS that exploit this information are PriorBand (prior-based HyperBand) and PiBO, a version of Bayesian optimization which uses priors.
```python
import neps

neps.run(
    ...,
    pipeline_space={
        "learning_rate": neps.Float(1e-4, 1e-1, log=True, default=1e-2, default_confidence="medium"),
        "num_epochs": neps.Integer(3, 30, is_fidelity=True),
        "optimizer": neps.Categorical(["adam", "sgd", "rmsprop"], default="adam", default_confidence="low"),
        "dropout_rate": neps.Constant(0.5),
    },
)
```
Must set `default=` for all parameters, if any

If you specify `default=` for one parameter, you must do so for all of your parameters.
This will be improved in future versions.

Interaction with `is_fidelity`

If you specify `is_fidelity=True` for one parameter, its `default=` and `default_confidence=` are ignored.
This will be disallowed in future versions.
## Defining a pipeline space using YAML

Create a YAML file (e.g., `./pipeline_space.yaml`) with the parameter definitions following this structure.
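As a sketch, a YAML file mirroring the dictionary space from above might look like the following. The key names here are an illustration based on the inference rules this section describes; check them against the NePS documentation for the version you use:

```yaml
learning_rate:
  lower: 1e-5
  upper: 1e-1
  log: true
num_epochs:
  type: int
  lower: 3
  upper: 30
  is_fidelity: true
optimizer:
  choices: ["adam", "sgd", "rmsprop"]
dropout_rate: 0.5
```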
When defining the `pipeline_space` using a YAML file, if the `type` argument is not specified, NePS will automatically infer the data type based on the values provided:
- If `lower` and `upper` are provided, and both are integers, the type will be inferred as `int`; otherwise as `float`. You can provide scientific notation for floating-point numbers as well.
- If `choices` are provided, the type will be inferred as `categorical`.
- If just a number or string is provided, the type will be inferred as `constant`.

If none of these hold, an error will be raised.
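These inference rules can be sketched as a small standalone Python function. This is an illustration of the documented rules, not NePS's actual implementation:

```python
def infer_type(spec):
    """Infer a parameter type from a YAML-style spec, following the rules above."""
    if isinstance(spec, dict):
        if "lower" in spec and "upper" in spec:
            # Both bounds integer -> int, otherwise float.
            both_int = isinstance(spec["lower"], int) and isinstance(spec["upper"], int)
            return "int" if both_int else "float"
        if "choices" in spec:
            return "categorical"
    if isinstance(spec, (int, float, str)):
        # A bare number or string is treated as a constant.
        return "constant"
    raise ValueError(f"Cannot infer a parameter type from: {spec!r}")

print(infer_type({"lower": 3, "upper": 30}))       # -> int
print(infer_type({"lower": 1e-5, "upper": 1e-1}))  # -> float
print(infer_type({"choices": ["adam", "sgd"]}))    # -> categorical
print(infer_type(0.5))                             # -> constant
```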
## Using ConfigSpace

For users familiar with the ConfigSpace library, the `pipeline_space` can also be defined through a `ConfigurationSpace()`:

```python
from ConfigSpace import ConfigurationSpace, Float

configspace = ConfigurationSpace(
    {
        "learning_rate": Float("learning_rate", bounds=(1e-4, 1e-1), log=True),
        "optimizer": ["adam", "sgd", "rmsprop"],
        "dropout_rate": 0.5,
    }
)
```
Warning

Parameters you wish to use as a fidelity are not supported through ConfigSpace at this time.

For additional information on ConfigSpace and its features, please refer to the ConfigSpace documentation.
## Supported Architecture parameter Types

Comprehensive documentation for the Architecture parameter is not available at this point.
If you are interested in exploring architecture parameters, you can find detailed examples and usage in the following resources:

- Basic Usage Examples - basic usage examples that can help you understand the fundamentals of Architecture parameters.
- Experimental Examples - for more advanced and experimental use cases, including Hierarchical parameters, check out this collection of examples.

Warning

The configuration of `pipeline_space` from a YAML file does not currently support architecture parameter types.