# Getting Started
Getting started with NePS is straightforward: the workflow centers on three main components, giving you flexibility and efficiency when evaluating different architecture and hyperparameter configurations for your problem.

NePS requires Python 3.10 or higher. You can install it via pip (`pip install neural-pipeline-search`) or from source.
## The 3 Main Components
- Establish a `pipeline_space`:

```python
pipeline_space = {
    "some_parameter": (0.0, 1.0),  # float
    "another_parameter": (0, 10),  # integer
    "optimizer": ["sgd", "adam"],  # categorical
    "epoch": neps.Integer(lower=1, upper=100, is_fidelity=True),
    "learning_rate": neps.Float(lower=1e-5, upper=1, log=True),
    "alpha": neps.Float(lower=0.1, upper=1.0, prior=0.99, prior_confidence="high"),
}
```
- Define an `evaluate_pipeline()` function:

```python
def evaluate_pipeline(some_parameter: float,
                      another_parameter: int,
                      optimizer: str,
                      epoch: int,
                      learning_rate: float,
                      alpha: float) -> float:
    model = make_model(...)
    loss = eval_model(model)
    return loss
```
- Execute with `neps.run()`:
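Putting the three components together, a minimal `neps.run()` call might look like the following sketch; the argument names (`evaluate_pipeline`, `pipeline_space`, `root_directory`, `max_evaluations_total`) follow the NePS documentation but may differ between versions:

```python
import neps

neps.run(
    evaluate_pipeline=evaluate_pipeline,
    pipeline_space=pipeline_space,
    root_directory="results",      # where NePS stores its state and results
    max_evaluations_total=50,      # stop after 50 evaluated configurations
)
```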
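The `evaluate_pipeline()` sketch above elides the actual model training. As a runnable stand-in, here is a hypothetical toy objective with the same signature; the quadratic-bowl "loss" and the optimizer penalty are illustrative inventions, not part of NePS:

```python
def evaluate_pipeline(some_parameter: float,
                      another_parameter: int,
                      optimizer: str,
                      epoch: int,
                      learning_rate: float,
                      alpha: float) -> float:
    # Toy objective: distance from a made-up optimum at
    # some_parameter=0.5, learning_rate=1e-3.
    base = (some_parameter - 0.5) ** 2 + abs(learning_rate - 1e-3)
    # Pretend "adam" converges better than "sgd" on this problem.
    penalty = 0.0 if optimizer == "adam" else 0.1
    # More epochs (the fidelity) shrink the loss, mimicking longer training.
    return alpha * (base + penalty) / epoch

loss = evaluate_pipeline(0.5, 5, "adam", epoch=10, learning_rate=1e-3, alpha=0.5)
# loss == 0.0 here: these arguments sit exactly at the toy optimum
```

NePS minimizes the returned value, so any configuration closer to the toy optimum scores better.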
## What's Next?
The reference section provides detailed information on the individual components of NePS.
- How to use the `neps.run()` function to start the optimization process.
- The different search space options available.
- How to choose and configure the optimizer used.
- Declarative usage of NePS via YAML configuration files.
- How to define the `evaluate_pipeline()` function.
- How to use the CLI to run NePS from the command line.
- How to analyze the optimization runs.
Or discover the features of NePS through these practical examples:
- Hyperparameter Optimization (HPO): Learn the essentials of hyperparameter optimization with NePS.
- Multi-Fidelity Optimization: Understand how to leverage multi-fidelity optimization for efficient model tuning.
- Utilizing Expert Priors for Hyperparameters: Learn how to incorporate expert priors for more efficient hyperparameter selection.
- Additional NePS Examples: Explore more examples, including various use cases and advanced configurations in NePS.