
# Neural Pipeline Search (NePS)


Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS). Its primary goal is to make HPO and NAS usable for deep learners in practice.

NePS houses both recently published and well-established algorithms that can all run massively parallel on distributed setups, with tools to analyze and restart runs, all tailored to the needs of deep learning experts.

## Key Features

In addition to the features offered by traditional HPO and NAS libraries, NePS stands out with:

  1. Hyperparameter Optimization (HPO) Efficient Enough For Deep Learning:
    NePS excels at efficiently tuning hyperparameters with algorithms that let users incorporate their prior knowledge, complemented by many other efficiency boosters.
  2. Neural Architecture Search (NAS) with Expressive Search Spaces:
    NePS provides capabilities for designing and optimizing architectures in an expressive and natural fashion.
  3. Zero-effort Parallelization and an Experience Tailored to DL:
    NePS simplifies parallelizing optimization tasks, both on individual machines and in distributed computing environments. As NePS is made for deep learners, all technical choices are made with DL in mind, and common DL tools such as TensorBoard are embraced.
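NePS parallelizes by pointing every worker process at the same results directory. The sketch below is a toy stand-in, not the actual NePS internals: the function `claim_and_evaluate`, the marker-file scheme, and the dummy objective are all invented here to illustrate how independent workers can divide configurations through a shared directory without talking to each other.

```python
import os
import tempfile


def claim_and_evaluate(root_directory, configs, worker_id):
    """Toy illustration: each worker atomically claims configurations
    via exclusive file creation in a shared directory, so no
    configuration is ever evaluated twice."""
    results = {}
    for i, config in enumerate(configs):
        marker = os.path.join(root_directory, f"config_{i}.claimed")
        try:
            # O_CREAT | O_EXCL makes the claim atomic: exactly one
            # worker succeeds in creating each marker file.
            fd = os.open(marker, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
        except FileExistsError:
            continue  # another worker already took this configuration
        # Dummy "validation error" standing in for a real training run.
        results[i] = (worker_id, config["lr"] ** 2)
    return results


root = tempfile.mkdtemp()
configs = [{"lr": 0.1 * k} for k in range(1, 6)]
r1 = claim_and_evaluate(root, configs, worker_id=1)
r2 = claim_and_evaluate(root, configs, worker_id=2)
```

Whichever worker reaches a configuration first wins it; together the workers cover every configuration exactly once, which is the essence of launching the same script multiple times.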


## Installation

To install the latest release from PyPI, run:

```bash
pip install neural-pipeline-search
```

## Basic Usage

Using neps always follows the same pattern:

  1. Define an evaluate_pipeline function capable of evaluating different architectural and/or hyperparameter configurations for your problem.
  2. Define a search space named pipeline_space of those parameters, e.g., via a dictionary.
  3. Call neps.run to optimize evaluate_pipeline over pipeline_space.

In code, the usage pattern can look like this:

```python
import logging

import neps


# 1. Define a function that accepts hyperparameters and computes the validation error
def evaluate_pipeline(
    hyperparameter_a: float, hyperparameter_b: int, architecture_parameter: str
) -> float:
    # Create your model
    model = MyModel(architecture_parameter)

    # Train and evaluate the model with your training pipeline
    validation_error = train_and_eval(model, hyperparameter_a, hyperparameter_b)
    return validation_error


# 2. Define a search space of parameters; use the same parameter names as in evaluate_pipeline
pipeline_space = dict(
    hyperparameter_a=neps.Float(
        lower=0.001, upper=0.1, log=True  # The search space is sampled in log space
    ),
    hyperparameter_b=neps.Integer(lower=1, upper=42),
    architecture_parameter=neps.Categorical(["option_a", "option_b"]),
)

# 3. Run the NePS optimization
logging.basicConfig(level=logging.INFO)
neps.run(
    evaluate_pipeline=evaluate_pipeline,
    pipeline_space=pipeline_space,
    root_directory="path/to/save/results",  # Replace with the actual path.
    max_evaluations_total=100,
)
```

## Examples

Discover how NePS works through the examples in our documentation.

## Contributing

Please see the documentation for contributors.

## Citations

For pointers on citing the NePS package and papers, refer to our documentation on citations.