RSWS

Bases: OneShotNASOptimizer

Implementation of the Random NAS optimization algorithm with weight sharing.

Random NAS (Neural Architecture Search) is a stochastic technique for searching over neural network architectures. This class inherits from OneShotNASOptimizer; at each step it sets the architecture weights by random sampling and trains the shared operation (op) weights on the sampled architecture.

Based on the paper "Random Search and Reproducibility for Neural Architecture Search" (Li et al., 2019).
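The core idea: instead of learning architecture weights by gradient descent, the optimizer samples one architecture uniformly at random at each step and trains only the shared operation weights of that sample. Below is a minimal, self-contained sketch of this idea in PyTorch; the TinySupernet and its single-edge structure are illustrative stand-ins, not this class's actual API.

```python
import torch
import torch.nn as nn

# Hypothetical toy supernet: one edge with several candidate ops whose
# weights are shared across every architecture that selects them.
class TinySupernet(nn.Module):
    def __init__(self, dim=16, num_ops=3):
        super().__init__()
        self.ops = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_ops))
        self.head = nn.Linear(dim, 2)

    def forward(self, x, op_idx):
        # Only the sampled op is active for this step.
        return self.head(self.ops[op_idx](x))

net = TinySupernet()
opt = torch.optim.SGD(net.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):
    x = torch.randn(8, 16)                             # dummy batch
    y = torch.randint(0, 2, (8,))                      # dummy labels
    op_idx = torch.randint(len(net.ops), (1,)).item()  # random architecture
    opt.zero_grad()
    loss = loss_fn(net(x, op_idx), y)
    loss.backward()                                    # trains only the sampled op's shared weights
    opt.step()
```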

add_alphas(edge) staticmethod

Adds architectural weights to edges in the neural network.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| edge | object | The edge in the neural network to which the architectural weights are to be added. | required |
Note

The architectural weights are added as a PyTorch Parameter and stored on the edge's data.
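As a rough illustration, the method might look like the sketch below. The EdgeData container and its set method are assumptions standing in for the library's real edge-data interface.

```python
import torch
import torch.nn as nn

class EdgeData:
    # Minimal stand-in for the library's edge-data container (assumed).
    def __init__(self, op):
        self.op = op                      # candidate operations on this edge
    def set(self, key, value, shared=False):
        setattr(self, key, value)

def add_alphas(edge_data):
    # One architecture weight per candidate op, kept as a PyTorch Parameter
    # and stored on the edge's data, mirroring the note above. No gradients
    # are needed because random sampling, not gradient descent, sets alphas.
    alpha = nn.Parameter(torch.zeros(len(edge_data.op)), requires_grad=False)
    edge_data.set("alpha", alpha, shared=True)

edge = EdgeData(op=[nn.Identity(), nn.ReLU(), nn.Tanh()])
add_alphas(edge)
print(edge.alpha)  # one zero-initialized entry per candidate op
```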

sample_random_and_update_alphas()

Samples a random architecture and updates the alpha values accordingly.

Note

This method samples on a temporary clone of the graph and then sets the alpha values according to the sampled architecture (see the sketch below).
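A hedged sketch of the sampling step: each edge's alpha vector is reset to a random one-hot vector, so exactly one candidate op is selected per edge. The list-of-parameters representation is an assumption for illustration.

```python
import torch

def sample_random_and_update_alphas(alphas):
    # Assumed representation: `alphas` is a list of per-edge architecture
    # parameters. Each is reset to a random one-hot vector, selecting
    # exactly one candidate op per edge.
    with torch.no_grad():
        for alpha in alphas:
            alpha.zero_()
            alpha[torch.randint(alpha.numel(), (1,))] = 1.0

alphas = [torch.nn.Parameter(torch.zeros(3), requires_grad=False) for _ in range(4)]
sample_random_and_update_alphas(alphas)
print(torch.stack(list(alphas)))  # each row is one-hot: the sampled architecture
```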

step(data_train, data_val)

Performs one optimization step to update both architecture and operation weights.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| data_train | tuple | Tuple containing training data and labels. | required |
| data_val | tuple | Tuple containing validation data and labels. | required |

Returns:

| Type | Description |
| ---- | ----------- |
| tuple | A tuple containing logits for the training data, logits for the validation data, the loss on the training data, and the loss on the validation data. |
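Putting the pieces together, one optimization step plausibly looks like the sketch below. The model, optimizer, and model.alphas attribute are assumptions, and it reuses sample_random_and_update_alphas from the earlier sketch; this is an illustration of the step's contract, not the class's actual implementation.

```python
import torch
import torch.nn.functional as F

def step(model, optimizer, data_train, data_val):
    # Hypothetical sketch of one RS-WS step: sample a random architecture,
    # train the shared op weights on it, then evaluate on validation data.
    x_train, y_train = data_train
    x_val, y_val = data_val

    sample_random_and_update_alphas(model.alphas)  # pick a random architecture

    optimizer.zero_grad()
    logits_train = model(x_train)
    train_loss = F.cross_entropy(logits_train, y_train)
    train_loss.backward()                          # only the sampled ops receive gradients
    optimizer.step()

    with torch.no_grad():                          # evaluate, don't train, on val data
        logits_val = model(x_val)
        val_loss = F.cross_entropy(logits_val, y_val)

    return logits_train, logits_val, train_loss, val_loss
```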