smac.epm.gaussian_process.augmented

Classes

AugmentedLocalGaussianProcess(X_in, y_in, ...)

GloballyAugmentedLocalGaussianProcess(...[, ...])

VariationalGaussianProcess(kernel, X_inducing)

A variational GP to compute the position of the inducing points.

class smac.epm.gaussian_process.augmented.AugmentedLocalGaussianProcess(X_in, y_in, X_out, y_out, likelihood, base_covar_kernel)[source]

Bases: gpytorch.models.exact_gp.ExactGP

forward(x)[source]

Compute the prior values. If optimize_kernel_hps is set to True during the training phase, this model degenerates to a vanilla GP model with a ZeroMean mean module and base_covar as its covariance matrix. Otherwise, the partially sparse GP mean and kernel are applied here.

Return type

MultivariateNormal

set_augment_module(X_inducing)[source]

Set an augmentation module that will be used later for inference.

Parameters

X_inducing (torch.Tensor(N_inducing, D)) – inducing points; they must have the same feature dimensionality D as X_in

Return type

None

class smac.epm.gaussian_process.augmented.GloballyAugmentedLocalGaussianProcess(configspace, types, bounds, bounds_cont, bounds_cat, seed, kernel, num_inducing_points=2, likelihood=None, normalize_y=True, n_opt_restarts=10, instance_features=None, pca_components=None)[source]

Bases: smac.epm.gaussian_process.gpytorch.GPyTorchGaussianProcess

update_attribute(**kwargs)[source]

Update class attributes (for instance, the number of inducing points) from keyword arguments.

Return type

None
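A minimal sketch of what an attribute-update hook like this might do: set existing attributes from keyword arguments and reject unknown names. The class and attribute names here are illustrative, not SMAC's actual implementation.

```python
class SurrogateModel:
    """Illustrative stand-in for a surrogate model with updatable attributes."""

    def __init__(self, num_inducing_points=2):
        self.num_inducing_points = num_inducing_points

    def update_attribute(self, **kwargs):
        # Only update attributes that already exist on the instance.
        for key, value in kwargs.items():
            if not hasattr(self, key):
                raise AttributeError(
                    f"{self.__class__.__name__} has no attribute {key}"
                )
            setattr(self, key, value)


model = SurrogateModel()
model.update_attribute(num_inducing_points=8)  # model.num_inducing_points is now 8
```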

class smac.epm.gaussian_process.augmented.VariationalGaussianProcess(kernel, X_inducing)[source]

Bases: gpytorch.models.approximate_gp.ApproximateGP

A variational GP to compute the position of the inducing points. We only optimize for the position of the continuous dimensions and keep the categorical dimensions constant.

forward(x)[source]

Compute the posterior mean and variance for the given input x.

Parameters

x (torch.Tensor) – Input data

Return type

MultivariateNormal