alien.models.keras package

Submodules

alien.models.keras.keras module

Model classes for wrapping Keras-based models into ALiEN format.

class alien.models.keras.keras.KerasRegressor(model=None, X=None, y=None, **kwargs)[source]

Bases: MCDropoutRegressor

Base Class for wrapped Keras regression models.
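The following sketch shows one way such a wrapper might be constructed. The network architecture, the random training data, and the variable names are illustrative assumptions, not part of the ALiEN API.

    import numpy as np
    from tensorflow import keras

    from alien.models.keras.keras import KerasRegressor

    # A small Keras network; the Dropout layer is what later enables
    # MC-dropout uncertainty estimates.
    net = keras.Sequential([
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(1),
    ])
    net.compile(optimizer="adam", loss="mse")

    # Placeholder training data: 128 samples with 10 features each.
    X_train = np.random.rand(128, 10)
    y_train = np.random.rand(128)

    # Wrap the compiled network; X and y may be given here or passed to fit().
    model = KerasRegressor(model=net, X=X_train, y=y_train)
    model.fit()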

initialize(init_seed=None, sample_input=None)[source]

(Re)initializes the model weights. If self.reinitialize is True, this should be called at the start of every fit(), and this should be the default behaviour of fit().

get_weights()[source]

Return model weights.

set_weights(initial_weights=None)[source]

Set model weights according to initial_weights

save_initial_weights(sample_input=None)[source]

Save initial weights in self.initial_weights object.

fit_model(X=None, y=None, **kwargs)[source]

Fit just the model component, and not the uncertainties (if these are computed separately)

predict(X, **kwargs)[source]

Applies the model to input(s) X (with the last self.ndim axes corresponding to each sample), and returns prediction(s).

Parameters:

return_std_dev – if True, returns a tuple (prediction, std_dev)
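Continuing the hypothetical model above, the documented return_std_dev flag could be used as in this sketch:

    # Placeholder inputs with the same feature shape as the training data.
    X_new = np.random.rand(8, 10)

    preds = model.predict(X_new)                            # predictions only
    preds, std = model.predict(X_new, return_std_dev=True)  # tuple (prediction, std_dev)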

fix_dropouts()[source]

Retools dropouts for MC dropout prediction

predict_samples(X, n=1, multiple=1.0)[source]

Makes a prediction for the batch X, randomly selected from this model’s posterior distribution. Gives an ensemble of predictions, with shape (len(X), n).
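A sketch of the documented output shape, again using the hypothetical model and inputs from above:

    # n independent posterior samples per input row, shaped (len(X), n).
    samples = model.predict_samples(X_new, n=30)
    assert samples.shape == (len(X_new), 30)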

covariance(X)

Returns the covariance of the epistemic uncertainty between all rows of X. This is where memory bugs often appear, because of the large matrices involved.
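Assuming the covariance between all rows of X comes back as a square matrix, a sketch of the expected shape (and a reminder to keep batches small):

    # Epistemic covariance between every pair of rows in X_new.
    cov = model.covariance(X_new)
    assert cov.shape == (len(X_new), len(X_new))  # assumed square; grows quadratically with the batch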

covariance_ensemble(X: ArrayLike)

Compute covariance from the ensemble of predictions

property data
fit(X=None, y=None, reinitialize=None, fit_uncertainty=True, **kwargs)

Fits the model to the given training data. If X and y are not specified, this method looks for self.X and self.y. If fit() finds an X but not a y, it treats X as a combined dataset data, and then uses X, y = data.X, data.y. If we can’t find data.X and data.y, we instead use X, y = data[:-1], data[-1].

fit() should also fit any accompanying uncertainty model.

Parameters:
  • reinitialize – If True, reinitializes model weights before fitting. If False, starts training from previous weight values. If not specified, uses self.reinitialize.

  • fit_uncertainty – If True, a call to fit() will also call fit_uncertainty(). Defaults to True.
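A sketch of these calling conventions, continuing the hypothetical model from above:

    model.fit(X_train, y_train)        # explicit training data
    model.fit()                        # falls back to self.X and self.y
    model.fit(reinitialize=True)       # reinitialize weights before fitting
    model.fit(fit_uncertainty=False)   # skip the separate uncertainty fit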

fit_uncertainty(X=None, y=None)

Fit just the uncertainties (if these need additional fitting beyond just the model)

static load(path)

Loads a model. This particular implementation only works if save(path) hasn’t been overloaded.

property ndim

The number of axes in the feature space. Equal to len(self.shape). Most commonly equal to 1. If training data have been specified, then self.ndim == X.ndim - 1.

This property is used by any methods which use the @flatten_batch decorator.

predict_ensemble(X, **kwargs)

Returns an ensemble of predictions.

Parameters:

multiple – the standard deviation of the returned predictions is scaled up by this factor

save(path)

Saves the model. May well be overloaded by subclasses, if they contain non-picklable components (or pickling would be inefficient).

For any subclass, the save() and load() methods should be compatible with each other.
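Assuming this subclass keeps the default save()/load() pair, a round trip might look like the sketch below; the file name is a placeholder.

    model.save("keras_regressor.pkl")                       # placeholder path
    restored = KerasRegressor.load("keras_regressor.pkl")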

property shape

The shape of the feature space. Can either be specified directly, or inferred from training data, in which case self.shape == X.shape[1:], i.e., the first (batch) dimension is dropped.

This property is used by any methods which use the @flatten_batch decorator.
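For the hypothetical training data above (X_train.shape == (128, 10)), these two properties would be expected to report the following; the exact tuple type is an assumption.

    assert model.shape == (10,)   # X.shape[1:], with the batch dimension dropped
    assert model.ndim == 1        # len(model.shape) == X_train.ndim - 1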

std_dev(X, **kwargs)

Returns the (epistemic) standard deviation of the model on input X.

std_dev_ensemble(X)

Returns the (epistemic) standard deviation of the model on input X.
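A sketch of the expected usage, assuming one standard deviation per input row:

    sd = model.std_dev(X_new)
    assert sd.shape == (len(X_new),)   # assumed: one epistemic std. dev. per sample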

alien.models.keras.utils module

Helper functions for the Keras model wrappers.

alien.models.keras.utils.dropout_call(self, inputs, training=None)[source]

If training is True but not 1, uses dropout according to self.noise_dims. If training is 1, uses dropout, but holds it fixed along the batch.
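The difference between the two modes can be pictured with a plain NumPy sketch (an illustration of the idea, not the actual implementation):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 6))   # (batch, features)
    p = 0.5                       # dropout rate

    # training=True: an independent mask for every sample in the batch
    out_independent = x * (rng.random(x.shape) > p) / (1 - p)

    # training=1: one mask, held fixed along the batch dimension
    shared_mask = rng.random(x.shape[1:]) > p
    out_correlated = x * shared_mask / (1 - p)   # same units dropped for every sample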

alien.models.keras.utils.humble_batchnorm_call(self, inputs, training=None)[source]
alien.models.keras.utils.get_mod_layers(mod)[source]
alien.models.keras.utils.subobjects(module, skip=frozenset({}), only_layers=True)[source]

Traverses a module and all of its components.

Parameters:
  • module (keras.Model) – module to traverse

  • skip (Container) – A collection of modules to skip (along with their submodules)

  • only_layers – If True, only yields objects which are actually Keras layers.

Returns:

(bool) Whether or not it encountered any of the modules in skip
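Assuming subobjects() is a generator over the traversed components, a typical use might look like this sketch (reusing the hypothetical net from the KerasRegressor example above):

    from alien.models.keras.utils import subobjects

    # Yields only genuine Keras layers when only_layers=True.
    for layer in subobjects(net, only_layers=True):
        print(type(layer).__name__)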

alien.models.keras.utils.modify_dropout(obj)[source]

If obj is a Dropout, retools it to do properly correlated dropout inference.

Returns:

whether obj is a Dropout

Return type:

bool
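Combined with subobjects(), this could be used to retool every Dropout layer in a network, roughly in the spirit of KerasRegressor.fix_dropouts() (a sketch, not the library’s actual implementation):

    from alien.models.keras.utils import modify_dropout, subobjects

    for layer in subobjects(net, only_layers=True):
        modify_dropout(layer)   # returns False (and does nothing) for non-Dropout layers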

Module contents

Module for Keras model wrappers.