Module: Cell_BLAST.prob

Probabilistic / decoder modules for DIRECTi

Classes:

LN(output_dim, full_latent_dim[, h_dim, ...])

Build a Log Normal generative module.

MSE(*args, **kwargs)

Build a Mean Squared Error generative module.

NB(output_dim, full_latent_dim[, h_dim, ...])

Build a Negative Binomial generative module.

ProbModel(output_dim, full_latent_dim[, ...])

Abstract base class for generative model modules.

ZILN(output_dim, full_latent_dim[, h_dim, ...])

Build a Zero-Inflated Log Normal generative module.

ZINB(output_dim, full_latent_dim[, h_dim, ...])

Build a Zero-Inflated Negative Binomial generative module.

class Cell_BLAST.prob.LN(output_dim, full_latent_dim, h_dim=128, depth=1, dropout=0.0, lambda_reg=0.0, fine_tune=False, deviation_reg=0.0, name='LN', _class='LN', **kwargs)[source]

Build a Log Normal generative module.

Parameters:
  • output_dim (int) – Dimensionality of the output tensor.

  • full_latent_dim (Tuple[int]) – Dimensionality of the latent variable, followed by the number of batches in each batch effect.

  • h_dim (int) – Dimensionality of the hidden layers in the decoder MLP.

  • depth (int) – Number of hidden layers in the decoder MLP.

  • dropout (float) – Dropout rate.

  • lambda_reg (float) – Not used by this module (accepted for interface consistency with the other generative modules).

  • fine_tune (bool) – Whether the module is used in fine-tuning.

  • deviation_reg (float) – Regularization strength for the deviation from original model weights.

  • name (str) – Name of the module.
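
A minimal construction sketch based on the signature above, assuming 2000 output features and a single batch effect with 2 batches (both values are hypothetical placeholders, not values from this documentation):

>>> from Cell_BLAST import prob
>>> ln = prob.LN(
...     output_dim=2000,          # hypothetical number of output features (e.g. genes)
...     full_latent_dim=(10, 2),  # hypothetical: 10 latent dimensions, one batch effect with 2 batches
...     h_dim=128,                # width of each hidden layer in the decoder MLP
...     depth=1,                  # number of hidden layers
...     dropout=0.1,              # dropout rate
... )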

Methods:

forward(full_x, feed_dict)

Defines the computation performed at every call.

forward(full_x, feed_dict)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Return type: Tensor

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
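
Following this note, a minimal calling sketch (the tensor shapes and the feed_dict key are illustrative assumptions; the actual contents of feed_dict are not documented here):

>>> import torch
>>> full_x = torch.randn(64, 12)              # hypothetical latent sample concatenated with batch encoding
>>> feed_dict = {"x": torch.randn(64, 2000)}  # hypothetical observed-data dictionary
>>> loss = ln(full_x, feed_dict)              # calling the instance runs registered hooks; ln.forward() would not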

class Cell_BLAST.prob.MSE(*args, **kwargs)[source]

Build a Mean Squared Error generative module.

class Cell_BLAST.prob.NB(output_dim, full_latent_dim, h_dim=128, depth=1, dropout=0.0, lambda_reg=0.0, fine_tune=False, deviation_reg=0.0, name='NB', _class='NB', **kwargs)[source]

Build a Negative Binomial generative module.

Parameters:
  • output_dim (int) – Dimensionality of the output tensor.

  • full_latent_dim (Tuple[int]) – Dimensionality of the latent variable, followed by the number of batches in each batch effect.

  • h_dim (int) – Dimensionality of the hidden layers in the decoder MLP.

  • depth (int) – Number of hidden layers in the decoder MLP.

  • dropout (float) – Dropout rate.

  • lambda_reg (float) – Regularization strength for the generative model parameters. Here, the log-scale variance of the scale parameter is regularized to improve numerical stability (see the likelihood sketch after this parameter list).

  • fine_tune (bool) – Whether the module is used in fine-tuning.

  • deviation_reg (float) – Regularization strength for the deviation from original model weights.

  • name (str) – Name of the module.
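
For reference, the negative binomial likelihood in the mean–dispersion parameterization common to count decoders (an assumption about the exact form used by this module) is

\mathrm{NB}(x \mid \mu, \theta) = \frac{\Gamma(x + \theta)}{\Gamma(\theta)\, x!} \left(\frac{\theta}{\theta + \mu}\right)^{\theta} \left(\frac{\mu}{\theta + \mu}\right)^{x}

where \mu is the decoder-predicted mean and \theta is the scale (dispersion) parameter whose stability the lambda_reg penalty targets.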

Methods:

forward(full_x, feed_dict)

Defines the computation performed at every call.

forward(full_x, feed_dict)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Return type: Tensor

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class Cell_BLAST.prob.ProbModel(output_dim, full_latent_dim, h_dim=128, depth=1, dropout=0.0, lambda_reg=0.0, fine_tune=False, deviation_reg=0.0, name='ProbModel', _class='ProbModel', **kwargs)[source]

Abstract base class for generative model modules.
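
All concrete modules above share this constructor signature and override forward. A minimal subclass sketch (the forward body is a hypothetical placeholder; ProbModel's internal attributes are not documented here):

>>> import torch
>>> from Cell_BLAST import prob
>>> class MyProb(prob.ProbModel):
...     def forward(self, full_x, feed_dict):
...         # hypothetical: return a Tensor, as the interface requires
...         return torch.zeros(full_x.shape[0])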

class Cell_BLAST.prob.ZILN(output_dim, full_latent_dim, h_dim=128, depth=1, dropout=0.0, lambda_reg=0.0, fine_tune=False, deviation_reg=0.0, name='ZILN', _class='ZILN', **kwargs)[source]

Build a Zero-Inflated Log Normal generative module.

Parameters:
  • output_dim (int) – Dimensionality of the output tensor.

  • full_latent_dim (Tuple[int]) – Dimensionality of the latent variable, followed by the number of batches in each batch effect.

  • h_dim (int) – Dimensionality of the hidden layers in the decoder MLP.

  • depth (int) – Number of hidden layers in the decoder MLP.

  • dropout (float) – Dropout rate.

  • lambda_reg (float) – Not used by this module (accepted for interface consistency with the other generative modules).

  • fine_tune (bool) – Whether the module is used in fine-tuning.

  • deviation_reg (float) – Regularization strength for the deviation from original model weights.

  • name (str) – Name of the module.
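
For reference, a zero-inflated log-normal density mixes a point mass at zero with a log-normal component (an assumption about the exact parameterization used by this module):

p(x) = \pi\, \delta_0(x) + (1 - \pi)\, \mathrm{LogNormal}(x \mid m, s^2)

where \pi is the zero-inflation (dropout) probability and m, s are produced by the decoder.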

Methods:

forward(full_x, feed_dict)

Defines the computation performed at every call.

forward(full_x, feed_dict)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Return type: Tensor

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class Cell_BLAST.prob.ZINB(output_dim, full_latent_dim, h_dim=128, depth=1, dropout=0.0, lambda_reg=0.0, fine_tune=False, deviation_reg=0.0, name='ZINB', _class='ZINB', **kwargs)[source]

Build a Zero-Inflated Negative Binomial generative module.

Parameters:
  • output_dim (int) – Dimensionality of the output tensor.

  • full_latent_dim (Tuple[int]) – Dimensionality of the latent variable, followed by the number of batches in each batch effect.

  • h_dim (int) – Dimensionality of the hidden layers in the decoder MLP.

  • depth (int) – Number of hidden layers in the decoder MLP.

  • dropout (float) – Dropout rate.

  • lambda_reg (float) – Regularization strength for the generative model parameters. Here, the log-scale variance of the scale parameter is regularized to improve numerical stability (see the likelihood sketch after this parameter list).

  • fine_tune (bool) – Whether the module is used in fine-tuning.

  • deviation_reg (float) – Regularization strength for the deviation from original model weights.

  • name (str) – Name of the module.
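
Analogously, the zero-inflated negative binomial mixes a point mass at zero with the negative binomial likelihood shown under NB above (again an assumption about the exact parameterization):

\mathrm{ZINB}(x \mid \mu, \theta, \pi) = \pi\, \delta_0(x) + (1 - \pi)\, \mathrm{NB}(x \mid \mu, \theta)

so \pi absorbs excess zeros while the negative binomial component models the remaining counts.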

Methods:

forward(full_x, feed_dict)

Defines the computation performed at every call.

forward(full_x, feed_dict)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Return type: Tensor

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.