BatchNorm

class myoverse.transforms.BatchNorm(eps=1e-05, **kwargs)

Batch normalization (normalizes over the batch dimension).

Note: This is a stateless version for inference. For training with running statistics, use torch.nn.BatchNorm1d.

Parameters:

eps (float) – Small value for numerical stability.

Examples

>>> import torch
>>> from myoverse.transforms import BatchNorm
>>> x = torch.randn(32, 64, 200, names=('batch', 'channel', 'time'))
>>> bnorm = BatchNorm()
>>> y = bnorm(x)  # Normalized over batch dimension
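
For reference, the stateless normalization is conceptually equivalent to subtracting the batch-wise mean and dividing by the batch-wise standard deviation, with eps added to the variance for stability. The sketch below is a minimal re-implementation under that assumption, not myoverse's actual code:

>>> import torch
>>> def batch_norm_stateless(x, eps=1e-5):
...     # Illustrative sketch: normalize each (channel, time) position
...     # across the first (batch) dimension only.
...     mean = x.mean(dim=0, keepdim=True)
...     var = x.var(dim=0, unbiased=False, keepdim=True)
...     return (x - mean) / torch.sqrt(var + eps)
>>> x = torch.randn(32, 64, 200)
>>> y = batch_norm_stateless(x)  # zero mean, unit variance along the batch axis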

Methods

__init__([eps])

_apply(x)
    Apply the transform.