an n-dimensional Tensor
the negative slope of the rectifier used after this layer (only used with 'leakyRelu')
either 'fanIn' (default) or 'fanOut'. Choosing 'fanIn'
preserves the magnitude of the variance of the weights in the
forward pass. Choosing 'fanOut' preserves the magnitudes in the
backwards pass.
the non-linear function; recommended for use only with 'relu' or 'leakyRelu' (default).
Fills the input Tensor with values according to the method described in Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification - He, K. et al. (2015), using a uniform distribution. The resulting tensor will have values sampled from U(-bound, bound), where bound = gain * sqrt(3 / fanMode). Also known as He initialization.
This is the default initializer for Conv and Linear layers.
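As a rough illustration of the formula above, here is a minimal TypeScript sketch that computes the bound and fills a plain array. The names kaimingUniform, gainFor, and fanFor, the [outFeatures, inFeatures] shape convention, and the default a = 0 are assumptions for illustration only, not this library's API.

```ts
// Minimal sketch of Kaiming/He uniform initialization, assuming a plain
// number[] weight buffer and a 2-D shape [outFeatures, inFeatures].

type FanMode = 'fanIn' | 'fanOut';
type Nonlinearity = 'relu' | 'leakyRelu';

// Recommended gain for the rectifier family: sqrt(2) for 'relu',
// sqrt(2 / (1 + a^2)) for 'leakyRelu' with negative slope a.
function gainFor(nonlinearity: Nonlinearity, a: number): number {
  return nonlinearity === 'relu' ? Math.SQRT2 : Math.sqrt(2 / (1 + a * a));
}

// For an [outFeatures, inFeatures] weight: fanIn = inFeatures, fanOut = outFeatures.
function fanFor(shape: [number, number], mode: FanMode): number {
  const [outFeatures, inFeatures] = shape;
  return mode === 'fanIn' ? inFeatures : outFeatures;
}

// Fills `weights` in place with samples from U(-bound, bound),
// where bound = gain * sqrt(3 / fan).
function kaimingUniform(
  weights: number[],
  shape: [number, number],
  mode: FanMode = 'fanIn',
  nonlinearity: Nonlinearity = 'leakyRelu',
  a = 0, // negative slope of the rectifier; the default value is an assumption
): void {
  const bound = gainFor(nonlinearity, a) * Math.sqrt(3 / fanFor(shape, mode));
  for (let i = 0; i < weights.length; i++) {
    weights[i] = (Math.random() * 2 - 1) * bound; // uniform in [-bound, bound)
  }
}

// Example: initialize a 128x64 Linear-style weight, preserving the variance
// of the forward pass via 'fanIn'.
const shape: [number, number] = [128, 64];
const weights = new Array<number>(shape[0] * shape[1]).fill(0);
kaimingUniform(weights, shape, 'fanIn', 'leakyRelu');
```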