LeCun normal initializer.
```python
tf.keras.initializers.lecun_normal(
    seed=None
)
```
Initializers allow you to pre-specify an initialization strategy, encoded in the Initializer object, without knowing the shape and dtype of the variable being initialized.
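For example, a minimal sketch of this deferred pattern (the 3x3 shape here is arbitrary):

```python
import tensorflow as tf

# Construct the initializer up front; no shape or dtype is needed yet.
initializer = tf.keras.initializers.lecun_normal(seed=0)

# Supply shape and dtype only when the values are actually drawn.
values = initializer(shape=(3, 3), dtype=tf.float32)
print(values.shape)  # (3, 3)
```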
Draws samples from a truncated normal distribution centered on 0 with `stddev = sqrt(1 / fan_in)`, where `fan_in` is the number of input units in the weight tensor.
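A quick empirical check of this claim (a sketch; the `fan_in` of 256 and output size of 512 are arbitrary):

```python
import numpy as np
import tensorflow as tf

# For a [fan_in, fan_out] kernel, the sample standard deviation should
# be close to sqrt(1 / fan_in) = sqrt(1 / 256) = 0.0625.
fan_in = 256
kernel = tf.keras.initializers.lecun_normal(seed=0)(
    shape=(fan_in, 512), dtype=tf.float32)

print(float(tf.math.reduce_std(kernel)))  # ~0.0625
print(np.sqrt(1.0 / fan_in))              # 0.0625
```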
Examples:
```python
>>> def make_variables(k, initializer):
...   return (tf.Variable(initializer(shape=[k, k], dtype=tf.float32)),
...           tf.Variable(initializer(shape=[k, k, k], dtype=tf.float32)))
>>> v1, v2 = make_variables(3, tf.initializers.lecun_normal())
>>> v1
<tf.Variable ... shape=(3, 3) ...
>>> v2
<tf.Variable ... shape=(3, 3, 3) ...
>>> make_variables(4, tf.initializers.RandomNormal())
(<tf.Variable ... shape=(4, 4) dtype=float32...
 <tf.Variable ... shape=(4, 4, 4) dtype=float32...
```
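In practice the initializer is more often handed to a layer than called directly; the layer calls it with the kernel's shape when it builds. A minimal sketch (the `Dense` layer and unit count are arbitrary; pairing with SELU follows Klambauer et al., 2017, cited below):

```python
import tensorflow as tf

layer = tf.keras.layers.Dense(
    units=64,
    activation="selu",  # SELU + lecun_normal, per Klambauer et al., 2017
    kernel_initializer=tf.keras.initializers.lecun_normal(seed=42),
)
```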
| Arguments | |
|---|---|
| `seed` | A Python integer. Used to seed the random generator. |
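For instance, two instances built with the same seed draw identical values (a sketch; assumes the deterministic seeding behavior of TF 2.x initializers):

```python
import tensorflow as tf

# Same seed, same draws: useful for reproducible experiments.
a = tf.keras.initializers.lecun_normal(seed=7)(shape=(2, 2))
b = tf.keras.initializers.lecun_normal(seed=7)(shape=(2, 2))
print(bool(tf.reduce_all(tf.equal(a, b))))  # True
```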
| Returns |
|---|
| A callable `Initializer` with `shape` and `dtype` arguments which generates a tensor. |
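The returned callable also accepts a dtype at call time, for example (a sketch):

```python
import tensorflow as tf

init = tf.keras.initializers.lecun_normal(seed=1)
v64 = init(shape=(2, 2), dtype=tf.float64)  # request double precision
print(v64.dtype)  # <dtype: 'float64'>
```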
References:
- Self-Normalizing Neural Networks, Klambauer et al., 2017
- Efficient Backprop, LeCun et al., 1998