tf.keras.activations.selu
```python
tf.keras.activations.selu(x)
```

Defined in tensorflow/python/keras/activations.py.

Scaled Exponential Linear Unit (Klambauer et al., 2017).

Arguments:
x: A tensor or variable to compute the activation function for.
Returns:
Tensor with the same shape and dtype as `x`.
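A minimal usage sketch (assuming TensorFlow 2-style eager execution; the input values are arbitrary):

```python
import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0])
y = tf.keras.activations.selu(x)
# Positive entries are scaled by ~1.0507; negative entries follow
# scale * alpha * (exp(x) - 1) and saturate near -1.7581.
```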
```python
# Imports added for self-containment; paths follow the TF 1.x source layout.
from tensorflow.python.keras import backend as K
from tensorflow.python.util.tf_export import tf_export


@tf_export('keras.activations.selu')
def selu(x):
  """Scaled Exponential Linear Unit (Klambauer et al., 2017).

  Arguments:
      x: A tensor or variable to compute the activation function for.

  Returns:
      Tensor with the same shape and dtype as `x`.

  # Note
      - To be used together with the initialization "lecun_normal".
      - To be used together with the dropout variant "AlphaDropout".
  """
  # Fixed-point constants from Klambauer et al. (2017); they make the
  # activation preserve zero mean and unit variance for suitably
  # initialized inputs.
  alpha = 1.6732632423543772848170429916717
  scale = 1.0507009873554804934193349852946
  return scale * K.elu(x, alpha)
```
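Since `K.elu(x, alpha)` returns `x` where `x > 0` and `alpha * (exp(x) - 1)` elsewhere, the function is equivalent to the piecewise form below. This NumPy sketch (`selu_reference` is a hypothetical helper, not part of the API) checks both branches:

```python
import numpy as np

ALPHA = 1.6732632423543772848170429916717
SCALE = 1.0507009873554804934193349852946

def selu_reference(x):
    # Piecewise form: SCALE * x for x > 0, SCALE * ALPHA * (exp(x) - 1) otherwise.
    x = np.asarray(x, dtype=np.float64)
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

# Positive branch is a pure rescaling; negative branch saturates at -SCALE * ALPHA.
assert np.isclose(selu_reference(2.0), SCALE * 2.0)
assert np.isclose(selu_reference(-50.0), -SCALE * ALPHA)
```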
Note
- To be used together with the initialization "lecun_normal".
- To be used together with the dropout variant "AlphaDropout" (see the sketch below).
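A sketch of the pairing the note describes, combining the "lecun_normal" initializer with AlphaDropout so activations stay approximately normalized through the network (the layer sizes and dropout rate here are arbitrary illustrations):

```python
import tensorflow as tf

# Self-normalizing stack: selu activation + lecun_normal init + AlphaDropout.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='selu',
                          kernel_initializer='lecun_normal',
                          input_shape=(32,)),
    tf.keras.layers.AlphaDropout(0.1),  # preserves mean/variance, unlike plain Dropout
    tf.keras.layers.Dense(10),
])
```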