Activation Function

Intro

In practice, we hardly ever use the step function as the activation function $\phi(z)$.
Activation functions can be divided into two types: saturated and non-saturated.
Non-saturated activation functions avoid the vanishing-gradient problem and converge faster.
A function saturates when its output changes only slightly once $|x|$ is large, so its derivative there is close to zero; this is what causes gradients to vanish.

Step function

The step function's derivative is $\phi'(x) = 0$ everywhere (and undefined at the jump), so gradient-based training receives no signal from it and we never use it.
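As a quick numeric check, here is a minimal standalone sketch (plain Python, nothing library-specific) that estimates the step function's derivative by finite differences:

```python
# Finite-difference check that the step function's derivative is 0 away from
# the jump, so gradient descent receives no learning signal from it.
def step(x):
    return 1.0 if x > 0 else 0.0

eps = 1e-6
for x in (-2.0, -0.5, 0.5, 2.0):
    grad = (step(x + eps) - step(x - eps)) / (2 * eps)
    print(x, grad)  # 0.0 at every point off the jump
```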

Sigmoid Function

$$\phi(x) = \mathrm{sigmoid}(x) = \frac{1}{1+e^{-x}}$$
$$\phi'(x) = \phi(x)\,[1-\phi(x)]$$
The sigmoid function is a saturated activation function.
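To make the saturation claim from the intro concrete, here is a minimal NumPy sketch (the names `sigmoid` and `sigmoid_grad` are my own, not from any framework) that evaluates the formulas above; the derivative collapses toward zero as $|x|$ grows, which is exactly the vanishing-gradient behaviour:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # phi'(x) = phi(x) * (1 - phi(x)), using the formula above
    s = sigmoid(x)
    return s * (1.0 - s)

xs = np.array([0.0, 2.0, 5.0, 10.0])
print(sigmoid(xs))       # roughly [0.5, 0.88, 0.993, 1.0]
print(sigmoid_grad(xs))  # roughly [0.25, 0.105, 0.0066, 0.000045] -- shrinks toward 0
```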

tanh (Hyperbolic Tangent Function)

$$\phi(x) = \tanh(x) = \frac{e^{x}-e^{-x}}{e^{x}+e^{-x}}$$
$$\phi'(x) = 1 - \phi^{2}(x)$$
The tanh function is a saturated activation function.
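A similar sketch for tanh (again just the textbook formula, implemented with NumPy) checks the derivative identity $\phi'(x) = 1 - \phi^2(x)$ against a finite difference:

```python
import numpy as np

def tanh_grad(x):
    # phi'(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

x = 1.5
eps = 1e-6
numeric = (np.tanh(x + eps) - np.tanh(x - eps)) / (2 * eps)
print(tanh_grad(x), numeric)  # both about 0.1807, confirming the identity
# Like the sigmoid, tanh saturates: tanh_grad(10.0) is roughly 8e-9.
```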

ReLU (Rectified Linear Unit)

$$\phi(x) = \begin{cases} x & x > 0 \\ 0 & x \le 0 \end{cases}$$
Equivalently, $\phi(x) = \max(0, x)$.
$$\phi'(x) = \begin{cases} 1 & x > 0 \\ 0 & x \le 0 \end{cases}$$
The ReLU function is a non-saturated activation function.

ReLU is widely used in deep learning.
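A minimal NumPy sketch of ReLU and its derivative (the helper names are illustrative; deep-learning frameworks ship their own implementations):

```python
import numpy as np

def relu(x):
    # phi(x) = max(0, x), applied elementwise
    return np.maximum(0.0, x)

def relu_grad(x):
    # phi'(x) = 1 for x > 0, 0 otherwise
    return np.where(x > 0, 1.0, 0.0)

xs = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(relu(xs))       # [0.  0.  0.  0.5 3. ]
print(relu_grad(xs))  # [0.  0.  0.  1.  1. ] -- no saturation for positive inputs
```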

Leaky ReLU, PReLU, RReLU

The Leaky ReLU, PReLU, and RReLU functions all have a small slope when $x < 0$. They are non-saturated activation functions.
$$\phi(x) = \begin{cases} x & x > 0 \\ \frac{x}{a} & x \le 0 \end{cases}$$
Equivalently, $\phi(x) = \max(\mathit{leak} \cdot x,\ x)$, where $\mathit{leak} = \frac{1}{a}$ and $0 < \mathit{leak} < 1$.
$$\phi'(x) = \begin{cases} 1 & x > 0 \\ \frac{1}{a} & x \le 0 \end{cases}$$

Leaky ReLU

In Leaky ReLU, the leak is a fixed constant chosen in advance (a small value such as 0.01 is common).

PReLU (Parametric ReLU)

In PReLU, the leak is a learnable parameter that is trained together with the network's other weights.

RReLU (Randomized Leaky ReLU)

In RReLU, the leak is drawn at random from a given range during training; at test time it is fixed (typically to the average of that range).
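The three variants differ only in how the leak is chosen. Here is a minimal NumPy sketch (the names `leaky_relu`, `prelu_leak`, and `rrelu_leak`, the sampling range, and the constants 0.01 and 0.25 are all illustrative assumptions, not prescribed values):

```python
import numpy as np

def leaky_relu(x, leak=0.01):
    # Leaky ReLU: the leak is a fixed constant (0.01 here is just a common choice)
    return np.where(x > 0, x, leak * x)

# PReLU: the leak is a learnable parameter updated by backprop together with the
# other weights; shown here only as an illustrative initial value.
prelu_leak = 0.25

# RReLU (training): the leak is sampled from a given range (this range is only
# an example); at test time a single fixed value from the range is used instead.
rng = np.random.default_rng(0)
rrelu_leak = rng.uniform(1 / 8, 1 / 3)

x = np.array([-2.0, -0.5, 1.0, 3.0])
print(leaky_relu(x))                   # fixed leak
print(leaky_relu(x, leak=prelu_leak))  # PReLU-style (leak would be learned)
print(leaky_relu(x, leak=rrelu_leak))  # RReLU-style (leak sampled at random)
```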
