This article is inspired by here and here.
- The main purpose of an activation function is to introduce non-linearity into a neural network (NN). Biologically, it mimics whether a neuron fires or not.
A neural network without activation functions would simply be a linear regression model. Neural networks are considered universal function approximators: they can compute and learn virtually any function, and almost any process we can think of can be represented as a functional computation in a neural network.
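To make the first point concrete, here is a minimal NumPy sketch (my own illustration, with arbitrary shapes and values) showing that stacking linear layers without an activation collapses into a single linear map, while inserting a ReLU between them breaks that equivalence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two stacked "layers" with no activation: y = W2 @ (W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)
x = rng.normal(size=3)

two_linear_layers = W2 @ (W1 @ x + b1) + b2

# The same mapping collapses into one linear layer: y = (W2 W1) x + (W2 b1 + b2)
W, b = W2 @ W1, W2 @ b1 + b2
one_linear_layer = W @ x + b
print(np.allclose(two_linear_layers, one_linear_layer))  # True -> no extra expressive power

# Adding a non-linearity (ReLU) between the layers breaks the equivalence,
# which is what lets deeper networks model non-linear functions.
relu = lambda z: np.maximum(z, 0.0)
with_activation = W2 @ relu(W1 @ x + b1) + b2
print(np.allclose(with_activation, one_linear_layer))  # generally False
```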
- An activation function must be differentiable so that gradient descent can be applied.
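As a sketch of why differentiability matters (again my own illustration, not from the sources): one gradient-descent step on a single neuron, where the derivative of the activation (tanh here) is chained into the weight update.

```python
import numpy as np

# Single neuron: y_hat = act(w * x + b), with squared-error loss.
act = np.tanh
act_grad = lambda z: 1.0 - np.tanh(z) ** 2   # derivative of tanh

w, b, lr = 0.5, 0.0, 0.1
x, y = 1.0, 0.8                              # one training example

z = w * x + b
y_hat = act(z)
loss = 0.5 * (y_hat - y) ** 2

# Chain rule: dL/dw = (y_hat - y) * act'(z) * x,  dL/db = (y_hat - y) * act'(z)
delta = (y_hat - y) * act_grad(z)
w -= lr * delta * x
b -= lr * delta
print(loss, w, b)
```

If the activation had no usable derivative (e.g. a hard step function, whose derivative is 0 almost everywhere), this update could not propagate any learning signal.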
- Some common activation functions:
- Sigmoid (deprecated). Output in