Original: Manually decomposing backpropagation to understand vanishing and exploding gradients
Let's walk through a very simple handwritten formula derivation. First, define some variables and operations. Gradient of the weight in layer L (the last layer): dWL = dLoss * aL. Gradient of the vari...
2019-05-27 13:21:22 · 355 views
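The teaser above is cut off, but the idea it introduces can be sketched numerically. The following is a hypothetical minimal example (not code from the post): backpropagating a scalar gradient through a stack of sigmoid layers via the chain rule, which shows why the gradient shrinks layer by layer (each factor is at most |w| * 0.25 for a sigmoid), i.e. the vanishing-gradient effect. The weight value and layer count are assumptions chosen for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

w = 0.5          # assumed shared scalar weight for every layer
x = 1.0          # input
n_layers = 10

# Forward pass: record each layer's activation.
acts = [x]
for _ in range(n_layers):
    acts.append(sigmoid(w * acts[-1]))

# Backward pass: each chain-rule step multiplies the incoming gradient
# by w * sigma'(z), where sigma'(z) = a * (1 - a) <= 0.25 for a sigmoid.
grad = 1.0       # dLoss/da at the last layer
grads = []
for a_out in reversed(acts[1:]):
    grad *= w * a_out * (1.0 - a_out)   # sigmoid derivative a(1-a)
    grads.append(grad)

# grads[0] is the gradient one layer back; grads[-1] is at the first
# layer, which has been multiplied by ~0.12 ten times and is tiny.
print(grads[0], grads[-1])
```

With |w| > 4 the same loop instead multiplies the gradient by a factor that can exceed 1, which is the exploding-gradient case.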
Original: Ten Python tricks to improve code efficiency
namedtuple: if you are too lazy to create a class but you still want a variable that behaves like a class object, use namedtuple: from collections import namedtuple; user = n...
2019-05-23 14:04:47 · 295 views
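The snippet in the teaser is truncated after `user = n...`, so here is a complete version of the same stdlib `collections.namedtuple` pattern it describes; the type name and field names (`User`, `name`, `age`) are assumptions, since the original example is cut off.

```python
from collections import namedtuple

# namedtuple builds a lightweight, immutable class substitute:
# tuple storage, but attribute access like a regular object.
User = namedtuple("User", ["name", "age"])

user = User(name="Alice", age=30)
print(user.name)   # attribute access, no class definition needed
print(user.age)
```

Because instances are still tuples, they also support unpacking (`name, age = user`) and indexing (`user[0]`).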
Original: The full process of quantization-aware training in Tensorflow
You can train your quantized model either by restoring an already-trained floating-point model or from scratch. In either case, you first have to create a quantization training g...
2019-05-21 09:20:41 · 2526 views
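The teaser stops before describing the quantization training graph, so rather than guess at the TensorFlow API calls the post uses, here is a hypothetical pure-Python sketch of the fake-quantization arithmetic that quantization-aware training inserts into the forward pass: values are snapped to one of 2^bits uniform levels, then dequantized back to float, so training sees the rounding error the int8 model will have. The range `[-1, 1]` and the sample weights are assumptions for illustration.

```python
def fake_quantize(x, x_min, x_max, num_bits=8):
    """Quantize x to num_bits uniform levels in [x_min, x_max],
    then dequantize back to a float (simulated int8 rounding)."""
    levels = 2 ** num_bits - 1          # 255 representable steps for int8
    scale = (x_max - x_min) / levels    # float value of one integer step
    q = round((x - x_min) / scale)      # nearest integer level
    q = max(0, min(levels, q))          # clamp to the representable range
    return x_min + q * scale            # map back to float

weights = [-1.0, -0.3, 0.0, 0.42, 1.0]
fq = [fake_quantize(w, -1.0, 1.0) for w in weights]
print(fq)   # each value moved by at most half a quantization step
```

In a real quantization-aware training graph this operation is applied to weights and activations in the forward pass, while gradients flow through it as if it were the identity (the straight-through estimator).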