Original: Manually Decomposing Backpropagation to Understand Vanishing and Exploding Gradients
Let's walk through a very simple hand-worked formula derivation. #Define First, let's define some variables and operations. Gradient of the weight variable in layer L (the last layer): dWL = dLoss * aL. Gradient of the vari... (a minimal sketch follows below)
2019-05-27 13:21:22 304
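The preview cuts off before the derivation finishes. As a rough, runnable illustration of the same idea, here is a minimal NumPy sketch of my own (it assumes ReLU activations, a toy quadratic loss, and random weights, none of which come from the post) that backpropagates by hand through a stack of layers and reports the per-layer gradient norms, which shrink toward zero with small weights and blow up with large ones:

```python
import numpy as np

def manual_backprop_demo(weight_scale, n_layers=20, dim=32, seed=0):
    """Run a forward pass through n_layers ReLU layers, then backpropagate
    by hand and return the norm of dLoss/dW at every layer."""
    rng = np.random.default_rng(seed)
    W = [weight_scale * rng.standard_normal((dim, dim)) for _ in range(n_layers)]

    # Forward pass: a_{l+1} = relu(W_l @ a_l)
    a = [rng.standard_normal(dim)]
    for l in range(n_layers):
        a.append(np.maximum(0.0, W[l] @ a[-1]))

    # Toy loss 0.5 * ||a_L||^2, so dLoss/da_L = a_L
    delta = a[-1]
    grad_norms = [0.0] * n_layers
    for l in reversed(range(n_layers)):
        # Chain rule through ReLU: derivative is 1 where the activation is positive
        delta = delta * (a[l + 1] > 0)
        # Per-layer weight gradient dW_l = outer(delta, a_l),
        # the same shape of rule as "dWL = dLoss * aL" in the preview
        grad_norms[l] = np.linalg.norm(np.outer(delta, a[l]))
        # Push the error signal back to the previous layer
        delta = W[l].T @ delta
    return grad_norms

# Gradients at the earliest layers vanish with small weights and explode with large ones.
print("small weights:", ["%.1e" % g for g in manual_backprop_demo(0.1)[:3]])
print("large weights:", ["%.1e" % g for g in manual_backprop_demo(0.5)[:3]])
```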
Original: Ten Python Tricks to Improve Code Efficiency
From the blog post: namedtuple. If you don't want to write a full class but still need a variable that behaves like a class instance, use namedtuple: from collections import namedtuple user = n... (a self-contained sketch follows below)
2019-05-23 14:04:47 254
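The preview above is cut off mid-example. As a self-contained sketch of the namedtuple trick it describes (the field names and values here are my own placeholders, not necessarily the ones used in the post):

```python
from collections import namedtuple

# A namedtuple gives you named attribute access without writing a class.
User = namedtuple("User", ["name", "age", "email"])

user = User(name="alice", age=30, email="alice@example.com")

print(user.name)           # attribute access, like a class instance
print(user._asdict())      # convert the record to a dict of its fields
name, age, email = user    # still unpacks like a plain tuple
```

Because namedtuple instances are immutable and stored like tuples, they are also lighter-weight than an equivalent ad-hoc class with a per-instance `__dict__`.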
Original: The Complete Process of TensorFlow Quantization Training
From the blog post "The Complete Process of TensorFlow Quantization Training": You can train your quantized model either by restoring a previously trained floating-point model or from scratch. In either case, you first have to create a quantization training g... (a sketch follows below)
2019-05-21 09:20:41 2458
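The preview mentions creating a quantization training graph but is truncated. The post appears to describe TensorFlow 1.x quantization-aware training; here is a minimal sketch under that assumption, using the tf.contrib.quantize API (the toy model, hyperparameters, and checkpoint path are placeholders of mine, not the post's):

```python
import tensorflow as tf  # assumes TensorFlow 1.x, where tf.contrib.quantize exists

# 1) Build the ordinary floating-point model (a toy two-layer net as a stand-in).
inputs = tf.placeholder(tf.float32, [None, 784], name="inputs")
labels = tf.placeholder(tf.int64, [None], name="labels")
hidden = tf.layers.dense(inputs, 128, activation=tf.nn.relu)
logits = tf.layers.dense(hidden, 10)
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))

# 2) Rewrite the graph with fake-quantization ops BEFORE creating the train op.
#    quant_delay lets the float weights stabilize first, which matters when
#    training from scratch; when fine-tuning a restored float checkpoint it can be 0.
tf.contrib.quantize.create_training_graph(
    input_graph=tf.get_default_graph(), quant_delay=2000)

# 3) Create the optimizer and train as usual.
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Either restore a previously trained floating-point checkpoint...
    # saver.restore(sess, "float_model.ckpt")  # hypothetical path
    # ...or start training from scratch, feeding `inputs` and `labels` in a loop.
```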