Machine Learning
Average article quality score: 95
檀檀吸甲烷
School of Information Management, Nanjing University
Articles in this column
Attention Mechanism
From the University of Waterloo. 1 Attention Overview. 1.1 RNNs' challenges: long-range dependencies (one remedy is to combine the RNN with an attention mechanism); gradient vanishing and explosion; a large number of training steps, since we must unroll the network for as many steps as ne…
Original · 2021-10-05 01:10:03 · 336 reads · 0 comments
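The attention mechanism the preview summarizes can be sketched in a few lines. This is a minimal, illustrative implementation of scaled dot-product attention in pure Python (the function names and toy vectors are my own, not from the article): each key is scored against the query, the scores are softmax-normalized, and the values are blended by those weights.

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Attention-weighted sum of `values` given one query vector."""
    d = len(query)
    # Score each key against the query (scaled dot product).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Blend the value vectors by their attention weights.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Toy example: the query matches the first key, so the output
# leans toward the first value vector.
out = attention([1.0, 0.0],
                keys=[[1.0, 0.0], [0.0, 1.0]],
                values=[[10.0, 0.0], [0.0, 10.0]])
```

Because the weights sum to 1, the output is always a convex combination of the values, which is what lets attention select relevant time steps without unrolling a recurrence.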
Machine_Learning_Regularization
In this chapter, we introduce regularization methods with some hands-on Python code. Contents: Regularization · Lasso · Ridge · Elastic Net · Lasso: a real case · GridSearchCV · LassoCV · LassoLarsIC. Source: Regularization. Why regularization: in the bias…
Original · 2021-10-16 11:09:32 · 354 reads · 0 comments
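The core idea behind Ridge regularization can be shown without any library. Below is a minimal sketch (data and penalty values are illustrative, not from the article) for a single feature with no intercept, where the L2-penalized least-squares solution has the closed form w = Σxy / (Σx² + λ):

```python
def ridge_weight(xs, ys, lam):
    """Closed-form ridge coefficient for the model y ≈ w * x."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    # The penalty term lam inflates the denominator, shrinking w.
    return sxy / (sxx + lam)

xs = [1.0, 2.0, 3.0]
ys = [2.1, 3.9, 6.2]   # roughly y = 2x with noise

# Increasing the penalty shrinks the coefficient toward zero.
weights = [ridge_weight(xs, ys, lam) for lam in (0.0, 1.0, 10.0)]
```

With λ = 0 this reduces to ordinary least squares; as λ grows, the coefficient shrinks monotonically toward zero, which is exactly the bias-for-variance trade the chapter discusses. Lasso behaves similarly but can shrink coefficients exactly to zero.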
神经网络激活函数
Activation Functions. An activation function transforms the weighted sum of a node's inputs into that node's output. Sigmoid: the Sigmoid function, also known as the logistic function, maps the real line into (0, 1), and its derivative is non-negative everywhere. As a neural-network activation function it is most often applied to hidden-layer outputs. S(x) = \frac{1}{1+e^{-x}}, \qquad S'(x) = \frac{e^{-x}}{(1+e^{-x})^2} = S(x)\cdot(1-S(x))
Original · 2021-06-23 15:29:51 · 411 reads · 0 comments
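The derivative identity S'(x) = S(x)(1 − S(x)) stated above is easy to verify numerically. A minimal sketch (the test point and step size are illustrative):

```python
import math

def sigmoid(x):
    # Maps any real number into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Derivative expressed through the function's own value:
    # S'(x) = S(x) * (1 - S(x)).
    s = sigmoid(x)
    return s * (1.0 - s)

# Central-difference approximation agrees with the closed form.
h = 1e-6
x = 0.7
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
```

This identity is why sigmoid layers are cheap to backpropagate through: the forward-pass value S(x) is enough to compute the gradient, with no extra exponentials.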
Perceptrons and single layer neural nets
Introduces ANNs, Threshold Perceptron Learning, Sigmoid Perceptron Learning, the Perceptron Algorithm, and several theorems.
Original · 2021-10-10 13:41:33 · 612 reads · 0 comments
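The Perceptron Algorithm the entry mentions fits in a few lines. This is a minimal sketch on the linearly separable AND function (the learning rate, epoch count, and dataset are illustrative choices of mine): a threshold unit's weights are nudged toward each misclassified sample until the data are separated.

```python
def predict(w, b, x):
    # Threshold unit: fire iff the weighted sum plus bias is positive.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), target) pairs with targets in {0, 1}."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(w, b, x)
            # Update rule: move the boundary toward misclassified points.
            w[0] += lr * error * x[0]
            w[1] += lr * error * x[1]
            b += lr * error
    return w, b

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

By the perceptron convergence theorem (one of the theorems such notes typically cover), this loop is guaranteed to find a separating boundary whenever the data are linearly separable, as AND is.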