Contents
- Deep Learning
- Feedforward Neural Network
- Neuron
- Output Layer
- Optimization
- Regularization
- Topic Classification
- Language Models as Classifiers
- Word Embeddings
- Training a Feed-Forward Neural Network Language Model
- Feed-Forward Neural Networks for POS Tagging
- Convolutional Networks
- Convolutional Networks for NLP
Feedforward Neural Networks Basics
Deep Learning
- A branch of machine learning
- A re-branded name for neural networks
- Deep: many layers are chained together in modern deep learning models
- Neural networks: historically inspired by the way computation works in the brain
  - Consist of computation units called neurons
Feedforward Neural Network
- Also called a multilayer perceptron
- Example architecture:
- Each arrow carries a weight, reflecting its importance
- Certain layers have non-linear activation functions
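The architecture described above (stacked layers of weighted connections, with non-linear activations in between) can be sketched as a tiny forward pass. The layer sizes and random weights here are purely illustrative assumptions:

```python
import numpy as np

# A tiny multilayer perceptron: input (3) -> hidden (4) -> output (2).
# Weight matrices hold one weight per "arrow" between layers.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # arrows into the hidden layer
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)   # arrows into the output layer

def forward(x):
    h = np.tanh(W1 @ x + b1)    # hidden layer with non-linear activation
    return W2 @ h + b2          # raw output scores

y = forward(np.array([1.0, 0.5, -0.2]))   # shape (2,)
```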
Neuron
- Each neuron is a function:
  - Given input x, computes a real value h
  - Scales the input (with weights, w) and adds an offset (bias, b)
  - Applies a non-linear function, such as the logistic sigmoid, hyperbolic tangent (tanh), or rectified linear unit (ReLU):

    h = tanh(w · x + b)

  - w and b are parameters of the model
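As a minimal sketch in Python, a single neuron is just a weighted sum plus a bias, passed through a non-linearity (tanh here); the input, weight, and bias values below are illustrative:

```python
import math

def neuron(x, w, b):
    """A single neuron: scale inputs x by weights w, add bias b,
    then apply a non-linear activation (tanh)."""
    z = sum(w_j * x_j for w_j, x_j in zip(w, x)) + b
    return math.tanh(z)

# Example: two inputs with illustrative weights and bias
h = neuron([1.0, -2.0], w=[0.5, 0.25], b=0.1)
```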
- Typically have several hidden units, e.g.

  h_i = tanh(Σ_j w_ij x_j + b_i)

- Each hidden unit has its own weights w_i and bias term b_i
- Can be expressed using matrix and vector operators:

  h = tanh(W x + b)

  - where W is a matrix comprising the weight vectors and b is a vector of all bias terms
  - the non-linear function is applied element-wise
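The matrix form h = tanh(Wx + b) maps directly onto a vectorized implementation; a sketch with NumPy, using illustrative sizes and values:

```python
import numpy as np

def hidden_layer(x, W, b):
    """Hidden layer as a matrix-vector product: h = tanh(Wx + b).
    W stacks one weight vector per hidden unit; b collects the biases.
    np.tanh applies the non-linearity element-wise."""
    return np.tanh(W @ x + b)

# 3 inputs -> 2 hidden units (illustrative values)
W = np.array([[0.5, -0.25, 0.1],
              [0.3,  0.8, -0.5]])
b = np.array([0.1, -0.2])
x = np.array([1.0, 2.0, 3.0])
h = hidden_layer(x, W, b)   # shape (2,)
```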
Output Layer
- For binary classification problems: sigmoid activation function
- For multi-class classification problems: softmax activation function, which ensures the outputs form a valid probability distribution
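Both output activations can be sketched in a few lines; the input scores below are illustrative:

```python
import numpy as np

def sigmoid(z):
    """Binary classification output: squashes a score into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    """Multi-class output: exponentiates and normalises the scores so
    they are non-negative and sum to one."""
    e = np.exp(z - np.max(z))   # subtract the max for numerical stability
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
```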