Article Outline
In this article, we test the Sigmoid and ReLU activation functions on the MNIST dataset, using a neural network with two hidden layers.
Neural Network Rectified Linear Unit (ReLU) vs Sigmoid
Objectives are as follows:
1. Define several neural networks, a criterion function, and an optimizer.
2. Test Sigmoid and ReLU.
3. Analyze the results.
In this lab, you will test the Sigmoid and ReLU activation functions on the MNIST dataset using a network with two hidden layers; sketches of such networks and of the training workflow appear after the import block below.
Estimated time: 30 minutes.
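For reference, the two activation functions being compared are:

sigmoid(z) = 1 / (1 + e^(-z))
ReLU(z) = max(0, z)

Sigmoid saturates for large |z|, so its gradient can vanish as it propagates through multiple layers, while ReLU keeps a constant gradient of 1 for positive inputs. This difference is what the lab's comparison is designed to expose, and it is why ReLU often trains faster on networks with more than one hidden layer.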
We’ll need the following libraries:
# Import the libraries we need for this lab
# Uncomment the following line to install the torchvision library
# !conda install -y torchvision
import torch
import torch.nn as nn
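The first objective is to define the networks. Below is a minimal sketch of two such models, each with two hidden layers and differing only in activation function. The class names (Net, NetRelu) and the constructor signature are illustrative assumptions, not code taken from the article.

# A minimal sketch (assumed structure, not the article's exact code):
# two networks with two hidden layers each, differing only in activation.
class Net(nn.Module):
    # Two-hidden-layer network using Sigmoid activations
    def __init__(self, D_in, H1, H2, D_out):
        super(Net, self).__init__()
        self.linear1 = nn.Linear(D_in, H1)
        self.linear2 = nn.Linear(H1, H2)
        self.linear3 = nn.Linear(H2, D_out)

    def forward(self, x):
        x = torch.sigmoid(self.linear1(x))
        x = torch.sigmoid(self.linear2(x))
        return self.linear3(x)

class NetRelu(nn.Module):
    # Two-hidden-layer network using ReLU activations
    def __init__(self, D_in, H1, H2, D_out):
        super(NetRelu, self).__init__()
        self.linear1 = nn.Linear(D_in, H1)
        self.linear2 = nn.Linear(H1, H2)
        self.linear3 = nn.Linear(H2, D_out)

    def forward(self, x):
        x = torch.relu(self.linear1(x))
        x = torch.relu(self.linear2(x))
        return self.linear3(x)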

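To cover the remaining objectives (criterion, optimizer, testing, and analysis), here is a hedged sketch of the rest of the workflow, building on the Net and NetRelu classes above. The hyperparameters (hidden sizes of 50, batch sizes, learning rate 0.01, 10 epochs) and the choice of plain SGD are illustrative assumptions, not values from the article.

import torch.utils.data
import torchvision.datasets as dsets
import torchvision.transforms as transforms

# Load MNIST (assumed root path './data')
train_dataset = dsets.MNIST(root='./data', train=True, download=True,
                            transform=transforms.ToTensor())
validation_dataset = dsets.MNIST(root='./data', train=False, download=True,
                                 transform=transforms.ToTensor())
train_loader = torch.utils.data.DataLoader(dataset=train_dataset,
                                           batch_size=2000, shuffle=True)
validation_loader = torch.utils.data.DataLoader(dataset=validation_dataset,
                                                batch_size=5000, shuffle=False)

# Criterion: cross-entropy loss for 10-class classification
criterion = nn.CrossEntropyLoss()

# Assumed dimensions: 28*28 inputs, two hidden layers of 50, 10 classes
input_dim, hidden1, hidden2, output_dim = 28 * 28, 50, 50, 10
model_sigmoid = Net(input_dim, hidden1, hidden2, output_dim)
model_relu = NetRelu(input_dim, hidden1, hidden2, output_dim)

def train(model, epochs=10, lr=0.01):
    # Optimizer: plain SGD (an assumption; the lab may use another)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    accuracy = []
    for epoch in range(epochs):
        for x, y in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(x.view(-1, 28 * 28)), y)
            loss.backward()
            optimizer.step()
        # Record validation accuracy after each epoch
        correct = 0
        for x, y in validation_loader:
            yhat = model(x.view(-1, 28 * 28))
            correct += (yhat.argmax(dim=1) == y).sum().item()
        accuracy.append(correct / len(validation_dataset))
    return accuracy

acc_sigmoid = train(model_sigmoid)
acc_relu = train(model_relu)

Comparing acc_sigmoid and acc_relu per epoch (printed or plotted) completes the analysis step; on MNIST at this depth, the ReLU network typically reaches a higher validation accuracy sooner than the Sigmoid one.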