Test - Week 3 Quiz
1. Which of the following are true? (Check all that apply)
result:null
2. The tanh activation is not always better than the sigmoid activation function for hidden units, because the mean of its output is closer to zero, and so it centers the data, making learning more complex for the next layer. True/False?
A.True
B.False
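A quick numerical sketch of the centering claim in the question: for zero-mean inputs, tanh outputs have a mean near zero, while sigmoid outputs have a mean near 0.5. (The sample size and random inputs below are illustrative assumptions, not part of the quiz.)

```python
import numpy as np

np.random.seed(1)
z = np.random.randn(10000)  # zero-mean pre-activations

sigmoid_out = 1 / (1 + np.exp(-z))  # outputs in (0, 1), mean near 0.5
tanh_out = np.tanh(z)               # outputs in (-1, 1), mean near 0.0

print(sigmoid_out.mean())  # close to 0.5
print(tanh_out.mean())     # close to 0.0
```

This centering of hidden-unit activations around zero is the usual argument for preferring tanh over sigmoid in hidden layers.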
3. Which of the following is a correct vectorized implementation of forward propagation for layer 2?
result:B
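For reference, a minimal sketch of vectorized forward propagation for layer 2, following the standard convention where each column of an activation matrix is one training example. The layer sizes and sigmoid activation here are illustrative assumptions; the quiz's actual answer choices are not reproduced.

```python
import numpy as np

np.random.seed(0)
m = 5                       # number of examples (assumed)
A1 = np.random.randn(4, m)  # activations from layer 1, shape (n1, m)
W2 = np.random.randn(1, 4)  # layer-2 weights, shape (n2, n1)
b2 = np.zeros((1, 1))       # layer-2 bias, broadcast across the m columns

# Vectorized forward propagation for layer 2:
Z2 = W2 @ A1 + b2           # linear step, shape (n2, m)
A2 = 1 / (1 + np.exp(-Z2))  # sigmoid activation, shape (n2, m)

print(A2.shape)  # (1, 5)
```

The key point the question tests is that Z2 = W2 @ A1 + b2 processes all m examples at once, with broadcasting handling the bias, instead of looping over examples.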
4. You are