Exercise 1
Below, use a Keras Sequential model and Dense layers with sigmoid activations to construct the network described above.
import tensorflow as tf
from tensorflow.keras import Sequential

model = Sequential(
    [
        tf.keras.Input(shape=(400,)),    # specify input size
        ### START CODE HERE ###
        tf.keras.layers.Dense(units=25, activation='sigmoid'),
        tf.keras.layers.Dense(units=15, activation='sigmoid'),
        tf.keras.layers.Dense(units=1, activation='sigmoid'),
        ### END CODE HERE ###
    ], name = "my_model"
)
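As a quick, ungraded sanity check, you can confirm the architecture above. This is a minimal sketch assuming TensorFlow 2.x and NumPy are available in the lab environment:

model.summary()   # expect parameter counts of 25*400+25, 15*25+15, and 1*15+1

# Feed one dummy example through the network; the output is a single
# sigmoid value in (0, 1), assuming the model was built as shown above.
import numpy as np
dummy = np.zeros((1, 400))
print(model.predict(dummy).shape)   # (1, 1)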
Exercise 2
Below, build a dense-layer subroutine. The example in lecture used a for loop to visit each unit (j) in the layer, compute the dot product of the weights for that unit (W[:,j]) with the input, and add the bias for the unit (b[j]) to form z. An activation function g(z) is then applied to that result. This section will not utilize some of the matrix operations described in the optional lectures; these will be explored in a later section.
import numpy as np

def my_dense(a_in, W, b, g):
    """
    Computes dense layer
    Args:
      a_in (ndarray (n, )) : Data, 1 example
      W    (ndarray (n,j)) : Weight matrix, n features per unit, j units
      b    (ndarray (j, )) : bias vector, j units
      g                    : activation function (e.g. sigmoid, relu..)
    Returns
      a_out (ndarray (j,)) : j units
    """
    units = W.shape[1]
    a_out = np.zeros(units)
    ### START CODE HERE ###
    for i in range(units):
        w_i = W[:, i]    # i-th column of W: the weight vector for unit i
        a_out[i] = g(np.dot(a_in, w_i) + b[i])
        # Each unit takes the dot product of its weights with the full input,
        # adds its own bias, and passes the result through the activation
        # function; the outputs of all units together form the layer's output.
    ### END CODE HERE ###
    return a_out
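To exercise my_dense on a tiny example, you can run something like the sketch below. The sigmoid helper here is a stand-in assumption for self-containment; the lab itself provides its own sigmoid function:

def sigmoid(z):
    # Stand-in for the lab's sigmoid helper (assumption, not the graded code).
    return 1 / (1 + np.exp(-z))

x_tst = np.array([1.0, 2.0])                    # one example, 2 features
W_tst = np.array([[1.0, -3.0, 5.0],
                  [2.0,  4.0, -6.0]])           # 2 features x 3 units
b_tst = np.array([-1.0, 1.0, 2.0])              # 3 units
print(my_dense(x_tst, W_tst, b_tst, sigmoid))   # array of shape (3,)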
Exercise 3
Below, compose a new my_dense_v subroutine that performs the layer calculations for a matrix of examples. This will utilize np.matmul().
Note: This function is not graded because it is discussed in the optional lectures on vectorization. If you didn't go through them, feel free to click the hints below the expected code to reveal it. You can also submit the notebook even with a blank answer here.
def my_dense_v(A_in, W, b, g):
    """
    Computes dense layer
    Args:
      A_in (ndarray (m,n)) : Data, m examples, n features each
      W    (ndarray (n,j)) : Weight matrix, n features per unit, j units
      b    (ndarray (1,j)) : bias vector, j units
      g                    : activation function (e.g. sigmoid, relu..)
    Returns
      A_out (tf.Tensor or ndarray (m,j)) : m examples, j units
    """
    ### START CODE HERE ###
    # A_in @ W has shape (m,j); the (1,j) bias broadcasts across the m examples.
    A_out = g(np.matmul(A_in, W) + b)
    ### END CODE HERE ###
    return A_out
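To convince yourself the vectorized version matches the loop version, you can compare the two on a small batch. This sketch reuses the sigmoid stand-in and the W_tst/b_tst test values from the sketch above:

A_tst = np.array([[1.0, 2.0],
                  [3.0, 4.0]])                      # 2 examples, 2 features
b_row = b_tst.reshape(1, -1)                        # (1,3) row vector, as the docstring expects

A_vec  = my_dense_v(A_tst, W_tst, b_row, sigmoid)   # whole batch at once
A_loop = np.array([my_dense(a, W_tst, b_tst, sigmoid) for a in A_tst])
print(np.allclose(A_vec, A_loop))                   # True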