Formula for Calculating Convolutional Layer Output Size

Given:

  • our input layer has a width of W and a height of H
  • our convolutional layer has a filter size F
  • we have a stride of S
  • a padding of P
  • and the number of filters K,

the following formula gives us the width of the next layer: W_out = [(W - F + 2P)/S] + 1.

The output height would be H_out = [(H - F + 2P)/S] + 1.

And the output depth would equal the number of filters: D_out = K.

The output volume would be W_out * H_out * D_out.
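
As a quick check, here is a minimal Python sketch of these formulas (the function name conv_output_shape is ours; integer division stands in for the floor that the brackets imply):

```python
def conv_output_shape(w, h, f, s, p, k):
    """Output volume (W_out, H_out, D_out) of a convolutional layer.

    w, h -- input width and height
    f    -- (square) filter size
    s    -- stride
    p    -- padding
    k    -- number of filters
    """
    w_out = (w - f + 2 * p) // s + 1
    h_out = (h - f + 2 * p) // s + 1
    return w_out, h_out, k

# The setup used in the quizzes below: 32x32x3 input, 8x8 filters, S=2, P=1
print(conv_output_shape(32, 32, 8, 2, 1, 20))  # (14, 14, 20)
```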


Being able to calculate the number of parameters in a neural network is useful because it lets us control how much memory the network uses.

Setup

H = height, W = width, D = depth

  • We have an input of shape 32x32x3 (HxWxD)
  • 20 filters of shape 8x8x3 (HxWxD)
  • A stride of 2 for both the height and width (S)
  • Zero padding of size 1 (P)

Output Layer

  • 14x14x20 (HxWxD)

Solution

There are 756,560 total parameters. That's a HUGE amount! Here's how we calculate it:

(8 * 8 * 3 + 1) * (14 * 14 * 20) = 756,560

8 * 8 * 3 is the number of weights, and we add 1 for the bias. Remember, without parameter sharing each of the 14 * 14 * 20 output neurons has its own copy of the filter weights and bias. So we multiply the parameters per neuron by the number of output neurons to get the final answer.
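
A quick arithmetic check in plain Python (no framework assumed):

```python
params_per_neuron = 8 * 8 * 3 + 1          # filter weights plus one bias
output_neurons = 14 * 14 * 20              # without sharing, each has its own copy
print(params_per_neuron * output_neurons)  # 756560
```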

Now we'd like you to calculate the number of parameters in the convolutional layer if every neuron in the output layer shares its parameters with every other neuron in the same channel.

This is the number of parameters actually used in a convolutional layer (tf.nn.conv2d()).

Setup

H = height, W = width, D = depth

  • We have an input of shape 32x32x3 (HxWxD)
  • 20 filters of shape 8x8x3 (HxWxD)
  • A stride of 2 for both the height and width (S)
  • Zero padding of size 1 (P)

Output Layer

  • 14x14x20 (HxWxD)

Hint

With parameter sharing, each neuron in an output channel shares its weights with every other neuron in that channel. So the number of parameters equals the number of weights in a single filter, plus one bias, multiplied by the number of channels in the output layer.

Solution

There are 3860 total parameters. That's 196 times fewer parameters! Here's how the answer is calculated:

(8 * 8 * 3 + 1) * 20 = 3840 + 20 = 3860

That's 3840 weights and 20 biases. This should look similar to the answer from the previous quiz; the only difference is that we multiply by 20 instead of by (14 * 14 * 20). Remember, with weight sharing the same filter is reused across an entire depth slice, so the 14 * 14 factor disappears and we're left with just 20.
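
To see the savings directly, here is the same comparison in plain Python (the variable names are ours):

```python
filter_params = 8 * 8 * 3 + 1              # one filter's weights plus its bias

unshared = filter_params * (14 * 14 * 20)  # 756560: one copy per output neuron
shared = filter_params * 20                # 3860: one copy per output channel

print(unshared // shared)                  # 196 -- the "196 times fewer" factor
```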

### Calculating the Number of Parameters in a Convolutional Layer of a CNN

In a convolutional neural network (CNN), a convolutional layer's parameters are determined mainly by its filters (kernels). Their number can be computed with the following formulas.

#### Parameter Count Formula

For a single kernel, the parameter count is:

\[ \text{params} = (k_h \times k_w \times d_{\text{in}}) + b \]

where:

- \( k_h \): kernel height.
- \( k_w \): kernel width.
- \( d_{\text{in}} \): number of input channels (i.e., the previous layer's output channels, or the number of color channels of the input image; an RGB image has 3).
- \( b \): number of bias terms; by default, each kernel has one bias.

If the layer contains multiple kernels, the total parameter count is:

\[ \text{total params} = m \times ((k_h \times k_w \times d_{\text{in}}) + 1) \]

where \( m \) is the number of kernels (also called the number of output channels). Note that some implementations do not use a bias term, in which case the \( +1 \) is dropped[^1].

#### Worked Example

Suppose a convolutional layer is configured as follows:

- Input size: \( 28 \times 28 \times 3 \) (height x width x channels)
- Kernel size: \( 3 \times 3 \)
- Output channels: 64
- Bias terms enabled

Then the layer's total parameter count is:

\[ \text{total params} = 64 \times ((3 \times 3 \times 3) + 1) = 64 \times (27 + 1) = 64 \times 28 = 1792 \][^3]

#### A Special Case

When a layer uses depthwise separable convolution, the parameter count drops significantly. A depthwise separable convolution runs in two steps: first, a convolution is applied to each input channel separately; second, the results are linearly combined across channels with a 1x1 convolution. This design eliminates most of the parameters a standard convolution requires[^1].

```python
def calculate_conv_params(kernel_height, kernel_width, input_channels,
                          output_channels, use_bias=True):
    """Parameter count of a standard convolutional layer."""
    params_per_filter = kernel_height * kernel_width * input_channels
    if use_bias:
        total_params = output_channels * (params_per_filter + 1)
    else:
        total_params = output_channels * params_per_filter
    return total_params

# Worked example from above: 3x3 kernels, 3 input channels, 64 output channels
total_parameters = calculate_conv_params(3, 3, 3, 64)
print(f"Total parameters: {total_parameters}")  # Total parameters: 1792
```
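
As a complement to the depthwise separable case described above, here is a minimal sketch of its parameter count (the function name and bias handling are our assumptions; the decomposition into depthwise and pointwise steps follows the description above):

```python
def calculate_separable_conv_params(kernel_height, kernel_width, input_channels,
                                    output_channels, use_bias=True):
    """Parameter count of a depthwise separable convolution:
    a per-channel depthwise step followed by a 1x1 pointwise step."""
    # Depthwise step: one k_h x k_w filter per input channel
    depthwise = kernel_height * kernel_width * input_channels
    # Pointwise step: 1x1 filters combining input channels into output channels
    pointwise = input_channels * output_channels
    if use_bias:
        depthwise += input_channels    # one bias per depthwise filter
        pointwise += output_channels   # one bias per pointwise filter
    return depthwise + pointwise

# Same configuration as the worked example: 3x3 kernels, 3 in, 64 out
print(calculate_separable_conv_params(3, 3, 3, 64))  # 286, versus 1792 standard
```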