The Activation layer has 0 parameters, and that is expected.

The Activation layer in Keras does not have any trainable parameters, which is why you see 0 in the Param # column for this layer in the model summary.

The Activation layer applies an activation function to the output of the previous layer. It has no weights or biases because its transformation does not depend on the training data: it applies a fixed, element-wise mathematical function (in this case, ReLU) to its input. That is why there is nothing for it to learn during training.
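
A minimal sketch that makes this visible in the summary (the input size and layer width here are arbitrary assumptions, not taken from your model):

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    layers.Dense(8),            # learned weights + biases: 4*8 + 8 = 40 params
    layers.Activation('relu'),  # fixed function, 0 params
])
model.summary()  # Param # column: Dense 40, Activation 0
```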

Here’s what each layer in your model does (a sketch assembling them follows the list):

  • Conv2D: This is a convolutional layer. It applies a convolution operation to the input, passing the result to the next layer. It has trainable weights and biases.

  • BatchNormalization: This layer normalizes the activations of the previous layer, reducing the amount by which the hidden unit values shift around (internal covariate shift). It has trainable parameters (gamma and beta) plus non-trainable moving mean and variance, all of which count toward the Param # column.

  • Activation('relu'): This layer applies the ReLU activation function to the output of the previous layer. It does not have any trainable parameters.

  • MaxPool2D: This layer applies max pooling to the input, reducing its dimensionality. It does not have any trainable parameters.

  • Dropout: This layer applies dropout to the input, setting a fraction of the input units to 0 at each update during training time, which helps prevent overfitting. It does not have any trainable parameters.

  • Flatten: This layer flattens the input into a 1-D vector. It does not have any trainable parameters.

  • Dense: This is a fully connected layer. It has trainable weights and biases.
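
Putting these together, here is a minimal sketch of a model built from the layer types above. The input shape (28, 28, 1) and the layer sizes are illustrative assumptions, not your actual model; the point is the per-layer Param # each one reports:

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), padding='same'),  # (3*3*1 + 1) * 32 = 320 params
    layers.BatchNormalization(),                # 4 * 32 = 128 params (gamma, beta, moving stats)
    layers.Activation('relu'),                  # 0 params
    layers.MaxPool2D((2, 2)),                   # 0 params
    layers.Dropout(0.25),                       # 0 params
    layers.Flatten(),                           # 0 params; output is 14*14*32 = 6272
    layers.Dense(10),                           # (6272 + 1) * 10 = 62,730 params
])
model.summary()
```

Note that only Conv2D, BatchNormalization, and Dense contribute anything to the parameter count; every other row in the summary shows 0.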

So, the Activation layer has 0 parameters because it does not learn any weights or biases during training. It simply applies the ReLU function to its input.
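
Since ReLU is just the fixed element-wise function max(0, x), you can verify this directly (assuming the TensorFlow backend):

```python
import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 1.5])
relu = tf.keras.layers.Activation('relu')
print(relu(x).numpy())  # [0.  0.  0.  1.5] -- negatives clamped to zero
```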
