Copyright notice: this is an original article by the author; reproduction without the author's permission is prohibited. https://blog.youkuaiyun.com/wangkun1340378/article/details/78422926
In Caffe, we sometimes do not want certain layers to be updated during training. How can we do that?
Suppose the network has five convolution layers: conv1, conv2, conv3, conv4, and conv5.
For simplicity, assume these five convolution layers are the whole network, with no pooling or ReLU layers.
A convolution layer is defined as:
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 64
    pad: 1
    kernel_size: 3
    stride: 1
    weight_filler {
      type: "msra"
    }
    bias_filler {
      type: "constant"
    }
  }
}
1. If we want conv1, conv2, and conv3 all frozen, we can add propagate_down: false to the definition of conv4, i.e. change the layer definition to:
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  propagate_down: false
  convolution_param {
    num_output: 64
    pad: 1
    kernel_size: 3
    stride: 1
    weight_filler {
      type: "msra"
    }
    bias_filler {
      type: "constant"
    }
  }
}
With propagate_down: false added, backpropagation stops at conv4: no gradient flows into conv4's bottom blob, so conv1, conv2, and conv3 receive no gradients and are never updated. Note that conv4 itself still updates its own parameters; only the gradient to the layers below it is cut off.
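The mechanism can be sketched in plain NumPy (this is an illustration, not Caffe code; the five scalar "conv" weights are a stand-in for the real layers):

```python
import numpy as np

rng = np.random.default_rng(0)
# Five "conv" layers reduced to scalar weights so the chain rule is easy to see.
weights = {f"conv{i}": rng.normal() for i in range(1, 6)}
lr = 0.1

def train_step(x, stop_at="conv4"):
    """One SGD step; backprop stops at `stop_at` (like propagate_down: false)."""
    # Forward pass: each layer multiplies its input by its weight.
    acts = {"conv0": x}  # conv0 stands in for the data blob
    for i in range(1, 6):
        acts[f"conv{i}"] = weights[f"conv{i}"] * acts[f"conv{i-1}"]
    grad_top = 2.0 * (acts["conv5"] - 1.0)  # d(loss)/d(output) for loss = (y - 1)^2
    # Backward pass, from conv5 downward.
    for i in range(5, 0, -1):
        name = f"conv{i}"
        w_grad = grad_top * acts[f"conv{i-1}"]   # gradient w.r.t. this layer's weight
        grad_bottom = grad_top * weights[name]   # gradient for the bottom blob
        weights[name] -= lr * w_grad
        if name == stop_at:                      # propagate_down: false cuts it here
            break
        grad_top = grad_bottom

before = dict(weights)
train_step(x=0.5)
frozen = [n for n in before if weights[n] == before[n]]    # conv1..conv3 untouched
updated = [n for n in before if weights[n] != before[n]]   # conv4, conv5 changed
```

Because the backward loop breaks at conv4, conv4 and conv5 still take a gradient step, while conv1 through conv3 keep their initial weights.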
2. If we only want conv3 frozen, while conv1, conv2, conv4, and conv5 keep updating, we can set conv3's learning rate to 0, i.e.:
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "conv2"
  top: "conv3"
  param {
    lr_mult: 0
  }
  param {
    lr_mult: 0
  }
  convolution_param {
    num_output: 64
    kernel_size: 3
    stride: 1
    pad: 1
    weight_filler {
      type: "msra"
    }
    bias_filler {
      type: "constant"
    }
  }
}
The first param block applies to the weights and the second to the bias; setting lr_mult to 0 in both freezes the layer.
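In practice, many Caffe examples also set decay_mult: 0 alongside lr_mult: 0, so that weight decay does not alter the frozen parameters either; a sketch of the two param blocks under that convention:

```
param {
  lr_mult: 0
  decay_mult: 0
}
param {
  lr_mult: 0
  decay_mult: 0
}
```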
Reference: https://zhidao.baidu.com/question/363059557656952932.html