Reposted from: http://blog.youkuaiyun.com/lanxuecc/article/details/52712085
The OpenCV source for training an AdaBoost strong classifier.
The basic flow of AdaBoost strong classifier training:
1. Initialize the class labels and the weight distribution of the training samples.
2. Iterate, training one weak classifier per round.
3. Combine the weak classifier trained in each round with the already existing weak classifiers to form the strong classifier.
4. Compute the confidence of the positive samples under the current strong classifier, and derive the stage threshold from the input parameter minhitrate.
5. Classify the negative samples with the current strong classifier and the threshold from the previous step; if the resulting false alarm rate is below maxfalsealarm, exit the loop; the strong classifier is trained.
Below, combining the general AdaBoost theory with the OpenCV source, I record my detailed understanding of the AdaBoost training flow:
Given a training data set $T = \{(x_1, y_1), (x_2, y_2), \dots, (x_N, y_N)\}$, where $y_i \in \{-1, +1\}$ is the label that marks a sample as positive or negative; by convention $-1$ denotes a negative sample and $+1$ a positive one. In face classifier training, $x_i$ is the value of some Haar feature computed on the positive (face) and negative (non-face) sample images. The goal of AdaBoost is to learn a series of weak classifiers (a feature description plus a feature threshold) from the training set $T$ and combine them into a strong classifier (weighted weak classifiers plus a stage threshold) that classifies the $x_i$ as accurately as possible, so that it can decide whether a new sample $x$ is positive or negative.
In OpenCV, the function that trains a strong classifier is icvCreateCARTStageClassifier. Note that OpenCV provides four boosting algorithms for this: Discrete AdaBoost (DAB), Real AdaBoost (RAB), Logit Boost (LB) and Gentle AdaBoost (GAB). GAB is the one used most often in practice, and also the simplest; a comparison of the four variants can be found here: http://www.cnblogs.com/jcchen1987/p/4581651.html
Here is a walkthrough of the DAB algorithm:
1. First, initialize the weight distribution of the training data. Every training sample is initially given the same weight, $1/N$:

$$D_1 = (w_{11}, \dots, w_{1i}, \dots, w_{1N}), \quad w_{1i} = \frac{1}{N}, \quad i = 1, 2, \dots, N$$
In the OpenCV source, the function cvBoostStartTraining initializes the weights of the training sample set and assigns each sample its training label.
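What this initialization amounts to is just the following (a minimal plain-C sketch with hypothetical names, not the OpenCV API):

#include <stddef.h>

/* Step 1: give every training sample the same starting weight 1/N. */
static void init_weights( float* weights, size_t num_samples )
{
    size_t i;
    for( i = 0; i < num_samples; i++ )
        weights[i] = 1.0f / (float) num_samples;
}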
2. Enter the loop and run multiple iterations; denote the current iteration by m.
A. cvTrimWeights discards low-weight samples: the available samples are sorted by weight, and the samples whose weights make up the top weightfraction share of the total weight are kept and used to train the next weak classifier, as sketched below.
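The idea of the trimming can be sketched as follows (plain C, assuming the weights are already sorted in descending order; this illustrates the idea only, not the exact cvTrimWeights implementation):

#include <stddef.h>

/* Keep the highest-weight samples until their cumulative weight reaches
   weightfraction of the total weight; returns how many samples to keep.
   Assumes weights[] is sorted in descending order. */
static size_t trim_weights( const float* weights, size_t n, float weightfraction )
{
    float total = 0.0f, acc = 0.0f;
    size_t i, kept = n;
    for( i = 0; i < n; i++ )
        total += weights[i];
    for( i = 0; i < n; i++ )
    {
        acc += weights[i];
        if( acc >= weightfraction * total )
        {
            kept = i + 1;
            break;
        }
    }
    return kept;
}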
B. Using the weight distribution $D_m$ over the samples kept in the previous step, learn a weak classifier; this is implemented in cvCreateCARTClassifier. Written as a formula, the weak classifier is

$$G_m(x): \mathcal{X} \rightarrow \{-1, +1\}$$

For the details of how the weak classifier is trained, see the earlier notes on cvCreateCARTClassifier.
C. Use the weak classifier $G_m(x)$ trained in the previous step to compute each sample's classification confidence (the actual call is icvEvalCARTHaarClassifier; the source reaches it through a function pointer, which makes it a little hard to find, see the comments in the code pasted below). The per-sample confidences are passed to cvBoostNextWeakClassifier, which computes the classification error $e_m$ of $G_m(x)$ on the data set. The four training algorithms compute this differently; for DAB it is implemented in icvBoostNextWeakClassifierDAB as

$$e_m = \sum_{i=1}^{N} P(G_m(x_i) \neq y_i) = \sum_{i=1}^{N} w_{mi} \, I(G_m(x_i) \neq y_i)$$
D. Immediately afterwards, icvBoostNextWeakClassifierDAB computes the importance of the current weak classifier $G_m(x)$ within the strong classifier:

$$\alpha_m = \frac{1}{2} \ln \frac{1 - e_m}{e_m}$$

This formula means that the smaller a weak classifier's classification error, the larger its weight in the strong classifier, and the bigger its influence.
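Steps C and D under DAB boil down to the following (a plain-C sketch with hypothetical names and signatures, not those of icvBoostNextWeakClassifierDAB):

#include <math.h>
#include <stddef.h>

/* Weighted error e_m of the weak classifier and its voting weight alpha_m.
   pred[i] and labels[i] are in {-1,+1}; weights[] sums to 1; assumes 0 < e_m < 1. */
static float dab_alpha( const int* pred, const int* labels,
                        const float* weights, size_t n, float* out_em )
{
    float em = 0.0f;
    size_t i;
    for( i = 0; i < n; i++ )
        if( pred[i] != labels[i] )
            em += weights[i];                 /* e_m: total weight of mistakes */
    *out_em = em;
    return 0.5f * logf( (1.0f - em) / em );   /* alpha_m = (1/2) ln((1-e_m)/e_m) */
}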
E. icvBoostNextWeakClassifierDAB then iterates over all samples and updates the weight distribution of the training set, producing $D_{m+1}$ for training the weak classifier of the next round:

$$D_{m+1} = (w_{m+1,1}, \dots, w_{m+1,i}, \dots, w_{m+1,N}), \quad w_{m+1,i} = \frac{w_{mi}}{Z_m} \exp(-\alpha_m y_i G_m(x_i))$$

where $Z_m$ is a normalization factor:

$$Z_m = \sum_{i=1}^{N} w_{mi} \exp(-\alpha_m y_i G_m(x_i))$$

In plain terms: walk over every sample, update its weight $w_{m+1,i} = w_{mi} \cdot \exp(\dots)$, accumulate the updated weights to obtain $Z_m$, and then normalize. The formula shows that when $G_m(x_i)$ classifies a sample correctly, the resulting $w_{m+1,i}$ is smaller than $w_{mi}$, and when it classifies the sample incorrectly, $w_{m+1,i}$ is larger than $w_{mi}$. So the weights of the samples classified correctly in this round are decreased and the weights of the misclassified samples are increased, which makes the next round of weak classifier training focus more on the samples this round got wrong.
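The weight update of step E, sketched the same way (plain C, hypothetical names):

#include <math.h>
#include <stddef.h>

/* Step E: reweight every sample and renormalize by Z_m. A correctly
   classified sample (y_i * G_m(x_i) = +1) is multiplied by exp(-alpha) < 1,
   a misclassified one (y_i * G_m(x_i) = -1) by exp(+alpha) > 1. */
static void dab_update_weights( float* weights, const int* pred,
                                const int* labels, size_t n, float alpha )
{
    float zm = 0.0f;
    size_t i;
    for( i = 0; i < n; i++ )
    {
        weights[i] *= expf( -alpha * (float) (labels[i] * pred[i]) );
        zm += weights[i];
    }
    for( i = 0; i < n; i++ )
        weights[i] /= zm;                     /* normalize: weights sum to 1 again */
}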
F. After returning from icvBoostNextWeakClassifierDAB with the weight of the current weak classifier, the weak classifier is added to the current strong classifier in the following form:

$$f(x) = \sum_{m=1}^{M} \alpha_m G_m(x)$$
G. Next, icvCreateCARTStageClassifier handles positive and negative samples separately. It walks over all positive samples, evaluates every weak classifier trained so far in the current strong classifier on each positive sample, and accumulates the confidences into a per-sample sum. These confidence sums are sorted, and the threshold of the current strong classifier is chosen from them according to minhitrate.
H. It then walks over all negative samples and computes each negative sample's confidence sum under the weak classifiers trained so far, then uses the threshold computed above to classify each negative sample. Counting the negatives classified as positive gives the false alarm rate falsealarm; if it is smaller than the input parameter maxfalsealarm, the iteration loop exits and the current strong classifier is trained. Both steps are sketched below.
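Steps G and H reduce to the following (plain-C sketch; pos_conf/neg_conf hold the per-sample stage confidence sums and are hypothetical names):

#include <stdlib.h>

static int cmp_float( const void* a, const void* b )
{
    float fa = *(const float*) a, fb = *(const float*) b;
    return (fa > fb) - (fa < fb);
}

/* Step G: sort the positive confidences ascending and place the threshold so
   that at least minhitrate of the positives score at or above it. */
static float stage_threshold( float* pos_conf, size_t numpos, float minhitrate )
{
    qsort( pos_conf, numpos, sizeof( float ), cmp_float );
    return pos_conf[(size_t) ((1.0f - minhitrate) * numpos)];
}

/* Step H: false alarm rate = fraction of negatives passing the threshold. */
static float stage_falsealarm( const float* neg_conf, size_t numneg, float threshold )
{
    size_t i, numfalse = 0;
    for( i = 0; i < numneg; i++ )
        if( neg_conf[i] >= threshold )
            numfalse++;
    return (float) numfalse / (float) numneg;
}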
Below is the source code with comments (the theory is easier to follow with the code side by side):
static
CvIntHaarClassifier* icvCreateCARTStageClassifier( CvHaarTrainingData* data,
CvMat* sampleIdx,
CvIntHaarFeatures* haarFeatures,
float minhitrate,
float maxfalsealarm,
int symmetric,
float weightfraction,
int numsplits,
CvBoostType boosttype,
CvStumpError stumperror,
int maxsplits )
{
#ifdef CV_COL_ARRANGEMENT
int flags = CV_COL_SAMPLE;
#else
int flags = CV_ROW_SAMPLE;
#endif
CvStageHaarClassifier* stage = NULL;
CvBoostTrainer* trainer;
CvCARTClassifier* cart = NULL;
CvCARTTrainParams trainParams;
CvMTStumpTrainParams stumpTrainParams;
CvMat eval;
int n = 0;
int m = 0;
int numpos = 0;
int numneg = 0;
int numfalse = 0;
float sum_stage = 0.0F;
float threshold = 0.0F;
float falsealarm = 0.0F;
CvMat* trimmedIdx;
CvUserdata userdata;
int i = 0;
int j = 0;
int idx;
int numsamples;
int numtrimmed;
CvCARTHaarClassifier* classifier;
CvSeq* seq = NULL;
CvMemStorage* storage = NULL;
CvMat* weakTrainVals;
float alpha;
float sumalpha;
int num_splits;
#ifdef CV_VERBOSE
printf( "+----+----+-+---------+---------+---------+---------+\n" );
printf( "| N |%%SMP|F| ST.THR | HR | FA | EXP. ERR|\n" );
printf( "+----+----+-+---------+---------+---------+---------+\n" );
#endif /* CV_VERBOSE */
n = haarFeatures->count;
m = data->sum.rows;
numsamples = (sampleIdx) ? MAX( sampleIdx->rows, sampleIdx->cols ) : m;
userdata = cvUserdata( data, haarFeatures );
stumpTrainParams.type = ( boosttype == CV_DABCLASS )
? CV_CLASSIFICATION_CLASS : CV_REGRESSION;
stumpTrainParams.error = ( boosttype == CV_LBCLASS || boosttype == CV_GABCLASS )
? CV_SQUARE : stumperror;
stumpTrainParams.portion = CV_STUMP_TRAIN_PORTION;
stumpTrainParams.getTrainData = icvGetTrainingDataCallback;
stumpTrainParams.numcomp = n;
stumpTrainParams.userdata = &userdata;
stumpTrainParams.sortedIdx = data->idxcache;
trainParams.count = numsplits;
trainParams.stumpTrainParams = (CvClassifierTrainParams*) &stumpTrainParams;
trainParams.stumpConstructor = cvCreateMTStumpClassifier;
trainParams.splitIdx = icvSplitIndicesCallback;
trainParams.userdata = &userdata;
eval = cvMat( 1, m, CV_32FC1, cvAlloc( sizeof( float ) * m ) );
storage = cvCreateMemStorage();
seq = cvCreateSeq( 0, sizeof( *seq ), sizeof( classifier ), storage );
weakTrainVals = cvCreateMat( 1, m, CV_32FC1 );
/* step 1: initialize the sample weights (1/N each) and the per-sample
   training targets according to the chosen boosting type */
trainer = cvBoostStartTraining( &data->cls, weakTrainVals, &data->weights,
sampleIdx, boosttype );
num_splits = 0;
sumalpha = 0.0F;
do
{
#ifdef CV_VERBOSE
int v_wt = 0;
int v_flipped = 0;
#endif /* CV_VERBOSE */
/* step 2A: trim low-weight samples; only the kept samples train the next
   weak classifier */
trimmedIdx = cvTrimWeights( &data->weights, sampleIdx, weightfraction );
numtrimmed = (trimmedIdx) ? MAX( trimmedIdx->rows, trimmedIdx->cols ) : m;
#ifdef CV_VERBOSE
v_wt = 100 * numtrimmed / numsamples;
v_flipped = 0;
#endif /* CV_VERBOSE */
/* step 2B: train a CART weak classifier on the trimmed sample set */
cart = (CvCARTClassifier*) cvCreateCARTClassifier( data->valcache,
flags,
weakTrainVals,
0,
0,
0,
trimmedIdx,
&(data->weights),
(CvClassifierTrainParams*) &trainParams );
classifier = (CvCARTHaarClassifier*) icvCreateCARTHaarClassifier( numsplits );
icvInitCARTHaarClassifier( classifier, cart, haarFeatures );
num_splits += classifier->count;
cart->release( (CvClassifier**) &cart );
/* when training a symmetric object detector, every other weak classifier is
   trained on horizontally flipped features */
if( symmetric && (seq->total % 2) )
{
float normfactor = 0.0F;
CvStumpClassifier* stump;
for( i = 0; i < classifier->count; i++ )
{
if( classifier->feature[i].desc[0] == 'h' )
{
for( j = 0; j < CV_HAAR_FEATURE_MAX &&
classifier->feature[i].rect[j].weight != 0.0F; j++ )
{
classifier->feature[i].rect[j].r.x = data->winsize.width -
classifier->feature[i].rect[j].r.x -
classifier->feature[i].rect[j].r.width;
}
}
else
{
int tmp = 0;
for( j = 0; j < CV_HAAR_FEATURE_MAX &&
classifier->feature[i].rect[j].weight != 0.0F; j++ )
{
classifier->feature[i].rect[j].r.x = data->winsize.width -
classifier->feature[i].rect[j].r.x;
CV_SWAP( classifier->feature[i].rect[j].r.width,
classifier->feature[i].rect[j].r.height, tmp );
}
}
}
icvConvertToFastHaarFeature( classifier->feature,
classifier->fastfeature,
classifier->count, data->winsize.width + 1 );
stumpTrainParams.getTrainData = NULL;
stumpTrainParams.numcomp = 1;
stumpTrainParams.userdata = NULL;
stumpTrainParams.sortedIdx = NULL;
for( i = 0; i < classifier->count; i++ )
{
for( j = 0; j < numtrimmed; j++ )
{
idx = icvGetIdxAt( trimmedIdx, j );
eval.data.fl[idx] = cvEvalFastHaarFeature( &classifier->fastfeature[i],
(sum_type*) (data->sum.data.ptr + idx * data->sum.step),
(sum_type*) (data->tilted.data.ptr + idx * data->tilted.step) );
normfactor = data->normfactor.data.fl[idx];
eval.data.fl[idx] = ( normfactor == 0.0F )
? 0.0F : (eval.data.fl[idx] / normfactor);
}
stump = (CvStumpClassifier*) trainParams.stumpConstructor( &eval,
CV_COL_SAMPLE,
weakTrainVals, 0, 0, 0, trimmedIdx,
&(data->weights),
trainParams.stumpTrainParams );
classifier->threshold[i] = stump->threshold;
if( classifier->left[i] <= 0 )
{
classifier->val[-classifier->left[i]] = stump->left;
}
if( classifier->right[i] <= 0 )
{
classifier->val[-classifier->right[i]] = stump->right;
}
stump->release( (CvClassifier**) &stump );
}
stumpTrainParams.getTrainData = icvGetTrainingDataCallback;
stumpTrainParams.numcomp = n;
stumpTrainParams.userdata = &userdata;
stumpTrainParams.sortedIdx = data->idxcache;
#ifdef CV_VERBOSE
v_flipped = 1;
#endif /* CV_VERBOSE */
}
if( trimmedIdx != sampleIdx )
{
cvReleaseMat( &trimmedIdx );
trimmedIdx = NULL;
}
/* step 2C: evaluate the newly trained weak classifier on every sample
   (classifier->eval is the function pointer to icvEvalCARTHaarClassifier) */
for( i = 0; i < numsamples; i++ )
{
idx = icvGetIdxAt( sampleIdx, i );
eval.data.fl[idx] = classifier->eval( (CvIntHaarClassifier*) classifier,
(sum_type*) (data->sum.data.ptr + idx * data->sum.step),
(sum_type*) (data->tilted.data.ptr + idx * data->tilted.step),
data->normfactor.data.fl[idx] );
}
/* steps 2C-2E: from the per-sample outputs compute the error e_m of the weak
   classifier, its weight alpha, and the updated sample weights
   (for DAB this is icvBoostNextWeakClassifierDAB) */
alpha = cvBoostNextWeakClassifier( &eval, &data->cls, weakTrainVals,
&data->weights, trainer );
sumalpha += alpha;
/* step 2F: scale the weak classifier's leaf values by alpha before it joins
   the stage (for RAB the values are first passed through cvLogRatio) */
for( i = 0; i <= classifier->count; i++ )
{
if( boosttype == CV_RABCLASS )
{
classifier->val[i] = cvLogRatio( classifier->val[i] );
}
classifier->val[i] *= alpha;
}
cvSeqPush( seq, (void*) &classifier );
/* step 2G: accumulate the stage confidence sum of every positive sample */
numpos = 0;
for( i = 0; i < numsamples; i++ )
{
idx = icvGetIdxAt( sampleIdx, i );
if( data->cls.data.fl[idx] == 1.0F )
{
eval.data.fl[numpos] = 0.0F;
for( j = 0; j < seq->total; j++ )
{
classifier = *((CvCARTHaarClassifier**) cvGetSeqElem( seq, j ));
eval.data.fl[numpos] += classifier->eval((CvIntHaarClassifier*) classifier,
(sum_type*) (data->sum.data.ptr + idx * data->sum.step),
(sum_type*) (data->tilted.data.ptr + idx * data->tilted.step),
data->normfactor.data.fl[idx] );
}
numpos++;
}
}
/* sort the positive confidence sums and place the stage threshold so that
   at least minhitrate of the positives score above it */
icvSort_32f( eval.data.fl, numpos, 0 );
threshold = eval.data.fl[(int) ((1.0F - minhitrate) * numpos)];
/* step 2H: count the negatives passing the threshold -> false alarm rate */
numneg = 0;
numfalse = 0;
for( i = 0; i < numsamples; i++ )
{
idx = icvGetIdxAt( sampleIdx, i );
if( data->cls.data.fl[idx] == 0.0F )
{
numneg++;
sum_stage = 0.0F;
for( j = 0; j < seq->total; j++ )
{
classifier = *((CvCARTHaarClassifier**) cvGetSeqElem( seq, j ));
sum_stage += classifier->eval( (CvIntHaarClassifier*) classifier,
(sum_type*) (data->sum.data.ptr + idx * data->sum.step),
(sum_type*) (data->tilted.data.ptr + idx * data->tilted.step),
data->normfactor.data.fl[idx] );
}
if( sum_stage >= (threshold - CV_THRESHOLD_EPS) )
{
numfalse++;
}
}
}
falsealarm = ((float) numfalse) / ((float) numneg);
#ifdef CV_VERBOSE
{
float v_hitrate = 0.0F;
float v_falsealarm = 0.0F;
float v_experr = 0.0F;
for( i = 0; i < numsamples; i++ )
{
idx = icvGetIdxAt( sampleIdx, i );
sum_stage = 0.0F;
for( j = 0; j < seq->total; j++ )
{
classifier = *((CvCARTHaarClassifier**) cvGetSeqElem( seq, j ));
sum_stage += classifier->eval( (CvIntHaarClassifier*) classifier,
(sum_type*) (data->sum.data.ptr + idx * data->sum.step),
(sum_type*) (data->tilted.data.ptr + idx * data->tilted.step),
data->normfactor.data.fl[idx] );
}
if( sum_stage >= (threshold - CV_THRESHOLD_EPS) )
{
if( data->cls.data.fl[idx] == 1.0F )
{
v_hitrate += 1.0F;
}
else
{
v_falsealarm += 1.0F;
}
}
if( ( sum_stage >= 0.0F ) != (data->cls.data.fl[idx] == 1.0F) )
{
v_experr += 1.0F;
}
}
v_experr /= numsamples;
printf( "|%4d|%3d%%|%c|%9f|%9f|%9f|%9f|\n",
seq->total, v_wt, ( (v_flipped) ? '+' : '-' ),
threshold, v_hitrate / numpos, v_falsealarm / numneg,
v_experr );
printf( "+----+----+-+---------+---------+---------+---------+\n" );
fflush( stdout );
}
#endif /* CV_VERBOSE */
/* keep adding weak classifiers until the stage false alarm rate meets
   maxfalsealarm or the split budget maxsplits is exhausted */
} while( falsealarm > maxfalsealarm && (!maxsplits || (num_splits < maxsplits) ) );
cvBoostEndTraining( &trainer );
if( falsealarm > maxfalsealarm )
{
stage = NULL;
}
else
{
stage = (CvStageHaarClassifier*) icvCreateStageHaarClassifier( seq->total, threshold );
cvCvtSeqToArray( seq, (CvArr*) stage->classifier );
}
cvReleaseMemStorage( &storage );
cvReleaseMat( &weakTrainVals );
cvFree( &(eval.data.ptr) );
return (CvIntHaarClassifier*) stage;
}
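Finally, to see the whole DAB loop run end to end, here is a self-contained toy example I wrote for illustration (1-D data, decision stumps as the weak classifiers; it has nothing to do with the OpenCV API, but follows steps 1 and A to F above exactly):

#include <math.h>
#include <stdio.h>

#define N 8   /* samples */
#define M 3   /* boosting rounds */

/* 1-D decision stump: pol * sign(x - thr) */
typedef struct { float thr; int pol; float alpha; } Stump;

static int stump_eval( const Stump* s, float x )
{
    return s->pol * (x > s->thr ? 1 : -1);
}

int main( void )
{
    float x[N] = { 0.1f, 0.2f, 0.3f, 0.4f, 0.6f, 0.7f, 0.8f, 0.9f };
    int   y[N] = {  -1,   -1,   -1,   +1,   -1,   +1,   +1,   +1  };
    float w[N];
    Stump stages[M];
    int i, m;

    for( i = 0; i < N; i++ ) w[i] = 1.0f / N;        /* step 1: uniform weights */

    for( m = 0; m < M; m++ )
    {
        /* step B: pick the stump (threshold, polarity) with lowest weighted error */
        float best_err = 1.0f;
        Stump best = { 0, 1, 0 };
        int pol, j;
        for( j = 0; j < N; j++ )
            for( pol = -1; pol <= 1; pol += 2 )
            {
                Stump s = { x[j], pol, 0 };
                float err = 0.0f;
                for( i = 0; i < N; i++ )
                    if( stump_eval( &s, x[i] ) != y[i] ) err += w[i];
                if( err < best_err ) { best_err = err; best = s; }
            }
        /* steps C-D: error e_m and weight alpha_m of this weak classifier */
        best.alpha = 0.5f * logf( (1.0f - best_err) / best_err );
        stages[m] = best;
        /* step E: reweight the samples and normalize by Z_m */
        {
            float zm = 0.0f;
            for( i = 0; i < N; i++ )
            {
                w[i] *= expf( -best.alpha * y[i] * stump_eval( &best, x[i] ) );
                zm += w[i];
            }
            for( i = 0; i < N; i++ ) w[i] /= zm;
        }
        printf( "round %d: thr=%.2f pol=%+d err=%.3f alpha=%.3f\n",
                m, best.thr, best.pol, best_err, best.alpha );
    }
    /* step F: the strong classifier is sign(sum_m alpha_m * G_m(x)) */
    for( i = 0; i < N; i++ )
    {
        float f = 0.0f;
        for( m = 0; m < M; m++ )
            f += stages[m].alpha * stump_eval( &stages[m], x[i] );
        printf( "x=%.2f y=%+d f(x)=%+.3f -> %+d\n", x[i], y[i], f, f >= 0 ? 1 : -1 );
    }
    return 0;
}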
Thanks to: http://www.cnblogs.com/harvey888/p/5505511.html