An Example of Bagging and Adaboost with Pruning Classifiers

This article introduces the application of two ensemble learning methods, Bagging and Adaboost, to pruning classifiers. Bagging draws multiple subsets from the original sample set, trains a classifier on each, and averages them to strengthen the overall classification. Adaboost instead dynamically adjusts sample weights, progressively emphasizing the samples with high error rates in order to build a strong classifier.


Bagging (Bootstrap Aggregation)

Pruning classification is one of the simplest classification algorithms: a single pruning classifier is just an if-then rule. However, by aggregating many prunings we can create a powerful classifier.
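The if-then rule above can be sketched as a decision stump in Python/NumPy (the function name and interface here are illustrative, not from this article):

```python
import numpy as np

def stump_predict(x, d, c, s):
    """A pruning (decision stump) classifier: an if-then rule that
    thresholds feature d at c and returns labels in {-1, +1}.
    s in {-1, +1} chooses which side is labeled positive."""
    return np.where(x[:, d] > c, s, -s)

# toy usage: threshold the first feature at 0
x = np.array([[0.5, 1.0], [2.0, -1.0], [-1.0, 0.0]])
labels = stump_predict(x, d=0, c=0.0, s=1)
```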

The bagging procedure based on pruning classifiers is simple but not trivial:

1. For $j = 1, \dots, b$:
   1. Draw $m$ samples, with replacement, from the original set of $n$ samples $\{(x_i, y_i)\}_{i=1}^{n}$. This gives a new (bootstrap) sample set.
   2. Train a pruning classifier $\psi_j$ on this new sample set.
2. Average all of the pruning classifiers $\{\psi_j\}_{j=1}^{b}$ to obtain $f$:
   $$f(x) \leftarrow \frac{1}{b}\sum_{j=1}^{b}\psi_j(x)$$
n=50; x=randn(n,2);              % n samples in two dimensions
y=2*(x(:,1)>x(:,2))-1;           % labels in {-1,+1}: +1 above the diagonal
b=5000; a=50; Y=zeros(a,a);      % b bagging rounds, a-by-a evaluation grid
X0=linspace(-3,3,a);
[X(:,:,1), X(:,:,2)]=meshgrid(X0);   % grid on which f is evaluated
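The same bagging procedure can be sketched in Python/NumPy, mirroring the setup of the MATLAB demo above (decision stumps as the pruning classifiers, bootstrap samples of size m = n); all function names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_stump(x, y):
    """Brute-force fit of a pruning (decision stump) classifier: try
    every feature d, threshold c, and sign s; keep the triple with the
    fewest training errors."""
    best, best_err = (0, 0.0, 1), len(y) + 1
    for d in range(x.shape[1]):
        for c in x[:, d]:
            for s in (1, -1):
                err = np.sum(np.where(x[:, d] > c, s, -s) != y)
                if err < best_err:
                    best_err, best = err, (d, c, s)
    return best

def bag_train(x, y, b, m):
    """Step 1: for j = 1..b, draw m samples with replacement and train
    one pruning classifier on each bootstrap sample."""
    n = len(y)
    stumps = []
    for _ in range(b):
        idx = rng.integers(0, n, size=m)   # bootstrap: sample with replacement
        stumps.append(train_stump(x[idx], y[idx]))
    return stumps

def bag_predict(stumps, x):
    """Step 2: f(x) = (1/b) * sum_j psi_j(x); its sign gives the label."""
    f = np.mean([np.where(x[:, d] > c, s, -s) for d, c, s in stumps], axis=0)
    return np.sign(f)

# toy data as in the MATLAB demo: labels +1 above the diagonal
x = rng.standard_normal((50, 2))
y = np.where(x[:, 0] > x[:, 1], 1, -1)
stumps = bag_train(x, y, b=101, m=50)      # odd b so the vote never ties
acc = np.mean(bag_predict(stumps, x) == y)
```

Individual stumps can only draw axis-aligned boundaries, so none of them fits the diagonal decision boundary well, but the averaged vote approximates it with a staircase.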