Bagging (Bootstrap Aggregation)
Pruning classification is one of the simplest classification algorithms: it works just like a single if-then rule. However, by aggregating many pruning classifiers we can create a powerful classifier.
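As a minimal sketch of the "if-then" idea in plain Python (the function name and parameters are my own, not from the original MATLAB demo): a single classifier of this kind makes one comparison on one input feature and outputs a label of +1 or -1.

```python
def stump(x, dim=0, threshold=0.0, sign=1):
    """A single if-then ("pruning") classifier: one comparison on one feature."""
    # If the chosen feature exceeds the threshold, predict `sign`; otherwise -`sign`.
    return sign if x[dim] > threshold else -sign

print(stump([0.5, -1.2]))   # feature 0 is above 0.0, so predicts +1
print(stump([-0.3, 2.0]))   # feature 0 is below 0.0, so predicts -1
```

On its own such a classifier is weak; bagging, described next, combines many of them.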
The process of bagging based on pruning classifiers is simple yet effective:
- For $j = 1, \dots, b$:
  1. Draw $m$ samples, with replacement, from the training set $\{(x_i, y_i)\}_{i=1}^{n}$ of $n$ samples. This gives a new (bootstrap) sample set.
  2. Train a pruning classifier $\psi_j$ on this new sample set.
- Average all of the pruning classifiers $\{\psi_j\}_{j=1}^{b}$ to obtain the final classifier $f$:

$$f(x) \leftarrow \frac{1}{b} \sum_{j=1}^{b} \psi_j(x)$$
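The steps above can be sketched in Python as follows (a hedged re-implementation with my own helper names, not the original MATLAB code; the toy data mirrors the demo below, with labels determined by whether $x_1 > x_2$):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_stump(x, y):
    """Fit one pruning (single-threshold) classifier by exhaustive search
    over (feature dimension, threshold, sign), minimizing training error."""
    best, best_err = None, np.inf
    for d in range(x.shape[1]):
        for c in x[:, d]:                      # candidate thresholds
            for s in (1, -1):
                err = np.mean(s * np.sign(x[:, d] - c) != y)
                if err < best_err:
                    best, best_err = (d, c, s), err
    return best

# Toy data: n 2-D Gaussian inputs, label +1 when x1 > x2, else -1.
n = 50
x = rng.standard_normal((n, 2))
y = np.where(x[:, 0] > x[:, 1], 1, -1)

# Bagging: b bootstrap resamples (repeats allowed), one classifier each.
b = 100
stumps = []
for _ in range(b):
    r = rng.integers(0, n, n)                  # resample n indices with replacement
    stumps.append(train_stump(x[r], y[r]))

def f(xq):
    """Average the b classifier outputs, then take the sign."""
    votes = sum(s * np.sign(xq[:, d] - c) for d, c, s in stumps)
    return np.sign(votes / b)

acc = np.mean(f(x) == y)                       # training accuracy of the ensemble
```

Each individual classifier is only a crude axis-aligned split, but the averaged ensemble approximates the diagonal decision boundary well.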
```matlab
n=50; x=randn(n,2);          % n random 2-D training inputs
y=2*(x(:,1)>x(:,2))-1;       % labels: +1 above the diagonal, -1 below
b=5000; a=50; Y=zeros(a,a);  % b bootstrap rounds, a-by-a evaluation grid
X0=linspace(-3,3,a);
[X(:,:,1), X(:,:,2)]=meshgrid(X0);  % grid for plotting the decision boundary
```