A decision tree can be used for both classification and regression; in this post we test how well it performs as a regressor.
First, let's generate some data:
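Concretely, the code below samples noisy observations of a target function built from three Gaussian bumps centered at x = -3, 0, and 3:

$$f(x) = 0.5\,e^{-(x+3)^2} + e^{-x^2} + 0.5\,e^{-(x-3)^2}, \qquad y = f(x) + \varepsilon, \quad \varepsilon \sim \mathcal{N}(0,\, 0.05^2)$$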
import numpy as np
import matplotlib as mpl
import matplotlib.pyplot as plt
from sklearn.linear_model import RidgeCV
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures

# Use SimHei (a Chinese font) and make sure the minus sign renders correctly
mpl.rcParams['font.sans-serif'] = ['SimHei']
mpl.rcParams['axes.unicode_minus'] = False
def f(x):
    # Target function: three Gaussian bumps centered at x = -3, 0, and 3
    return 0.5 * np.exp(-(x + 3) ** 2) + np.exp(-x ** 2) + 0.5 * np.exp(-(x - 3) ** 2)
np.random.seed(0)
N = 200
x = np.random.rand(N) * 10 - 5  # N samples drawn uniformly from [-5, 5)
x = np.sort(x)
y = f(x) + 0.05 * np.random.randn(N)  # add Gaussian noise with std 0.05
x = x.reshape(-1, 1)  # sklearn expects a 2-D feature matrix of shape (N, 1)
x_test = np.linspace(x.min() - 0.5, x.max() + 0.5, 1000)  # dense grid for plotting
plt.figure(figsize=(12, 8), facecolor='w')
plt.plot(x, y, 'ro', label='training data')
plt.plot(x_test, f(x_test), color='k', lw=3.5, label='ground truth')
plt.legend(loc='upper left')  # without this call, the labels above are never displayed
plt.grid(True)
plt.show()
The resulting plot shows the training points scattered around the true curve:
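As a preview of the actual test, here is a minimal sketch of fitting DecisionTreeRegressor on this data, assuming the x, y, and x_test defined above; the max_depth values are illustrative choices, not fixed by anything earlier:

# Hedged sketch: fit trees of a few depths and compare their predictions
# against the true curve. The depths here are assumptions for illustration.
plt.figure(figsize=(12, 8), facecolor='w')
plt.plot(x, y, 'ro', label='training data')
plt.plot(x_test, f(x_test), color='k', lw=3.5, label='ground truth')
for depth, color in zip([2, 4, 6], 'gbm'):
    reg = DecisionTreeRegressor(max_depth=depth)
    reg.fit(x, y)
    y_hat = reg.predict(x_test.reshape(-1, 1))  # predict expects a 2-D array
    plt.plot(x_test, y_hat, color=color, lw=2, label=f'max_depth={depth}')
plt.legend(loc='upper left')
plt.grid(True)
plt.show()

Shallow trees underfit the three bumps, while deeper trees trace them closely but start chasing the 0.05-std noise; that bias-variance trade-off is exactly what this kind of plot makes visible.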