Regression Models and House Price Prediction

This post uses the Boston housing dataset bundled with scikit-learn to predict house prices with linear and polynomial regression. We first load the data and display it with pandas, then fit a linear regression model and plot the scatter points together with the fitted line. Finally, we augment the model with polynomial feature expansion, refit, and show the improved predictions.


from sklearn.datasets import load_boston
import pandas as pd

# Note: load_boston was deprecated in scikit-learn 1.0 and removed in 1.2,
# so this example requires scikit-learn < 1.2.
boston = load_boston()
boston.keys()

dict_keys(['data', 'target', 'feature_names', 'DESCR'])

pd.DataFrame(boston.data)

           0     1      2    3      4      5     6       7     8      9     10      11     12
0    0.00632  18.0   2.31  0.0  0.538  6.575  65.2  4.0900   1.0  296.0  15.3  396.90   4.98
1    0.02731   0.0   7.07  0.0  0.469  6.421  78.9  4.9671   2.0  242.0  17.8  396.90   9.14
2    0.02729   0.0   7.07  0.0  0.469  7.185  61.1  4.9671   2.0  242.0  17.8  392.83   4.03
3    0.03237   0.0   2.18  0.0  0.458  6.998  45.8  6.0622   3.0  222.0  18.7  394.63   2.94
4    0.06905   0.0   2.18  0.0  0.458  7.147  54.2  6.0622   3.0  222.0  18.7  396.90   5.33
..       ...   ...    ...  ...    ...    ...   ...     ...   ...    ...   ...     ...    ...
505  0.04741   0.0  11.93  0.0  0.573  6.030  80.8  2.5050   1.0  273.0  21.0  396.90   7.88

506 rows × 13 columns

data = boston.data
x = data[:, 6]   # column 6 is AGE: proportion of owner-occupied units built before 1940
y = boston.target

from sklearn.linear_model import LinearRegression
LineR = LinearRegression()
LineR.fit(x.reshape(-1, 1), y)   # fit expects a 2-D feature array
w = LineR.coef_
b = LineR.intercept_
print(w, b)
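As a sanity check on the slope and intercept above, the same least-squares solution can be obtained in closed form with plain NumPy. The sketch below uses synthetic data standing in for the AGE column (the values and noise level are assumptions, not the real dataset), and confirms that `np.linalg.lstsq` and `np.polyfit` agree:

```python
import numpy as np

# Synthetic stand-in for the AGE column and target (assumed data)
rng = np.random.default_rng(0)
xs = rng.uniform(0, 100, size=200)
ys = 30.0 - 0.12 * xs + rng.normal(0.0, 2.0, size=200)

# Closed-form least squares: append a column of ones for the intercept
X = np.column_stack([xs, np.ones_like(xs)])
w_ls, b_ls = np.linalg.lstsq(X, ys, rcond=None)[0]

# np.polyfit(deg=1) solves the same minimization
w_pf, b_pf = np.polyfit(xs, ys, deg=1)
print(w_ls, b_ls)
```

Both routes minimize the same squared error, so the coefficients match to floating-point precision.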

import matplotlib.pyplot as pl
pl.scatter(x, y)
pl.plot(x, w * x + b, 'g')   # fitted line y = w*x + b, in green
pl.show()
 
[-0.12316272] 30.97867776261804

(Figure: scatter of AGE vs. house price with the fitted regression line in green)
xx=data[:,6].reshape(-1,1)
pl.scatter(xx,y)
pl.show()

from sklearn.preprocessing import PolynomialFeatures
p = PolynomialFeatures()   # degree=2 by default: expands x into [1, x, x**2]
p.fit(xx)
p.transform(xx)

 
array([[1.00000e+00, 6.52000e+01, 4.25104e+03],
       [1.00000e+00, 7.89000e+01, 6.22521e+03],
       [1.00000e+00, 6.11000e+01, 3.73321e+03],
       ...,
       [1.00000e+00, 9.10000e+01, 8.28100e+03],
       [1.00000e+00, 8.93000e+01, 7.97449e+03],
       [1.00000e+00, 8.08000e+01, 6.52864e+03]])
x_poly=p.transform(xx)
x_poly
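The expansion above is easy to verify by hand: with the default `degree=2`, `PolynomialFeatures` turns a single column x into [1, x, x**2]. A NumPy-only sketch of the same transform on the first three AGE values:

```python
import numpy as np

xx = np.array([[65.2], [78.9], [61.1]])  # first three AGE values from the table above

# Equivalent of PolynomialFeatures(degree=2).fit_transform(xx) for one column
x_poly = np.hstack([np.ones_like(xx), xx, xx ** 2])
print(x_poly)
```

The squared column reproduces the 4.25104e+03, 6.22521e+03, ... entries printed above.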

lrp = LinearRegression()
lrp.fit(x_poly, y)
y_poly = lrp.predict(x_poly)
pl.scatter(xx, y)
pl.plot(xx, w * xx + b, 'r')   # linear fit, in red, for comparison
pl.scatter(xx, y_poly)         # quadratic predictions (drawn as points since xx is unsorted)
pl.show()
lrp.coef_

(Figure: scatter of the data, the linear fit in red, and the quadratic model's predictions)
array([ 0.        ,  0.06919309, -0.00159822])
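Note the leading 0.0 in `lrp.coef_`: the bias column produced by `PolynomialFeatures` is redundant because `LinearRegression` fits its own intercept, so it receives zero weight. The same degree-2 fit can be done in one step with `np.polyfit`; the sketch below uses synthetic data with an assumed curve shaped roughly like the fit above (positive linear term, small negative quadratic term):

```python
import numpy as np

rng = np.random.default_rng(1)
xs = rng.uniform(0, 100, size=300)
# Assumed true curve, roughly matching the coefficients printed above
ys = 20.0 + 0.07 * xs - 0.0016 * xs ** 2 + rng.normal(0.0, 1.0, size=300)

# polyfit returns coefficients highest power first: [c2, c1, c0]
c2, c1, c0 = np.polyfit(xs, ys, deg=2)
print(c2, c1, c0)
```

A negative quadratic coefficient means the fitted curve bends downward for large x, which the quadratic model can capture and the straight line cannot.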

Reposted from: https://www.cnblogs.com/lbjdaxiong/p/10095547.html
