SVR forecasts stock opening price

This article examines Support Vector Regression (SVR) and its use for solving regression problems. As an extension of support vector classification, SVR builds its model from only a subset of the training data, because its cost function ignores training points that lie close to the model's prediction. The article also introduces three SVR implementations (SVR, NuSVR, and LinearSVR) and shows by example how to train and predict with an SVR model using the sklearn library.


SVM-Regression
The method of Support Vector Classification can be extended to solve regression problems. This method is called Support Vector Regression.
The model produced by support vector classification (as described above) depends only on a subset of the training data, because the cost function for building the model does not care about training points that lie beyond the margin. Analogously, the model produced by Support Vector Regression depends only on a subset of the training data, because the cost function for building the model ignores any training data close to the model prediction.
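
The ignored region is the epsilon-insensitive tube: residuals smaller than epsilon contribute nothing to the cost, and larger residuals are penalized linearly. A minimal sketch of that loss (the function name and sample values here are illustrative, not part of sklearn):

```python
import numpy as np

def epsilon_insensitive_loss(y_true, y_pred, epsilon=0.1):
    # Zero inside the epsilon tube, linear outside it.
    return np.maximum(0.0, np.abs(y_true - y_pred) - epsilon)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.05, 2.5, 2.0])

# Residuals 0.05, 0.5, 1.0 give losses 0.0, 0.4, 0.9 with epsilon=0.1.
print(epsilon_insensitive_loss(y_true, y_pred))
```
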
There are three different implementations of Support Vector Regression: SVR, NuSVR and LinearSVR. LinearSVR provides a faster implementation than SVR but only considers linear kernels, while NuSVR implements a slightly different formulation than SVR and LinearSVR.
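
Assuming a small sine-curve toy dataset, the three estimators can be fitted side by side; the hyperparameter values below are illustrative, not tuned:

```python
import numpy as np
from sklearn.svm import SVR, NuSVR, LinearSVR

rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(40, 1), axis=0)
y = np.sin(X).ravel()

models = {
    "SVR": SVR(kernel="rbf"),
    "NuSVR": NuSVR(kernel="rbf", nu=0.5),    # nu bounds the fraction of support vectors
    "LinearSVR": LinearSVR(max_iter=10000),  # linear kernel only, but faster
}
for name, model in models.items():
    model.fit(X, y)
    print(name, model.predict([[2.5]]))
```
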
As with the classification classes, the fit method will take as arguments vectors X, y, except that in this case y is expected to have floating point values instead of integer values:

```python
>>> from sklearn import svm
>>> X = [[0, 0], [2, 2]]
>>> y = [0.5, 2.5]
>>> clf = svm.SVR()
>>> clf.fit(X, y)
SVR(C=1.0, cache_size=200, coef0=0.0, degree=3, epsilon=0.1, gamma='auto',
    kernel='rbf', max_iter=-1, shrinking=True, tol=0.001, verbose=False)
>>> clf.predict([[1, 1]])
array([ 1.5])
```
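
The claim above, that the model depends only on a subset of the training data, can be checked on a fitted estimator: the `support_` attribute holds the indices of the training points that became support vectors. A small sketch on a larger toy dataset (C and epsilon here are illustrative):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(40, 1), axis=0)
y = np.sin(X).ravel()

clf = SVR(kernel="rbf", C=100, epsilon=0.2)
clf.fit(X, y)

# Training points that fall inside the epsilon tube are not support
# vectors, so typically only a subset of the 40 samples appears here.
print(len(clf.support_), "of", len(X), "training points are support vectors")
```
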

Support Vector Regression (SVR) using linear and non-linear kernels:

```python
import numpy as np
from sklearn.svm import SVR
import matplotlib.pyplot as plt

# Generate sample data
X = np.sort(5 * np.random.rand(40, 1), axis=0)
y = np.sin(X).ravel()

# Add noise to targets
y[::5] += 3 * (0.5 - np.random.rand(8))

# Fit regression models
svr_rbf = SVR(kernel='rbf', C=1e3, gamma=0.1)
svr_lin = SVR(kernel='linear', C=1e3)
svr_poly = SVR(kernel='poly', C=1e3, degree=2)
y_rbf = svr_rbf.fit(X, y).predict(X)
y_lin = svr_lin.fit(X, y).predict(X)
y_poly = svr_poly.fit(X, y).predict(X)

# Look at the results
plt.scatter(X, y, c='k', label='data')
plt.plot(X, y_rbf, c='g', label='RBF model')
plt.plot(X, y_lin, c='r', label='Linear model')
plt.plot(X, y_poly, c='b', label='Polynomial model')
plt.xlabel('data')
plt.ylabel('target')
plt.title('Support Vector Regression')
plt.legend()
plt.show()
```
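
To connect this back to the title: forecasting a stock's opening price with SVR usually means building lagged features from the price series. The sketch below uses a synthetic random-walk series in place of real market data, and the lag count, C, and epsilon are illustrative choices, not tuned values:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic "opening price" series; real data would come from a market feed.
rng = np.random.RandomState(42)
opens = 100 + np.cumsum(rng.randn(200))

# Previous 5 opens as features, next open as the target.
lags = 5
X = np.array([opens[i:i + lags] for i in range(len(opens) - lags)])
y = opens[lags:]

# RBF kernels are scale-sensitive, so standardize the features.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10, epsilon=0.1))

# Time-ordered split: fit on the first 150 samples, predict the rest.
split = 150
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print(pred[:3])
```

In practice the hyperparameters would be tuned with time-series cross-validation rather than fixed by hand.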

 

References:
“A Tutorial on Support Vector Regression,” Alex J. Smola and Bernhard Schölkopf, Statistics and Computing, Volume 14, Issue 3, August 2004, pp. 199–222.
