Understanding What Each SVM Parameter Means

This article describes three SVM types — C-SVC, ν-SVC, and One-Class SVM — and how they are applied to classification and regression problems. It also discusses the four SVM kernel functions, including linear, polynomial, radial basis function (RBF), and sigmoid, and gives the mathematical expression of each. In addition, it explains the meaning and effect of SVM parameters such as GAMMA, COEF, DEGREE, C, P, and NU, helping readers understand how to tune these parameters to optimize model performance.


SVM Types

Classification
C_SVC=100:
C-Support Vector Classification. n-class classification (n ≥ 2); allows imperfect separation of classes, with the penalty multiplier C applied to outliers.

NU_SVC=101:
ν-Support Vector Classification. n-class classification with possible imperfect separation. The parameter ν (in the range 0…1; the larger the value, the smoother the decision boundary) is used instead of C.
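A minimal sketch of how these two classification types map onto OpenCV's Python bindings (cv2.ml); the toy data and the C/ν values below are illustrative assumptions, not taken from the original text:

```python
import numpy as np
import cv2

# Toy 2-class data: two Gaussian blobs (illustrative values only).
rng = np.random.RandomState(0)
samples = np.vstack([rng.randn(50, 2) + 2.0,
                     rng.randn(50, 2) - 2.0]).astype(np.float32)
labels = np.hstack([np.zeros(50), np.ones(50)]).astype(np.int32)

# C-SVC: the penalty multiplier C controls how strongly outliers are punished.
c_svc = cv2.ml.SVM_create()
c_svc.setType(cv2.ml.SVM_C_SVC)
c_svc.setKernel(cv2.ml.SVM_LINEAR)
c_svc.setC(1.0)
c_svc.train(samples, cv2.ml.ROW_SAMPLE, labels)

# ν-SVC: ν in the range (0, 1) replaces C; larger ν gives a smoother decision boundary.
nu_svc = cv2.ml.SVM_create()
nu_svc.setType(cv2.ml.SVM_NU_SVC)
nu_svc.setKernel(cv2.ml.SVM_LINEAR)
nu_svc.setNu(0.3)
nu_svc.train(samples, cv2.ml.ROW_SAMPLE, labels)

_, pred = c_svc.predict(samples)
print("C-SVC training accuracy:", float((pred.ravel() == labels).mean()))
```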

One-Class Estimation
ONE_CLASS=102:
Distribution Estimation (One-class SVM). All the training data are from the same class; the SVM builds a boundary that separates that class from the rest of the feature space.
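A minimal one-class sketch under the same assumptions (OpenCV Python bindings, made-up data and ν/γ values). Note that the Python train() binding still expects a responses array even though ONE_CLASS ignores the class labels:

```python
import numpy as np
import cv2

# All training samples come from a single "normal" class (illustrative data).
rng = np.random.RandomState(1)
normal = rng.randn(200, 2).astype(np.float32)

oc_svm = cv2.ml.SVM_create()
oc_svm.setType(cv2.ml.SVM_ONE_CLASS)
oc_svm.setKernel(cv2.ml.SVM_RBF)
oc_svm.setGamma(0.5)  # RBF kernel width
oc_svm.setNu(0.1)     # roughly the fraction of training points allowed outside the boundary
# ONE_CLASS ignores the label values themselves, but train() requires a responses array.
oc_svm.train(normal, cv2.ml.ROW_SAMPLE, np.ones(len(normal), dtype=np.int32))

# Points far from the training distribution should fall outside the learned boundary.
test = np.array([[0.0, 0.0], [8.0, 8.0]], dtype=np.float32)
_, inside = oc_svm.predict(test)
print(inside.ravel())  # expected roughly [1. 0.]: inside vs. outside the boundary
```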

Regression
EPS_SVR=103:
ε-Support Vector Regression. The distance between feature vectors from the training set and the fitted hyperplane must be less than p. For outliers the penalty multiplier C is used.

NU_SVR=104:
ν-Support Vector Regression. ν is used instead of p.
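A minimal sketch of the two regression types, again assuming OpenCV's Python bindings; the 1-D data and the p/ν/C values are illustrative:

```python
import numpy as np
import cv2

# Toy 1-D regression target: y = 2x + noise (illustrative).
rng = np.random.RandomState(2)
x = np.linspace(0, 1, 100, dtype=np.float32).reshape(-1, 1)
y = (2.0 * x + 0.05 * rng.randn(100, 1)).astype(np.float32)

# ε-SVR: residuals larger than p are penalized with the multiplier C.
eps_svr = cv2.ml.SVM_create()
eps_svr.setType(cv2.ml.SVM_EPS_SVR)
eps_svr.setKernel(cv2.ml.SVM_LINEAR)
eps_svr.setP(0.1)   # width of the ε-insensitive tube
eps_svr.setC(1.0)
eps_svr.train(x, cv2.ml.ROW_SAMPLE, y)

# ν-SVR: ν replaces p and implicitly controls the tube width.
nu_svr = cv2.ml.SVM_create()
nu_svr.setType(cv2.ml.SVM_NU_SVR)
nu_svr.setKernel(cv2.ml.SVM_LINEAR)
nu_svr.setNu(0.5)
nu_svr.setC(1.0)
nu_svr.train(x, cv2.ml.ROW_SAMPLE, y)

_, y_hat = eps_svr.predict(x)
print("ε-SVR mean absolute error:", float(np.mean(np.abs(y_hat - y))))
```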

SVM KernelType

CUSTOM=-1:
Returned by SVM::getKernelType when a custom kernel has been set.

LINEAR=0:
Linear kernel. No mapping is done; linear discrimination (or regression) is done in the original feature space. It is the fastest option. $K(x_i, x_j) = x_i^T x_j$.

POLY=1:
Polynomial kernel: $K(x_i, x_j) = (\gamma x_i^T x_j + \mathrm{coef0})^{\mathrm{degree}}$.
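To make the parameter names concrete, a short sketch of how the kernel type and its parameters (gamma, coef0, degree) are set through OpenCV's Python bindings; the values are placeholders:

```python
import cv2

svm = cv2.ml.SVM_create()

# Linear kernel: no mapping, discrimination happens in the original feature space.
svm.setKernel(cv2.ml.SVM_LINEAR)
print(svm.getKernelType() == cv2.ml.SVM_LINEAR)  # True

# Polynomial kernel: K(x_i, x_j) = (gamma * x_i^T x_j + coef0)^degree.
svm.setKernel(cv2.ml.SVM_POLY)
svm.setGamma(0.5)   # gamma: scaling of the inner product
svm.setCoef0(1.0)   # coef0: additive constant inside the kernel
svm.setDegree(3)    # degree: exponent of the polynomial
print(svm.getKernelType() == cv2.ml.SVM_POLY)    # True
```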
