Section I: A Brief Introduction to SVM
Another powerful and widely used learning algorithm is the Support Vector Machine (SVM), which can be considered an extension of the perceptron. With the perceptron algorithm, the quantity we optimize is the number of misclassification errors. In SVM, by contrast, the optimization objective is to maximize the margin. The margin is defined as the distance between the separating hyperplane (decision boundary) and the training samples that are closest to this hyperplane, the so-called support vectors.
From: Sebastian Raschka, Vahid Mirjalili. Python Machine Learning, 2nd Edition (Chinese edition). Nanjing: Southeast University Press, 2018.
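To make the margin concrete, here is a minimal sketch (not taken from the book) that fits a linear SVC from scikit-learn on a small, made-up toy set and reports the margin width; for a linear SVM the margin equals 2/||w||, where w is the weight vector of the separating hyperplane. The toy arrays X_toy/y_toy and the choice C=1.0 are assumptions for illustration only.

import numpy as np
from sklearn.svm import SVC

# Hypothetical, linearly separable toy data (two clusters)
X_toy = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],
                  [4.0, 4.0], [5.0, 4.5], [4.5, 5.0]])
y_toy = np.array([0, 0, 0, 1, 1, 1])

svm_toy = SVC(kernel='linear', C=1.0)
svm_toy.fit(X_toy, y_toy)

w = svm_toy.coef_[0]                 # weight vector of the separating hyperplane
margin = 2.0 / np.linalg.norm(w)     # distance between the two margin hyperplanes
print('support vectors:\n', svm_toy.support_vectors_)
print('margin width: %.3f' % margin)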
Section II: Apply It to a Linearly Separable Scenario
import matplotlib.pyplot as plt
import numpy as np
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
# Local helper module that plots decision regions and highlights the test samples
from SupportVectorMachine.visualize_test_idx import plot_decision_regions
# Figure resolution and font settings for the plots
plt.rcParams['figure.dpi'] = 200
plt.rcParams['savefig.dpi'] = 200
font = {'family': 'Times New Roman',
        'weight': 'light'}
plt.rc("font", **font)
## Section 1: Load data and split it into train/test dataset
iris = datasets.load_iris()
X = iris.data[:, [2, 3]]    # use only petal length and petal width as features
y = iris.target
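The imports above already name the remaining steps of the usual workflow for this linearly separable case: split the data, standardize the features, fit a linear SVC, and visualize the decision regions. The following is a hedged sketch of those steps; the parameter values (test_size=0.3, random_state=1, C=1.0) and the classifier=/test_idx= keyword arguments of the local plot_decision_regions helper are assumptions, not values confirmed by the original code.

## Sketch of the remaining steps (assumed parameters)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

# Standardize features with statistics estimated on the training set only
sc = StandardScaler()
sc.fit(X_train)
X_train_std = sc.transform(X_train)
X_test_std = sc.transform(X_test)

# Train a linear SVM on the standardized training data
svm = SVC(kernel='linear', C=1.0, random_state=1)
svm.fit(X_train_std, y_train)

# Plot decision regions; test_idx marks the test samples (assumed helper signature)
X_combined_std = np.vstack((X_train_std, X_test_std))
y_combined = np.hstack((y_train, y_test))
plot_decision_regions(X_combined_std, y_combined, classifier=svm,
                      test_idx=range(y_train.shape[0], y_combined.shape[0]))
plt.xlabel('petal length [standardized]')
plt.ylabel('petal width [standardized]')
plt.legend(loc='upper left')
plt.show()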