Section I: Brief Introduction on PCA
PCA helps us identify patterns in data based on the correlation between features. In a nutshell, PCA finds the directions of maximum variance in high-dimensional data and projects the data onto a new subspace with equal or fewer dimensions than the original one. The orthogonal axes (principal components) of the new subspace can be interpreted as the directions of maximum variance, given the constraint that the new feature axes are orthogonal to each other.
From: Sebastian Raschka, Vahid Mirjalili. Python Machine Learning, 2nd Edition (Chinese edition). Nanjing: Southeast University Press, 2018.
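The variance-maximization idea described above can be sketched with a plain NumPy eigendecomposition of the covariance matrix. The toy data below is an illustration (not from the book); the top eigenvector is the first principal component, and the components are orthogonal by construction:

```python
import numpy as np

# Toy 2-D data with correlated features (illustrative assumption)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[2.0, 0.0], [1.0, 0.5]])

# Center the data, then eigendecompose the sample covariance matrix
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# Sort components by explained variance, descending
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the first principal component (direction of maximum variance);
# the variance of this projection equals the largest eigenvalue
X_proj = Xc @ eigvecs[:, :1]
```

Sorting by eigenvalue is what lets us keep only the top-k components later: the discarded directions carry the least variance.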
Section II: Code Bundle
import numpy as np
import matplotlib.pyplot as plt
from sklearn import datasets
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.decomposition import PCA
from PCA.visualize import plot_decision_regions  # project-local helper module

# Section 1: Prepare data
plt.rcParams['figure.dpi'] = 200  # render figures at higher resolution