Section I: Brief Introduction on LDA
Linear Discriminant Analysis (LDA) can be used as a feature-extraction technique to increase computational efficiency and to reduce the degree of overfitting caused by the curse of dimensionality in non-regularized models. The general concept behind LDA is very similar to that of PCA. Whereas PCA attempts to find the orthogonal component axes of maximum variance in a dataset, the goal of LDA is to find the feature subspace that optimizes class separability. In contrast to PCA, LDA is a supervised algorithm.
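To make the supervised/unsupervised contrast concrete, here is a minimal sketch (illustrative only; the use of scikit-learn's built-in Wine data is an assumption, not something named in the text): PCA is fit on the features alone, while LDA also receives the class labels.

# Minimal illustration: PCA ignores class labels, LDA requires them.
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_std = StandardScaler().fit_transform(X)

X_pca = PCA(n_components=2).fit_transform(X_std)                             # unsupervised: only X
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X_std, y)   # supervised: needs y
print(X_pca.shape, X_lda.shape)  # both project onto 2 components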
Personal Views
LDA is a supervised learning algorithm: it uses the class labels to compute the between-class and within-class scatter matrices, and then, much as PCA does, computes the eigenvalues and eigenvectors of the matrix formed from the two. From this point of view, LDA and PCA are somewhat similar.
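To spell out that view, the sketch below (illustrative NumPy code, not the book's listing; using the Wine data here is an assumption) builds the within-class and between-class scatter matrices from the labels and then eigendecomposes inv(S_W) @ S_B, which is exactly where the analogy to PCA's eigendecomposition of the covariance matrix comes from.

# Scatter-matrix view of LDA (illustrative sketch, not the book's code).
import numpy as np
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X = StandardScaler().fit_transform(X)
n_features = X.shape[1]
overall_mean = X.mean(axis=0)

S_W = np.zeros((n_features, n_features))   # within-class scatter
S_B = np.zeros((n_features, n_features))   # between-class scatter
for label in np.unique(y):
    X_c = X[y == label]
    mean_c = X_c.mean(axis=0)
    S_W += (X_c - mean_c).T @ (X_c - mean_c)
    diff = (mean_c - overall_mean).reshape(-1, 1)
    S_B += X_c.shape[0] * (diff @ diff.T)

# The eigenvectors with the largest eigenvalues span the discriminant subspace.
eig_vals, eig_vecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
order = np.argsort(eig_vals.real)[::-1]
W = eig_vecs[:, order[:2]].real             # projection matrix, shape (13, 2)
X_lda = X @ W
print(X_lda.shape)                          # (178, 2)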
FROM
Sebastian Raschka, Vahid Mirjalili. Python Machine Learning, 2nd Edition. Nanjing: Southeast University Press, 2018.
Section II: Code Bundle
Code:
import matplotlib.pyplot as plt          # plotting
from sklearn import datasets             # built-in example datasets (e.g. Wine)
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA  # original line is cut off after "from sklearn"; this completion is an assumption
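The bundle stops at the import list. The continuation below is a hedged sketch that only assumes the chapter's usual flow (Wine data, train/test split, standardization, LDA projection, scatter plot); it is not the book's exact listing.

# Assumed continuation: load Wine data, split, standardize, project with LDA, plot.
wine = datasets.load_wine()
X, y = wine.data, wine.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

sc = StandardScaler()
X_train_std = sc.fit_transform(X_train)
X_test_std = sc.transform(X_test)

lda = LDA(n_components=2)
X_train_lda = lda.fit_transform(X_train_std, y_train)
X_test_lda = lda.transform(X_test_std)

# Scatter plot of the training data in the 2-D discriminant subspace.
for label, marker in zip(np.unique(y_train), ('s', 'x', 'o')):
    plt.scatter(X_train_lda[y_train == label, 0],
                X_train_lda[y_train == label, 1],
                marker=marker, label=f'class {label}')
plt.xlabel('LD 1')
plt.ylabel('LD 2')
plt.legend(loc='best')
plt.tight_layout()
plt.show()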