[1] Liu S, Motani M. Feature selection based on unique relevant information for health data[EB/OL]. (2018-12-02). https://arXiv.org/abs/1812.00415.
[2] Davis J C, Sampson R J. Machine learning feature selection methods for landslide susceptibility mapping[J]. Mathematical Geosciences, 2013, 46(1): 33-57.
[3] Li J, Hu X, Wu L, et al. Robust unsupervised feature selection on networked data[C]// ICDM. [S.l.]: SIAM, 2016: 387-395.
[4] Wang S, Tang J, Liu H. Embedded unsupervised feature selection[C]// AAAI. [S.l.]: AAAI Press, 2015: 471-476.
[5] Guyon I, Gunn S, Nikravesh M, et al. Feature extraction: Foundations and applications[M]. [S.l.]: Springer, 2008: 1-22.
[6] Chandrashekar G, Sahin F. A survey on feature selection methods[J]. Computers & Electrical Engineering, 2014, 40(1): 16-28.
[7] Miao J, Niu L. A survey on feature selection[J]. Procedia Computer Science, 2016(91): 919-926.
[8] Luo M, Nie F, Chang X, et al. Adaptive unsupervised feature selection with structure regularization[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(4): 944-956.
[9] He X, Cai D, Niyogi P. Soft-constrained Laplacian score for semi-supervised multi-label feature selection[J]. Knowledge and Information Systems, 2016, 47(1): 75-98.
[10] Brown G, Pocock A, Zhao M, et al. Conditional likelihood maximisation: A unifying framework for information theoretic feature selection[J]. Journal of Machine Learning Research, 2012, 13(1): 27-66.
[11] Liu H, Motoda H. Computational methods of feature selection[M]. [S.l.]: CRC Press, 2007: 147-165.
[12] Yu L, Liu H. Efficient feature selection via analysis of relevance and redundancy[J]. Journal of Machine Learning Research, 2004, 5: 1205-1224.
[13] Zhu J, Rosset S, Tibshirani R, et al. ℓ1-norm support vector machines[C]// NIPS. [S.l.]: MIT Press, 2004: 49-56.
[14] Tibshirani R. Regression shrinkage and selection via the lasso: A retrospective[J]. Journal of the Royal Statistical Society (Series B), 2011, 73(3): 273-282.
[15] Bach F R. Consistency of the group lasso and multiple kernel learning[J]. Journal of Machine Learning Research, 2008, 9: 1179-1225.
[16] Shi L, Du L, Shen Y D. Robust spectral learning for unsupervised feature selection[C]// ICDM. [S.l.]: SIAM, 2014: 977-982.
[17] Kabir M, Shahjahan M, Murase K. New local search based hybrid genetic algorithm for feature selection[J]. Neurocomputing, 2011(74): 2914-2928.
[18] Cai D, Zhang C, He X. Unsupervised feature selection for multi-cluster data[C]// KDD. New York: ACM, 2010: 333-342.
[19] Das K, Samanta S, Pal M. Study on centrality measures in social networks: A survey[EB/OL]. (2018-02-28). Social Network Analysis and Mining, https://link.springer.com/article/10.1007/s13278-018-0493-2.
[20] Lazar C, Taminau J, Meganck S, et al. A survey on filter techniques for feature selection in gene expression microarray analysis[J]. IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2012, 9(4): 1106-1119.
[21] Li J, Cheng K, Wang S, et al. Feature selection: A data perspective[EB/OL]. (2016-01-29). https://arXiv.org/abs/1601.07996.
[22] Zhao Z, Liu H. Spectral feature selection for supervised and unsupervised learning[C]// ICML. New York: ACM, 2007: 1151-1157.
[23] Zhu P, Zhu W, Hu Q. Subspace clustering guided unsupervised feature selection[J]. Pattern Recognition, 2017(66): 364-374.
[24] Yang Y, Shen H T, Ma Z, et al. ℓ2,1-norm regularized discriminative feature selection for unsupervised learning[C]// IJCAI. [S.l.]: AAAI Press, 2011: 1589-1594.
[25] Li Z, Yang Y, Liu J, et al. Unsupervised feature selection using nonnegative spectral analysis[C]// AAAI. [S.l.]: AAAI Press, 2012: 1026-1032.
[26] Zañudo J G T, Yang G, Albert R. Structural control of nonlinear complex networks[J]. Proceedings of the National Academy of Sciences, 2017, 114(28): 7234-7239.