[1]
Liu Q S, Dang C Y, Huang T W. A one-layer recurrent neural network for real-time portfolio optimization with probability criterion. IEEE Transactions on Cybernetics, 2013, 43(1): 14-23
[2]
Lin Y Y, Chang J Y, Lin C T. Identification and prediction of dynamic systems using an interactively recurrent self-evolving fuzzy neural network. IEEE Transactions on Neural Networks and Learning Systems, 2013, 24(2): 310-321
[3]
Lian J, Wang J. Passivity of switched recurrent neural networks with time-varying delays. IEEE Transactions on Neural Networks and Learning Systems, 2015, 26(2): 357-366
[4]
吴玉香, 王聪. 基于确定学习的机器人任务空间自适应神经网络控制. 自动化学报, 2013, 39(6): 806-815
Wu Yu-Xiang, Wang Cong. Deterministic learning based adaptive neural network control of robot in task space. Acta Automatica Sinica, 2013, 39(6): 806-815
[5]
Chandrasekar A, Rakkiyappan R, Cao J D, Lakshmanan S. Synchronization of memristor-based recurrent neural networks with two delay components based on second-order reciprocally convex approach. Neural Networks, 2014, 57: 79-93
[6]
Alhamdoosh M, Wang D H. Fast decorrelated neural network ensembles with random weights. Information Sciences, 2014, 264: 104-117
[7]
Lee Y, Oh S H, Kim M W. An analysis of premature saturation in back propagation learning. Neural Networks, 1993, 6(5): 719-728
[8]
Burse K, Yadav R N, Shrivastava S C. Channel equalization using neural networks: A review. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 2010, 40(3): 352-357
[9]
Pfeifer R, Lungarella M, Iida F. Self-organization, embodiment, and biologically inspired robotics. Science, 2007, 318(5853): 1088-1093
[10]
Schmidhuber J. Deep learning in neural networks: An overview. Neural Networks, 2015, 61: 85-117
[11]
Kriegeskorte N. Deep neural networks: A new framework for modeling biological vision and brain information processing. Annual Review of Vision Science, 2015, 1: 417-446
[12]
Hinton G E, Salakhutdinov R R. Reducing the dimensionality of data with neural networks. Science, 2006, 313(5786): 504-507
[13]
LeCun Y, Bengio Y, Hinton G. Deep learning. Nature, 2015, 521(7553): 436-444
[14]
Wang G, Qiao J, Bi J, Jia Q, Zhou M. An adaptive deep belief network with sparse restricted Boltzmann machines. IEEE Transactions on Neural Networks and Learning Systems, 2020, 31(10): 4217-4228
[15]
乔俊飞, 王功明, 李晓理, 韩红桂, 柴伟. 基于自适应学习率的深度信念网设计与应用. 自动化学报, 2017, 43(8): 1339-1349
Qiao Jun-Fei, Wang Gong-Ming, Li Xiao-Li, Han Hong-Gui, Chai Wei. Design and application of deep belief network with adaptive learning rate. Acta Automatica Sinica, 2017, 43(8): 1339-1349
[16]
Wang G, Jia Q, Qiao J, Bi J, Liu C. A sparse deep belief network with efficient fuzzy learning framework. Neural Networks, 2020, 121: 430-440
[17]
Baldi P, Sadowski P, Whiteson D. Searching for exotic particles in high-energy physics with deep learning. Nature Communications, 2014, 5: 4308
[18]
Lv Y S, Duan Y J, Kang W W, Li Z X, Wang F Y. Traffic flow prediction with big data: A deep learning approach. IEEE Transactions on Intelligent Transportation Systems, 2015, 16(2): 865-873
[19]
Chan T H, Jia K, Gao S H, Lu J W, Zeng Z, Ma Y. PCANet: A simple deep learning baseline for image classification? IEEE Transactions on Image Processing, 2015, 24(12): 5017-5032
[20]
Hinton G E, Osindero S, Teh Y W. A fast learning algorithm for deep belief nets. Neural Computation, 2006, 18(7): 1527-1554
[21]
Sutskever I, Hinton G E. Deep, narrow sigmoid belief networks are universal approximators. Neural Computation, 2008, 20(11): 2629-2636
[22]
Qin Y, Wang X, Zou J Q. The optimized deep belief networks with improved logistic Sigmoid units and their application in fault diagnosis for planetary gearboxes of wind turbines. IEEE Transactions on Industrial Electronics, 2019, 66(5): 3814-3824
[23]
Qiao J F, Wang G M, Li W J, Chen M. An adaptive deep Q-learning strategy for handwritten digit recognition. Neural Networks, 2018, 107: 61-71
[24]
Abdel-Zaher A M, Eldeib A M. Breast cancer classification using deep belief networks. Expert Systems with Applications, 2016, 46: 139-144
[25]
Qiao J F, Wang G M, Li W J, Li X L. A deep belief network with PLSR for nonlinear system modeling. Neural Networks, 2018, 104: 68-79
[26]
Qiao J F, Wang G M, Li X L, Li W J. A self-organizing deep belief network for nonlinear system modeling. Applied Soft Computing, 2018, 65: 170-183
[27]
Wang G M, Qiao J F, Bi J, Li W J, Zhou M C. TL-GDBN: Growing deep belief network with transfer learning. IEEE Transactions on Automation Science and Engineering, 2019, 16(2): 874-885
[28]
Chen Z Y, Li W H. Multisensor feature fusion for bearing fault diagnosis using sparse autoencoder and deep belief network. IEEE Transactions on Instrumentation and Measurement, 2017, 66(7): 1693-1702
[29]
Ranzato M A, Boureau Y L, LeCun Y. Sparse feature learning for deep belief networks. In: Proceedings of the 20th International Conference on Neural Information Processing Systems. Vancouver, British Columbia, Canada: Curran Associates, Inc., 2007. 1185-1192
[30]
Ichimura T, Kamada S. Adaptive learning method of recurrent temporal deep belief network to analyze time series data. In: Proceedings of the 2017 International Joint Conference on Neural Networks. Anchorage, AK, USA: IEEE, 2017. 2346-2353
[31]
Hinton G E. Training products of experts by minimizing contrastive divergence. Neural Computation, 2002, 14(8): 1771-1800
[32]
王功明, 乔俊飞, 王磊. 一种能量函数意义下的生成式对抗网络. 自动化学报, 2018, 44(5): 793-803
Wang Gong-Ming, Qiao Jun-Fei, Wang Lei. A generative adversarial network based on energy function. Acta Automatica Sinica, 2018, 44(5): 793-803
[33]
Goodfellow I J, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, et al. Generative adversarial nets. In: Proceedings of the 27th International Conference on Neural Information Processing Systems. Montreal, Canada: NIPS, 2014. 2672-2680
[34]
Schirrmeister R T, Springenberg J T, Fiederer L D J, Glasstetter M, Eggensperger K, Tangermann M, et al. Deep learning with convolutional neural networks for EEG decoding and visualization. Human Brain Mapping, 2017, 38(11): 5391-5420
[35]
Nguyen A T, Xu J, Luu D K, Zhao Q, Yang Z. Advancing system performance with redundancy: From biological to artificial designs. Neural Computation, 2019, 31(3): 555-573
[36]
Bengio Y. Learning deep architectures for AI. Foundations and Trends® in Machine Learning, 2009, 2(1): 1-127
[37]
Glorot X, Bordes A, Bengio Y. Deep sparse rectifier neural networks. In: Proceedings of the 14th International Conference on Artificial Intelligence and Statistics. Fort Lauderdale, USA: JMLR.org, 2011. 315-323
[38]
Ali M B. Use of dropouts and sparsity for regularization of autoencoders in deep neural networks [Master dissertation], Bilkent University, Bilkent, 2015
[39]
Wright J, Yang A Y, Ganesh A, Sastry S, Ma Y. Robust face recognition via sparse representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2009, 31(2): 210-227
[40]
Lee H, Ekanadham C, Ng A Y. Sparse deep belief net model for visual area V2. In: Proceedings of the 20th International Conference on Neural Information Processing Systems. Vancouver, British Columbia, Canada: Curran Associates, Inc., 2007. 873-880
[41]
Keyvanrad M A, Homayounpour M M. Normal sparse deep belief network. In: Proceedings of the 2015 International Joint Conference on Neural Networks. Killarney, Ireland: IEEE, 2015. 1-7
[42]
Lian R J. Adaptive self-organizing fuzzy sliding-mode radial basis-function neural-network controller for robotic systems. IEEE Transactions on Industrial Electronics, 2014, 61(3): 1493-1503
[43]
Li F J, Qiao J F, Han H G, Yang C L. A self-organizing cascade neural network with random weights for nonlinear system modeling. Applied Soft Computing, 2016, 42: 184-193
[44]
Sarinnapakorn K, Kubat M. Combining subclassifiers in text categorization: A DST-based solution and a case study. IEEE Transactions on Knowledge and Data Engineering, 2007, 19(12): 1638-1651
[45]
Van Opbroek A, Achterberg H C, Vernooij M W, De Bruijne M. Transfer learning for image segmentation by combining image weighting and kernel learning. IEEE Transactions on Medical Imaging, 2019, 38(1): 213-224
[46]
Shin H C, Roth H R, Gao M C, Lu L, Xu Z Y, Nogues I, et al. Deep convolutional neural networks for computer-aided detection: CNN architectures, dataset characteristics and transfer learning. IEEE Transactions on Medical Imaging, 2016, 35(5): 1285-1298
[47]
Long M S, Wang J M, Ding G G, Pan S J, Yu P S. Adaptation regularization: A general framework for transfer learning. IEEE Transactions on Knowledge and Data Engineering, 2014, 26(5): 1076-1089
[48]
Afridi M J, Ross A, Shapiro E M. On automated source selection for transfer learning in convolutional neural networks. Pattern Recognition, 2018, 73: 65-75
[49]
Taylor M E, Stone P. Transfer learning for reinforcement learning domains: A survey. The Journal of Machine Learning Research, 2009, 10: 1633-1685
[50]
Lu J, Behbood V, Hao P, Zuo H, Xue S, Zhang G Q. Transfer learning using computational intelligence: A survey. Knowledge-Based Systems, 2015, 80: 14-23
[51]
Shao L, Zhu F, Li X L. Transfer learning for visual categorization: A survey. IEEE Transactions on Neural Networks and Learning Systems, 2015, 26(5): 1019-1034
[52]
Sutskever I, Hinton G E, Taylor G W. The recurrent temporal restricted Boltzmann machine. In: Proceedings of the 21st International Conference on Neural Information Processing Systems. Vancouver, British Columbia, Canada: Curran Associates, Inc., 2008. 1601-1608
[53]
Fischer A, Igel C. An introduction to restricted Boltzmann machines. In: Proceedings of the 17th Iberoamerican Congress on Pattern Recognition. Buenos Aires, Argentina: Springer, 2012. 14-36
[54]
Srivastava N, Salakhutdinov R R. Multimodal learning with deep Boltzmann machines. In: Proceedings of the 25th International Conference on Neural Information Processing Systems. Lake Tahoe, Nevada, USA: NIPS, 2012. 2222-2230
[55]
Fischer A, Igel C. Training restricted Boltzmann machines: An introduction. Pattern Recognition, 2014, 47(1): 25-39
[56]
Boulanger-Lewandowski N, Bengio Y, Vincent P. Modeling temporal dependencies in high-dimensional sequences: Application to polyphonic music generation and transcription. In: Proceedings of the 29th International Conference on Machine Learning. Edinburgh, Scotland, UK: Icml.cc/Omnipress, 2012. 1881-1888
[57]
Hermans M, Schrauwen B. Training and analyzing deep recurrent neural networks. In: Proceedings of the 26th International Conference on Neural Information Processing Systems. Lake Tahoe, Nevada, USA: NIPS, 2013. 190-198
[58]
Chaturvedi I, Ong Y S, Tsang I W, Welsch R E, Cambria E. Learning word dependencies in text by means of a deep recurrent belief network. Knowledge-Based Systems, 2016, 108: 144-154
[59]
Pascanu R, Gülçehre Ç, Cho K, Bengio Y. How to construct deep recurrent neural networks. In: Proceedings of the 2nd International Conference on Learning Representations. Banff, AB, Canada: ICLR, 2014.
[60]
Mohamed A R, Dahl G E, Hinton G E. Acoustic modeling using deep belief networks. IEEE Transactions on Audio, Speech, and Language Processing, 2012, 20(1): 14-22
[61]
Wang G M, Qiao J F, Li X L, Wang L, Qian X L. Improved classification with semi-supervised deep belief network. IFAC-PapersOnLine, 2017, 50(1): 4174-4179
[62]
Lopes N, Ribeiro B. Improving convergence of restricted Boltzmann machines via a learning adaptive step size. In: Proceedings of the 17th Iberoamerican Congress on Pattern Recognition. Buenos Aires, Argentina: Springer, 2012. 511-518
[63]
Raina R, Madhavan A, Ng A Y. Large-scale deep unsupervised learning using graphics processors. In: Proceedings of the 26th Annual International Conference on Machine Learning. Montreal, Quebec, Canada: ACM, 2009. 873-880
[64]
Sierra-Sosa D, Garcia-Zapirain B, Castillo C, Oleagordia I, Nuño-Solinis R, Urtaran-Laresgoiti M, Elmaghraby A. Scalable healthcare assessment for diabetic patients using deep learning on multiple GPUs. IEEE Transactions on Industrial Informatics, 2019, 15(10): 5682-5689
[65]
Lopes N, Ribeiro B. Towards adaptive learning with improved convergence of deep belief networks on graphics processing units. Pattern Recognition, 2014, 47(1): 114-127
[66]
王功明, 李文静, 乔俊飞. 基于PLSR自适应深度信念网络的出水总磷预测. 化工学报, 2017, 68(5): 1987-1997
Wang Gong-Ming, Li Wen-Jing, Qiao Jun-Fei. Prediction of effluent total phosphorus using PLSR-based adaptive deep belief network. CIESC Journal, 2017, 68(5): 1987-1997
[67]
Belkin M, Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 2003, 15(6): 1373-1396
[68]
Chapelle O, Weston J, Schölkopf B. Cluster kernels for semi-supervised learning. In: Proceedings of the 15th International Conference on Neural Information Processing Systems. Vancouver, British Columbia, Canada: MIT Press, 2003. 601-608
[69]
Larochelle H, Bengio Y. Classification using discriminative restricted Boltzmann machines. In: Proceedings of the 25th International Conference on Machine Learning. Helsinki, Finland: ACM, 2008. 536-543
[70]
Lasserre J A, Bishop C M, Minka T P. Principled hybrids of generative and discriminative models. In: Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. New York, NY, USA: IEEE, 2006. 87-94
[71]
Larochelle H, Erhan D, Bengio Y. Zero-data learning of new tasks. In: Proceedings of the 23rd AAAI Conference on Artificial Intelligence. Chicago, Illinois, USA: AAAI Press, 2008. 646-651
[72]
Sun X C, Li T, Li Q, Huang Y, Li Y Q. Deep belief echo-state network and its application to time series prediction. Knowledge-Based Systems, 2017, 130: 17-29
[73]
Deng Y, Ren Z Q, Kong Y Y, Bao F, Dai Q H. A hierarchical fused fuzzy deep neural network for data classification. IEEE Transactions on Fuzzy Systems, 2017, 25(4): 1006-1012
[74]
Janik L J, Forrester S T, Rawson A. The prediction of soil chemical and physical properties from mid-infrared spectroscopy and combined partial least-squares regression and neural networks (PLS-NN) analysis. Chemometrics and Intelligent Laboratory Systems, 2009, 97(2): 179-188
[75]
He Y L, Geng Z Q, Xu Y, Zhu Q X. A robust hybrid model integrating enhanced inputs based extreme learning machine with PLSR (PLSR-EIELM) and its application to intelligent measurement. ISA Transactions, 2015, 58: 533-542
[76]
Furber S B, Lester D R, Plana L A, Garside J D, Painkras E, Temple S, et al. Overview of the SpiNNaker system architecture. IEEE Transactions on Computers, 2013, 62(12): 2454-2467
[77]
Erhan D, Bengio Y, Courville A, Manzagol P A, Vincent P, Bengio S. Why does unsupervised pre-training help deep learning? The Journal of Machine Learning Research, 2010, 11: 625-660
[78]
Angermueller C, Pärnamaa T, Parts L, Stegle O. Deep learning for computational biology. Molecular Systems Biology, 2016, 12(7): 878
[79]
Min S, Lee B, Yoon S. Deep learning in bioinformatics. Briefings in Bioinformatics, 2017, 18(5): 851-869
[80]
Gharehbaghi A, Lindén M. A deep machine learning method for classifying cyclic time series of biological signals using time-growing neural network. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(9): 4102-4115
[81]
Denil M, Shakibi B, Dinh L, Ranzato M, de Freitas N. Predicting parameters in deep learning. In: Proceedings of the 26th International Conference on Neural Information Processing Systems. Lake Tahoe, Nevada, USA: NIPS, 2013. 2148-2156
[82]
Lenz I, Knepper R, Saxena A. DeepMPC: Learning deep latent features for model predictive control. In: Proceedings of Robotics: Science and Systems XI. Rome, Italy, 2015.