References:
[1]PENG Y, DONG M, ZUO M J. Current status of machine prognostics in condition-based maintenance: a review[J]. The International Journal of Advanced Manufacturing Technology, 2010,50(1):297-313. doi:10.1007/s00170-009-2482-0.
[2]LU C Y, MIN H, GUI J, et al. Face recognition via weighted sparse representation[J]. Journal of Visual Communication and Image Representation, 2013,24(2): 111-116.
[3]HUANG P S, HE X, GAO J, et al. Learning deep structured semantic models for web search using clickthrough data[C]// Proceedings of the 22nd ACM International Conference on Information & Knowledge Management. [S.l.: s.n.],2013.
[4]KOUROU K, EXARCHOS T P, EXARCHOS K P, et al. Machine learning applications in cancer prognosis and prediction[J]. Computational and Structural Biotechnology Journal, 2015,13:8-17.
[5]SILVER D, HUANG A, MADDISON C J, et al. Mastering the game of Go with deep neural networks and tree search[J]. Nature, 2016,529(7587): 484-489.
[6]LECUN Y, BENGIO Y, HINTON G. Deep learning[J]. Nature, 2015,521(7553):436-444.
[7]GAN M, WANG C. Construction of hierarchical diagnosis network based on deep learning and its application in the fault pattern recognition of rolling element bearings[J]. Mechanical Systems and Signal Processing, 2016, 72:92-104.
[8]GOODFELLOW I, BENGIO Y, COURVILLE A. Deep learning[M]. [S.l.]: MIT Press,2016.
[9]MCCARTHY J, MINSKY M L, ROCHESTER N, et al. A proposal for the Dartmouth summer research project on artificial intelligence[J]. AI Magazine, 2006,27(4):12.
[10]CANTU-ORTIZ F J. Advancing artificial intelligence research and dissemination through conference series: Benchmark, scientific impact and the MICAI experience[J]. Expert Systems with Applications, 2014,41(3):781-785.
[11]MICHALSKI R, KODRATOFF Y. Research in machine learning: recent progress, classification of methods, and future directions[M]// Machine Learning: An Artificial Intelligence Approach: Vol. III. [S.l.]: Morgan Kaufmann, 1990: 3-30.
[12]PALM R B. Prediction as a candidate for learning deep hierarchical models of data[D]. [S.l.]: Technical University of Denmark, 2012.
[13]HOCHREITER S. Untersuchungen zu dynamischen neuronalen Netzen[D]. München: Technische Universität München, 1991.
[14]SCHMIDHUBER J.Deep learning in neural networks: an overview[J]. Neural Networks, 2015,61: 85-117.
[15]HOCHREITER S, SCHMIDHUBER J. Bridging long time lags by weight guessing and “Long Short-Term Memory”[M]// Spatiotemporal Models in Biological and Artificial Systems. [S.l.]: IOS Press, 1996: 65-72.
[16]DENG L. Three classes of deep learning architectures and their applications: a tutorial survey[C]// APSIPA Transactions on Signal and Information Processing. [S.l.:s.n.],2013.
[17]WANG K S. Applied computational intelligence in intelligent manufacturing systems[M].[S.l.]: Advanced Knowledge International Pty,2005.
[18]ZHANG Z. Data mining approaches for intelligent condition-based maintenance: a framework of intelligent fault diagnosis and prognosis system (IFDPS)[D]. Trondheim: Norwegian University of Science and Technology, 2014.
[19]KRAUSS C, DO X A, HUCK N. Deep neural networks, gradient-boosted trees, random forests: statistical arbitrage on the S&P 500[J]. European Journal of Operational Research, 2017,259(2):689-702.
[20]NIELSEN M A. Neural networks and deep learning[M].[S.l.]: Determination Press,2015.
[21]SUTSKEVER I, MARTENS J, DAHL G, et al. On the importance of initialization and momentum in deep learning[C]// Proceedings of the 30th International Conference on Machine Learning. [S.l.:s.n.],2013.
[22]RECHT B, RE C, WRIGHT S, et al. Hogwild!: a lock-free approach to parallelizing stochastic gradient descent[C]// Advances in Neural Information Processing Systems. [S.l.:s.n.], 2011.
[23]GLOROT X, BENGIO Y. Understanding the difficulty of training deep feedforward neural networks[C]// Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics. [S.l.:s.n.],2010.
[24]CANDEL A, PARMAR V, LEDELL E, et al. Deep learning with H2O[M]. [S.l.]: H2O.ai, 2015.
[25]LECUN Y A, BOTTOU L, ORR G B, et al. Efficient backprop[C]// Neural Networks: Tricks of the Trade. [S.l.]:Springer, 2012: 9-48.
[26]GOODFELLOW I J, WARDE-FARLEY D, MIRZA M, et al. Maxout networks[C]// Proceedings of the 30th International Conference on Machine Learning. [S.l.:s.n.],2013.
[27]YANG D M, STRONACH A, MACCONNELL P, et al. Third-order spectral techniques for the diagnosis of motor bearing condition using artificial neural networks[J]. Mechanical Systems and Signal Processing, 2002,16(2/3):391-411.
[28]JAFAR R, SHAHROUR I, JURAN I. Application of artificial neural networks (ANN) to model the failure of urban water mains[J]. Mathematical and Computer Modelling, 2010,51(9):1170-1180.
[29]SUN Y J, ZHANG S, MIAO C X, et al. Improved BP neural network for transformer fault diagnosis[J]. Journal of China University of Mining and Technology, 2007,17(1):138-142.
[30]DIN M U, MARNERIDES A K. Short term power load forecasting using deep neural networks[C]// 2017 International Conference on Computing, Networking and Communications (ICNC). [S.l.]: IEEE, 2017.
[31]WANG L, ZHANG Z, LONG H, et al. Wind turbine gearbox failure identification with deep neural networks[J]. IEEE Transactions on Industrial Informatics, 2017, 13(3): 1360-1368.
[32]VINCENT P, LAROCHELLE H, BENGIO Y, et al. Extracting and composing robust features with denoising autoencoders[C]// Proceedings of the 25th International Conference on Machine Learning. [S.l.:s.n.],2008.
[33]BENGIO Y, LAMBLIN P, POPOVICI D, et al. Greedy layer-wise training of deep networks[C]// Advances in Neural Information Processing Systems. [S.l.:s.n.],2007.
[34]POULTNEY C, CHOPRA S, CUN Y L. Efficient learning of sparse representations with an energy-based model[C]// Advances in Neural Information Processing Systems. [S.l.:s.n.],2007.
[35]BENGIO Y, COURVILLE A, VINCENT P. Representation learning: a review and new perspectives[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013,35(8):1798-1828.
[36]ERHAN D, BENGIO Y, COURVILLE A, et al. Why does unsupervised pre-training help deep learning?[J]. Journal of Machine Learning Research, 2010,11(2):625-660.
[37]GALLOWAY G S, CATTERSON V M, FAY T, et al. Diagnosis of tidal turbine vibration data through deep neural networks[C]// Proceedings of the Third European Conference of the Prognostics and Health Management Society.[S.l.]:PHM Society, 2016:172-180.
[38]SHIN H C, ORTON M R, COLLINS D J, et al. Stacked autoencoders for unsupervised feature learning and multiple organ detection in a pilot study using 4D patient data[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(8):1930-1943.
[39]ZABALZA J, REN J, ZHENG J, et al. Novel segmented stacked autoencoder for effective dimensionality reduction and feature extraction in hyperspectral imaging[J]. Neurocomputing, 2016,185:1-10.
[40]HINTON G E, SALAKHUTDINOV R R. Reducing the dimensionality of data with neural networks[J]. Science, 2006,313(5786):504-507.
[41]JIA F, LEI Y, LIN J, et al. Deep neural networks: a promising tool for fault characteristic mining and intelligent diagnosis of rotating machinery with massive data[J].Mechanical Systems and Signal Processing, 2016,72:303-315.
[42]KEYVANRAD M A, HOMAYOUNPOUR M M. A brief survey on deep belief networks and introducing a new object oriented MATLAB toolbox (DeeBNet V2.1)[M]. [S.l.]: ArXiv, 2014.
[43]YU D, DENG L. Deep learning and its applications to signal and information processing [Exploratory DSP][J]. IEEE Signal Processing Magazine, 2011,28(1):145-154.
[44]HINTON G. A practical guide to training restricted Boltzmann machines[R]. Toronto: University of Toronto, 2010.
[45]HINTON G E. Training products of experts by minimizing contrastive divergence[J]. Neural Computation, 2002,14(8):1771-1800.
[46]CARREIRA-PERPINAN M A, HINTON G. On contrastive divergence learning[C]// AISTATS. [S.l.:s.n.],2005.
[47]HINTON G, DENG L, YU D, et al. Deep neural networks for acoustic modeling in speech recognition: the shared views of four research groups[J]. IEEE Signal Processing Magazine, 2012,29(6): 82-97.
[48]BENGIO Y. Learning deep architectures for AI[J]. Foundations and Trends in Machine Learning, 2009,2(1):1-127.
[49]O’CONNOR P, NEIL D, LIU S C, et al. Real-time classification and sensor fusion with a spiking deep belief network[J]. Frontiers in Neuroscience, 2013, 7:178.
[50]TAMILSELVAN P, WANG P. Failure diagnosis using deep belief learning based health state classification[J]. Reliability Engineering & System Safety, 2013,115:124-135.
[51]KUREMOTO T, KIMURA S, KOBAYASHI K, et al. Time series forecasting using a deep belief network with restricted Boltzmann machines[J]. Neurocomputing, 2014,137:47-56.
[52]HUANG W, SONG G, HONG H,et al. Deep architecture for traffic flow prediction: deep belief networks with multitask learning[J]. IEEE Transactions on Intelligent Transportation Systems, 2014,15(5):2191-2201.
[53]RIBEIRO B, LOPES N. Deep belief networks for financial prediction[C]// Neural Information Processing. [S.l.:s.n.],2011.
[54]CHEN J, JIN Q, CHAO J. Design of deep belief networks for short-term prediction of drought index using data in the Huaihe river basin[J]. Mathematical Problems in Engineering, 2012(2):1-16.
[55]MA Q, TANIGAWA I, MURATA M. Retrieval term prediction using deep belief networks[C]// PACLIC. [S.l.:s.n.],2014.
[56]HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997,9(8):1735-1780. doi:10.1162/neco.1997.9.8.1735.
[57]ZHAO R, YAN R, CHEN Z, et al. Deep learning and its applications to machine health monitoring: a survey[M].[S.l.]: ArXiv,2016.
[58]GREFF K, SRIVASTAVA R K, KOUTNÍK J, et al. LSTM: a search space odyssey[J]. IEEE Transactions on Neural Networks and Learning Systems,2016(10):1-12.
[59]ZHAO R, YAN R, WANG J, et al. Learning to monitor machine health with convolutional bi-directional LSTM networks[J]. Sensors, 2017,17(2): 273.
[60]LIAO L, AHN H I. Combining deep learning and survival analysis for asset health management[J]. International Journal of Prognostics and Health Management, 2016,16(20):1-12.
[61]SAK H, SENIOR A, BEAUFAYS F. Long short-term memory recurrent neural network architectures for large scale acoustic modeling[C]// Proceedings of the Fifteenth Annual Conference of the International Speech Communication Association. [S.l.:s.n.],2014.
[62]DE BRUIN T, VERBERT K, BABUŠKA R. Railway track circuit fault diagnosis using recurrent neural networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2017,28(3):523-533.
[63]GRAVES A, MOHAMED A R, HINTON G. Speech recognition with deep recurrent neural networks[C]// 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). [S.l.]: IEEE, 2013.
[64]PALANGI H, DENG L, SHEN Y, et al. Deep sentence embedding using long short-term memory networks: analysis and application to information retrieval[J]. IEEE/ACM Transactions on Audio, Speech and Language Processing (TASLP), 2016,24(4):694-707.
[65]HANSON J, YANG Y, PALIWAL K, et al. Improving protein disorder prediction by deep bidirectional long short-term memory recurrent neural networks[J]. Bioinformatics, 2016,33(5):685-692.
[66]SAK H, SENIOR A W. Processing acoustic sequences using long short-term memory (LSTM) neural networks that include recurrent projection layers[P]. [S.l.]: Google Patents, 2017.
[67]ZHAO R, WANG J, YAN R, et al. Machine health monitoring with LSTM networks[C]// 2016 10th International Conference on Sensing Technology (ICST). [S.l.:s.n.],2016.
[68]MALHOTRA P, TV V, RAMAKRISHNAN A, et al. Multi-sensor prognostics using an unsupervised health index based on LSTM encoder-decoder[M]. [S.l.]: ArXiv, 2016.
[69]CIRESAN D, GIUSTI A, GAMBARDELLA L M, et al. Deep neural networks segment neuronal membranes in electron microscopy images[C]// Advances in Neural Information Processing Systems. [S.l.:s.n.],2012.
[70]DEAN J, CORRADO G, MONGA R, et al. Large scale distributed deep networks[C]// Advances in Neural Information Processing Systems. [S.l.:s.n.],2012.
[71]LECUN Y, BOTTOU L, BENGIO Y, et al. Gradient-based learning applied to document recognition[J]. Proceedings of the IEEE, 1998,86(11): 2278-2324.
[72]JANSSENS O, SLAVKOVIKJ V, VERVISCH B, et al. Convolutional neural network based fault detection for rotating machinery[J]. Journal of Sound and Vibration, 2016,377:331-345.
[73]CHEN Z Q, LI C, SANCHEZ R V. Gearbox fault identification and classification with convolutional neural networks[J]. Shock and Vibration,2015,2015:1-10.
[74]BABU G S, ZHAO P, LI X L. Deep convolutional neural network based regression approach for estimation of remaining useful life[C]// International Conference on Database Systems for Advanced Applications. [S.l.:s.n.],2016.