[1] 李界家, 杨志宇, 曹阳. 铁水硅含量的集成模糊神经网络预测方法[J]. 计算机与应用化学, 2013, 30(10):1113-1116. LI J J, YANG Z Y, CAO Y. Integrated fuzzy neural network prediction method for silicon content in hot metal[J]. Computers and Applied Chemistry, 2013, 30(10):1113-1116.
[2] TAKAHASHI H, KAWAI H, KOBAYASHI M, et al. Two dimensional cold model study on unstable solid descending motion and control in blast furnace operation with low reducing agent rate[J]. ISIJ International, 2005, 45(10):1386-1395.
[3] CHU M S, YANG X F, SHEN F M. Numerical simulation of innovative operation of blast furnace based on multi-fluid model[J]. Journal of Iron and Steel Research, International, 2006, 13(6):8-15.
[4] 赵敏. 高炉冶炼过程的复杂机理及其预测研究[D]. 杭州:浙江大学, 2008. ZHAO M. Complexity mechanism and predictive research for BF ironmaking process[D]. Hangzhou:Zhejiang University, 2008.
[5] NOGAMI H, CHU M S, YAGI J. Multi-dimensional transient mathematical simulator of blast furnace process based on multi-fluid and kinetic theories[J]. Computers & Chemical Engineering, 2005, 29(11):2438-2448.
[6] 郜传厚, 渐令, 陈积明, 等. 复杂高炉炼铁过程的数据驱动建模及预测算法[J]. 自动化学报, 2009, 35(6):725-730. GAO C H, JIAN L, CHEN J M, et al. Data-driven modeling and predictive algorithm for complex blast furnace ironmaking process[J]. Acta Automatica Sinica, 2009, 35(6):725-730.
[7] SAXEN H. Short-term prediction of silicon content in pig iron[J]. Canadian Metallurgical Quarterly, 1994, 33(4):319-326.
[8] LUO S H, GAO C H, ZENG J S, et al. Blast furnace system modeling by multivariate phase space reconstruction and neural networks[J]. Asian Journal of Control, 2013, 15(2):553-561.
[9] 安剑奇, 陈易斐, 吴敏. 基于改进支持向量机的高炉一氧化碳利用率预测方法[J]. 化工学报, 2015, 66(1):206-214. AN J Q, CHEN Y F, WU M. A prediction method for carbon monoxide utilization ratio of blast furnace based on improved support vector regression[J]. CIESC Journal, 2015, 66(1):206-214.
[10] 曾燕飞, 李小伟. 基于BP神经网络的高炉铁水硅含量预测模型研究[J]. 控制与测量, 2006, 22(19):291-293. ZENG Y F, LI X W. Research on prediction model of silicon content in blast furnace hot metal based on BP neural network[J]. Control and Measurement, 2006, 22(19):291-293.
[11] 宋菁华, 杨春节, 周哲, 等. 改进型EMD-Elman神经网络在铁水硅含量预测中的应用[J]. 化工学报, 2016, 67(3):729-735. SONG J H, YANG C J, ZHOU Z, et al. Application of improved EMD-Elman neural network to predict silicon content in hot metal[J]. CIESC Journal, 2016, 67(3):729-735.
[12] 蒋朝辉, 董梦林, 桂卫华, 等. 基于Bootstrap的高炉铁水硅含量二维预报[J]. 自动化学报, 2016, 42(5):715-723. JIANG Z H, DONG M L, GUI W H, et al. Two-dimensional prediction for silicon content of hot metal of blast furnace based on Bootstrap[J]. Acta Automatica Sinica, 2016, 42(5):715-723.
[13] 宋贺达, 周平, 王宏, 等. 高炉炼铁过程多元铁水质量非线性子空间建模及应用[J]. 自动化学报, 2016, 42(11):1664-1679. SONG H D, ZHOU P, WANG H, et al. Non-linear subspace modeling of multivariate molten iron quality in blast furnace ironmaking and its application[J]. Acta Automatica Sinica, 2016, 42(11):1664-1679.
[14] ZHOU H, YANG C J, LIU W H, et al. A sliding-window T-S fuzzy neural network model for prediction of silicon content in hot metal[C]//20th IFAC World Congress. IFAC-PapersOnLine, 2017, 50(1):14988-14991.
[15] SENIOR A. Context dependent phone models for LSTM RNN acoustic modeling[C]//IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2015:4585-4589.
[16] LIU C J, WANG Y Q, KUMAR K, et al. Investigations on speaker adaptation of LSTM-RNN models for speech recognition[C]//IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016:5020-5024.
[17] GRAVES A, JAITLY N, MOHAMED A. Hybrid speech recognition with deep bi-directional LSTM[C]//Proceedings of the IEEE Workshop on Automatic Speech Recognition and Understanding, 2013.
[18] SAK H, SENIOR A, BEAUFAYS F. Long short-term memory recurrent neural network architectures for large scale acoustic modeling[C]//Annual Conference of the International Speech Communication Association (Interspeech), 2014:338-342.
[19] LIPTON Z C. Learning to diagnose with LSTM recurrent neural networks[C]//International Conference on Learning Representations (ICLR), 2016.
[20] GRAVES A, SCHMIDHUBER J. Framewise phoneme classification with bidirectional LSTM and other neural network architectures[J]. Neural Networks, 2005, 18(5/6):602-610.
[21] SONG E, KANG H G. Multi-class learning algorithm for deep neural network-based statistical parametric speech synthesis[C]//24th European Signal Processing Conference (EUSIPCO), 2016:1951-1955.
[22] ACHANTA S, GODAMBE T, GANGASHETTY S V. An investigation of recurrent neural network architectures for statistical parametric speech synthesis[C]//Annual Conference of the International Speech Communication Association (Interspeech), 2015:859-863.
[23] ZEN H, SAK H. Unidirectional long short-term memory recurrent neural network with recurrent output layer for low-latency speech synthesis[C]//IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2015:4470-4474.
[24] BYEON W, LIWICKI M, BREUEL T. Texture classification using 2D LSTM networks[C]//International Conference on Pattern Recognition (ICPR), 2014:1144-1149.
[25] BYEON W, BREUEL T M. Supervised texture segmentation using 2D LSTM networks[C]//IEEE International Conference on Image Processing (ICIP), 2014:4373-4377.
[26] PINHEIRO P, COLLOBERT R, JEBARA T, et al. Recurrent convolutional neural networks for scene labeling[C]//Proceedings of the 31st International Conference on Machine Learning (ICML-14), 2014:82-90.
[27] SOCHER R, HUVAL B, BHAT B, et al. Convolutional recursive deep learning for 3D object classification[C]//Advances in Neural Information Processing Systems, 2012:665-673.
[28] GONZALEZ-DOMINGUEZ J, LOPEZ-MORENO I, MORENO P J, et al. Frame by frame language identification in short utterances using deep neural networks[J]. Neural Networks, 2015, 64:49-58.
[29] GERS F A, SCHMIDHUBER J, CUMMINS F. Learning to forget:continual prediction with LSTM[J]. Neural Computation, 2000, 12(10):2451-2471.
[30] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8):1735-1780.
[31] BENGIO Y, SIMARD P, FRASCONI P. Learning long-term dependencies with gradient descent is difficult[J]. IEEE Transactions on Neural Networks, 1994, 5(2):157-166.
[32] GREFF K, SRIVASTAVA R K, KOUTNÍK J, et al. LSTM:a search space odyssey[J]. IEEE Transactions on Neural Networks & Learning Systems, 2017, 28(10):2222-2232.
[33] GERS F A, SCHRAUDOLPH N N, SCHMIDHUBER J. Learning precise timing with LSTM recurrent networks[J]. Journal of Machine Learning Research, 2003, 3(1):115-143.