CIESC Journal ›› 2022, Vol. 73 ›› Issue (1): 342-351. DOI: 10.11949/0438-1157.20211104
• Process system engineering •
Zhuang YUAN 1, Yiqun LING 2, Zhe YANG 1, Chuankun LI 1
Received: 2021-08-09
Revised: 2021-10-20
Online: 2022-01-18
Published: 2022-01-05
Contact: Zhe YANG
About the author: Zhuang YUAN (1991—), male, PhD, engineer
Supported by:
CLC Number:
Zhuang YUAN, Yiqun LING, Zhe YANG, Chuankun LI. Critical parameters prediction based on TA-ConvBiLSTM for chemical process[J]. CIESC Journal, 2022, 73(1): 342-351.
Symbol | Variable | Unit | Tag No. | XGBoost importance |
---|---|---|---|---|
x1 | Local furnace tube temperature | ℃ | y2jTI2012A | 539 |
x2 | Furnace feed flow rate Ⅰ | t/h | y2jFI2007A | 205 |
x3 | Furnace inlet temperature | ℃ | y2jTI2003 | 178 |
x4 | Furnace feed pressure Ⅰ | MPa | y2jPI2013B | 82 |
x5 | Furnace feed pressure Ⅱ | MPa | y2jPI2013A | 68 |
x6 | Furnace feed flow rate Ⅱ | t/h | y2jFI2007C | 26 |
x7 | Furnace feed pressure Ⅲ | MPa | y2jPI2014C | 24 |
x8 | Furnace feed pressure Ⅳ | MPa | y2jPI2013C | 14 |
x9 | Furnace outlet temperature | ℃ | y2jTI2009B | 13 |
x10 | Steam flow rate | t/h | y2jFIC2002B | 13 |
Table 1 Some candidate correlated variables and their feature importance
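Table 1 ranks the candidate correlated variables by XGBoost feature importance, which is how the input tags for the prediction model were screened. The sketch below illustrates this screening step; it is a minimal, hypothetical example (the DataFrame `history_df`, the `target` column name, and the choice of "weight" importance are assumptions made to match the integer scores in the table, not the authors' released code).

```python
# Hedged sketch: rank candidate DCS tags by XGBoost importance, as in Table 1.
# `history_df` and `target` are hypothetical names; importance_type="weight"
# (split counts) is assumed because the scores in Table 1 are integers.
import pandas as pd
from xgboost import XGBRegressor

def rank_candidate_variables(history_df: pd.DataFrame, target: str) -> pd.Series:
    X = history_df.drop(columns=[target])
    y = history_df[target]
    model = XGBRegressor(n_estimators=50, learning_rate=0.1, max_depth=15)
    model.fit(X, y)
    # "weight" counts how often each tag is used as a split variable
    scores = model.get_booster().get_score(importance_type="weight")
    return pd.Series(scores).sort_values(ascending=False)
```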
No. | Network structure | MAE | RMSE | R² |
---|---|---|---|---|
1 | | 0.0179 | 0.0242 | 0.975 |
2 | | 0.0271 | 0.0349 | 0.947 |
3 | | 0.0185 | 0.0269 | 0.969 |
4 | | 0.0186 | 0.258 | 0.971 |
5 | | 0.0151 | 0.0216 | 0.980 |
6 | | 0.0190 | 0.0254 | 0.972 |
7 | | 0.0196 | 0.0264 | 0.970 |
8 | | 0.0222 | 0.0302 | 0.961 |
9 | | 0.0166 | 0.0235 | 0.976 |
Table 2 Model prediction accuracy under different network structures
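Tables 2 through 6 all report prediction accuracy as MAE, RMSE, and R². A minimal sketch of how these three metrics are computed on a held-out test set is given below; `y_true` and `y_pred` are hypothetical arrays, and the use of scikit-learn is an assumption rather than part of the paper.

```python
# Hedged sketch of the three accuracy metrics reported in Tables 2-6.
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def report_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    mae = mean_absolute_error(y_true, y_pred)
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))  # RMSE = sqrt(MSE)
    r2 = r2_score(y_true, y_pred)
    return {"MAE": mae, "RMSE": rmse, "R2": r2}
```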
Pooling strategy | Pooling size | MAE | RMSE | R² |
---|---|---|---|---|
No pooling layer | | 0.0151 | 0.0216 | 0.980 |
max-pooling | 2 | 0.0189 | 0.0260 | 0.971 |
 | 3 | 0.0232 | 0.0311 | 0.958 |
 | 4 | 0.0258 | 0.0362 | 0.943 |
mean-pooling | 2 | 0.0209 | 0.0286 | 0.965 |
 | 3 | 0.0271 | 0.0354 | 0.946 |
 | 4 | 0.0316 | 0.0431 | 0.920 |
Table 3 Model prediction accuracy under different pooling sizes
Batch size | MAE | RMSE | R² |
---|---|---|---|
64 | 0.0179 | 0.0237 | 0.976 |
128 | 0.0151 | 0.0216 | 0.980 |
256 | 0.0180 | 0.0245 | 0.974 |
512 | 0.0186 | 0.0262 | 0.970 |
1024 | 0.0204 | 0.0278 | 0.967 |
Table 4 Model prediction accuracy under different batch sizes
(a) Input
Layer type | b | k | p | Output |
---|---|---|---|---|
Input layer | 48 | 10 | 1 | 48×10 |
(b) CNN
Layer type | No. of kernels | Kernel size | Rate | Output |
---|---|---|---|---|
Convolutional layer | 64 | 1×1 | | 48×64 |
Convolutional layer | 96 | 1×3 | | 46×96 |
Convolutional layer | 128 | 1×5 | | 42×128 |
Dropout | | | 0.3 | 42×128 |
(c) BiLSTM
Layer type | No. of units | Rate | Output |
---|---|---|---|
BiLSTM | 128 | | 42×256 |
BiLSTM | 256 | | 42×512 |
Dropout | | 0.2 | 42×512 |
(d) XGBoost
Base learner | Learning rate | Iterations | Penalty term | Max depth |
---|---|---|---|---|
gbtree | 0.1 | 50 | 0 | 15 |
(e) SVR
Kernel | Degree | Tolerance | Penalty C | Kernel coefficient g |
---|---|---|---|---|
rbf | 3 | 10⁻³ | 4 | auto |
(f) BPNN
No. of neurons | Activation | Optimizer | Penalty term | Iterations |
---|---|---|---|---|
100 | ReLU | Adam | 10⁻⁴ | 1000 |
Table 5 Network structures and parameter settings
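Table 5(a)-(c) summarizes the convolutional and BiLSTM layers of the proposed model. The sketch below rebuilds that backbone in Keras under stated assumptions: unpadded ('valid') convolutions reproduce the 48→46→42 sequence-length reduction in the table, the ReLU activations and the pooling/Dense output head are placeholders, and the paper's temporal attention (TA) module is not reproduced because its exact form is not given on this page.

```python
# Hedged sketch of the Conv + BiLSTM backbone in Table 5, built with Keras.
# Input: a 48-step window of 10 screened tags (Table 1). Layer widths, dropout
# rates, and output shapes follow Table 5; everything else is an assumption.
from tensorflow.keras import layers, models

def build_conv_bilstm(window: int = 48, n_tags: int = 10) -> models.Model:
    inp = layers.Input(shape=(window, n_tags))                            # 48 x 10
    x = layers.Conv1D(64, 1, activation="relu")(inp)                      # 48 x 64
    x = layers.Conv1D(96, 3, activation="relu")(x)                        # 46 x 96
    x = layers.Conv1D(128, 5, activation="relu")(x)                       # 42 x 128
    x = layers.Dropout(0.3)(x)
    x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)  # 42 x 256
    x = layers.Bidirectional(layers.LSTM(256, return_sequences=True))(x)  # 42 x 512
    x = layers.Dropout(0.2)(x)
    x = layers.GlobalAveragePooling1D()(x)  # placeholder for the TA weighting step
    out = layers.Dense(1)(x)                # predicted critical process parameter
    return models.Model(inp, out)

model = build_conv_bilstm()
model.compile(optimizer="adam", loss="mse")
# model.fit(X_train, y_train, batch_size=128, epochs=...)  # batch size 128 per Table 4
```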
No. | Category | Model | MAE | RMSE | R² |
---|---|---|---|---|---|
1 | TA mechanism | TA-ConvBiLSTM | 0.0151 | 0.0216 | 0.980 |
2 | | TA-BiLSTM | 0.0185 | 0.0251 | 0.973 |
3 | Deep networks | BiLSTM | 0.0219 | 0.0294 | 0.963 |
4 | | ConvBiLSTM | 0.0240 | 0.0318 | 0.956 |
5 | | LSTM | 0.0264 | 0.0351 | 0.947 |
6 | | CNN | 0.0323 | 0.0417 | 0.925 |
7 | Shallow networks | SVR | 0.0374 | 0.0463 | 0.908 |
8 | | XGBoost | 0.0371 | 0.0473 | 0.904 |
9 | | BPNN | 0.0449 | 0.0517 | 0.885 |
Table 6 Comparison of prediction performance of various models