Shaoji WANG1,2, Kuangrong HAO1,2, Lei CHEN1,2
Received: 2024-06-20
Revised: 2024-07-27
Online: 2024-08-13
Corresponding author: Kuangrong HAO
First author: WANG Shaoji (2000—), male, master's student, 2221995@mail.dhu.edu.cn
Abstract:
Precise control of the esterification temperature is critical in polyester fiber production. However, differences in production equipment and process parameters make it difficult for conventional forecasting methods to meet plant-specific needs, and data sharing raises privacy-leakage and communication-pressure concerns. A personalized adaptive federated-learning algorithm for esterification temperature time-series forecasting is therefore proposed. A joint prediction mechanism assigns each client a private model and a shared model; an adaptive stage adjusts model parameters according to each client's data distribution and learns a client-specific joint prediction weight, yielding personalized forecasts. Bayesian optimization is adopted to cope with the high-dimensional search space under limited resources and to obtain the best hyperparameter combination quickly and efficiently. Extensive experiments on real datasets from three polyester fiber manufacturers show that the algorithm achieves the best forecasting performance under all five prediction models, effectively improving the accuracy of esterification temperature forecasting.
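The joint prediction mechanism in the abstract can be sketched as a convex combination of a private model and a shared (federated) model through a client-specific learned weight. The function and model names below are illustrative assumptions, not the paper's actual implementation:

```python
# Minimal sketch of per-client joint prediction: each client blends its
# private model with the federated shared model using a learned weight alpha.
# Names and the toy "models" are illustrative assumptions.

def joint_predict(x, private_model, shared_model, alpha):
    """Convex combination of private and shared model outputs."""
    return alpha * private_model(x) + (1.0 - alpha) * shared_model(x)

# Toy example: two linear stand-in "models" and a client-specific weight.
private = lambda x: 2.0 * x   # captures local plant behaviour
shared = lambda x: 1.5 * x    # captures cross-plant knowledge
y = joint_predict(10.0, private, shared, alpha=0.6)
```

In the paper's scheme the weight is learned per client during the adaptive stage, so each plant's forecast leans toward whichever model fits its own data distribution better.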
CLC number:
Shaoji WANG, Kuangrong HAO, Lei CHEN. Research on temperature forecasting of polyester fiber esterification process based on federated learning[J]. CIESC Journal, DOI: 10.11949/0438-1157.20240689.
Table 1 Details of the PE dataset

| Plant | Start time | End time | Total samples | Training samples | Validation samples | Test samples | Variable dimension |
|---|---|---|---|---|---|---|---|
| Plant A | 2012-03-01 00:00:00 | 2012-04-22 01:50:00 | 14999 | 10499 | 1499 | 3001 | 20 |
| Plant B | 2016-11-07 00:00:00 | 2016-11-20 21:19:00 | 20000 | 14000 | 2000 | 4000 | 20 |
| Plant C | 2022-02-01 00:05:00 | 2022-02-01 20:32:00 | 2000 | 1400 | 200 | 400 | 20 |
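The sample counts in Table 1 are consistent with a 70%/10%/20% train/validation/test split where the train and validation sizes are truncated and the test set takes the remainder. This is an inference from the table's numbers, not a procedure stated in the paper:

```python
# Reproduce the Table 1 split sizes with integer arithmetic
# (assumed 70/10/20 split, test set takes the remainder).

def split_sizes(n_total):
    n_train = n_total * 7 // 10   # 70% training, truncated
    n_val = n_total // 10         # 10% validation, truncated
    n_test = n_total - n_train - n_val
    return n_train, n_val, n_test

print(split_sizes(14999))  # Plant A
print(split_sizes(20000))  # Plant B
print(split_sizes(2000))   # Plant C
```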
Table 2 Details of the model architecture and parameters

| Model | Layer | Parameter shape | Total params |
|---|---|---|---|
| GRU | GRU layer | GRU(20, 128) | 57729 |
| | Fully connected layer | Linear(128, 1) | |
| LSTM | LSTM layer | LSTM(20, 128) | 76929 |
| | Fully connected layer | Linear(128, 1) | |
| RNN | RNN layer | RNN(20, 128) | 19329 |
| | Fully connected layer | Linear(128, 1) | |
| MLP | Fully connected layer 1 | Linear(240, 128) | 39169 |
| | Fully connected layer 2 | Linear(128, 64) | |
| | Fully connected layer 3 | Linear(64, 1) | |
| CNN | Convolution layer 1 | Conv1d(12, 16, kernel_size=2, stride=1) | 9265 |
| | Convolution layer 2 | Conv1d(16, 32, kernel_size=1, stride=1) | |
| | Pooling layer | MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1) | |
| | Fully connected layer 1 | Linear(128, 64) | |
| | Fully connected layer 2 | Linear(64, 1) | |
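The "total params" column in Table 2 can be cross-checked against the standard PyTorch parameter-count formulas (weights plus biases; PyTorch recurrent layers carry two bias vectors, b_ih and b_hh, per gate):

```python
# Verify the Table 2 parameter counts from first principles.

def rnn_params(n_in, n_hid, gates):
    # gates: 1 for vanilla RNN, 3 for GRU, 4 for LSTM
    # per gate: input weights + recurrent weights + two bias vectors
    return gates * (n_in * n_hid + n_hid * n_hid + 2 * n_hid)

def linear_params(n_in, n_out):
    return n_in * n_out + n_out

def conv1d_params(c_in, c_out, k):
    return c_in * c_out * k + c_out

gru = rnn_params(20, 128, 3) + linear_params(128, 1)
lstm = rnn_params(20, 128, 4) + linear_params(128, 1)
rnn = rnn_params(20, 128, 1) + linear_params(128, 1)
mlp = (linear_params(240, 128) + linear_params(128, 64)
       + linear_params(64, 1))
cnn = (conv1d_params(12, 16, 2) + conv1d_params(16, 32, 1)
       + linear_params(128, 64) + linear_params(64, 1))
```

All five totals reproduce the table: 57729 (GRU), 76929 (LSTM), 19329 (RNN), 39169 (MLP) and 9265 (CNN); pooling layers contribute no parameters.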
Table 3 Special hyperparameters and search ranges of algorithms

| Algorithm | Special hyperparameter search range |
|---|---|
| Localised | - |
| FedAvg | - |
| FedAvg_FT | - |
| APFL | - |
| APPLE | - |
| FedProx | |
| FedRep | |
| FML | |
| FedPatsf | |
Table 4 Common hyperparameters and search ranges for algorithms

| Common hyperparameter | Search range |
|---|---|
| Communication rounds | [ |
| Learning rate | [0.001, 0.01] |
| Momentum | [0.5, 0.9] |
| Batch size | [ |
| Window size | [ |
| Number of hidden layers | [ |
| Hidden layer size | [ |
| Local training epochs | [ |
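The paper tunes these hyperparameters with Bayesian optimization. As a simplified, library-free stand-in, the sketch below runs a plain random search over the two ranges that survive in Table 4 (learning rate and momentum) against a toy objective; the objective function is purely illustrative, not the paper's validation loss:

```python
# Random-search stand-in for hyperparameter tuning over the Table 4 ranges.
# A real run would replace toy_objective with training + validation MSE,
# and the paper uses Bayesian optimization rather than random search.
import random

random.seed(0)

def sample_config():
    return {
        "lr": random.uniform(0.001, 0.01),    # learning-rate range, Table 4
        "momentum": random.uniform(0.5, 0.9), # momentum range, Table 4
    }

def toy_objective(cfg):
    # illustrative stand-in for validation error
    return (cfg["lr"] - 0.005) ** 2 + (cfg["momentum"] - 0.8) ** 2

best = min((sample_config() for _ in range(50)), key=toy_objective)
```

Bayesian optimization improves on this by fitting a surrogate model to past evaluations and sampling where the surrogate predicts improvement, which matters when each evaluation requires a full federated training run.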
Table 5 GRU model forecasting comparison of different algorithms on PE dataset

| Algorithm | MSE | RMSE | MAE |
|---|---|---|---|
| Localised | 0.0029 | 0.0400 | 0.0283 |
| FedAvg | 0.0032 | 0.0407 | 0.0291 |
| FedAvg_FT | 0.0036 | 0.0435 | 0.0318 |
| FedProx | 0.0099 | 0.0799 | 0.0676 |
| FedRep | 0.0037 | 0.0484 | 0.0402 |
| FML | 0.0029 | 0.0391 | 0.0283 |
| APFL | 0.0034 | 0.0420 | 0.0305 |
| APPLE | 0.0036 | 0.0418 | 0.0306 |
| Ours | 0.0026 | 0.0359 | 0.0262 |
Table 6 LSTM model forecasting comparison of different algorithms on PE dataset

| Algorithm | MSE | RMSE | MAE |
|---|---|---|---|
| Localised | 0.0041 | 0.0466 | 0.0364 |
| FedAvg | 0.0029 | 0.0373 | 0.0276 |
| FedAvg_FT | 0.0029 | 0.0399 | 0.0301 |
| FedProx | 0.0072 | 0.0726 | 0.0615 |
| FedRep | 0.0035 | 0.0443 | 0.0338 |
| FML | 0.0035 | 0.0439 | 0.0326 |
| APFL | 0.0029 | 0.0399 | 0.0301 |
| APPLE | 0.0041 | 0.0498 | 0.0371 |
| Ours | 0.0027 | 0.0352 | 0.0257 |
Table 7 RNN model forecasting comparison of different algorithms on PE dataset

| Algorithm | MSE | RMSE | MAE |
|---|---|---|---|
| Localised | 0.0027 | 0.0359 | 0.0245 |
| FedAvg | 0.0024 | 0.0353 | 0.0241 |
| FedAvg_FT | 0.0027 | 0.0383 | 0.0278 |
| FedProx | 0.0070 | 0.0727 | 0.0610 |
| FedRep | 0.0031 | 0.0416 | 0.0320 |
| FML | 0.0027 | 0.0385 | 0.0275 |
| APFL | 0.0026 | 0.0374 | 0.0260 |
| APPLE | 0.0028 | 0.0378 | 0.0267 |
| Ours | 0.0024 | 0.0348 | 0.0235 |
Table 8 MLP model forecasting comparison of different algorithms on PE dataset

| Algorithm | MSE | RMSE | MAE |
|---|---|---|---|
| Localised | 0.014 | 0.0867 | 0.0714 |
| FedAvg | 0.0058 | 0.0572 | 0.0445 |
| FedAvg_FT | 0.0045 | 0.0533 | 0.0421 |
| FedProx | 0.015 | 0.0928 | 0.0835 |
| FedRep | 0.0069 | 0.0652 | 0.0551 |
| FML | 0.0085 | 0.0729 | 0.0632 |
| APFL | 0.0045 | 0.0561 | 0.0481 |
| APPLE | 0.0067 | 0.0665 | 0.0542 |
| Ours | 0.0042 | 0.0487 | 0.0388 |
Table 9 CNN model forecasting comparison of different algorithms on PE dataset

| Algorithm | MSE | RMSE | MAE |
|---|---|---|---|
| Localised | 0.032 | 0.168 | 0.151 |
| FedAvg | 0.031 | 0.154 | 0.133 |
| FedAvg_FT | 0.034 | 0.176 | 0.155 |
| FedProx | 0.026 | 0.144 | 0.126 |
| FedRep | 0.032 | 0.163 | 0.144 |
| FML | 0.039 | 0.192 | 0.170 |
| APFL | 0.035 | 0.179 | 0.157 |
| APPLE | 0.036 | 0.169 | 0.151 |
| Ours | 0.024 | 0.142 | 0.120 |
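The three metrics reported in Tables 5 to 9 are standard regression errors; they can be computed from scratch as follows (the temperature readings in the example are illustrative, not data from the paper):

```python
# MSE, RMSE and MAE as used in the forecasting comparison tables.
import math

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))

def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative esterification-temperature readings and forecasts.
y_true = [280.1, 280.4, 280.2]
y_pred = [280.0, 280.5, 280.2]
print(mse(y_true, y_pred), rmse(y_true, y_pred), mae(y_true, y_pred))
```

Note that RMSE is just the square root of MSE, so the two always rank algorithms identically; MAE can rank differently because it penalizes large errors less heavily.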
Table 10 Contributions of variables to esterification temperature forecasting

| Variable | Contribution |
|---|---|
| Esterification reactor temperature regulation | 1.0 |
| Vapor-phase pipeline temperature | 0.25 |
| Separation column top temperature | 0.09 |
| Esterification separated-water reflux flow control | 0.06 |
| Esterification reactor liquid level | 0.05 |
| Hot-water circulation pump frequency regulation | 0.04 |
| Water reflux pump outlet pressure | 0.03 |
Table 11 The FedPatsf algorithm ablation experiment

| Ablation variant | MSE | RMSE | MAE |
|---|---|---|---|
| FedPatsf_Tune | 0.0026 | 0.0358 | 0.0246 |
| FedPatsf_Lw | 0.0025 | 0.0355 | 0.0244 |
| FedPatsf_Joint | 0.0028 | 0.0369 | 0.0260 |
| FedPatsf | 0.0024 | 0.0348 | 0.0235 |
1. Mahmoud A, Mohammed A. A survey on deep learning for time-series forecasting[M]//Hassanien A E, Darwish A. Machine Learning and Big Data Analytics Paradigms: Analysis, Applications and Challenges. Cham: Springer, 2021: 365-392.
2. Schmidhuber J. Deep learning in neural networks: an overview[J]. Neural Networks, 2015, 61: 85-117.
3. Liu L, Jiang P, Wang W, et al. Coupling process simulation and random forest model for analyzing and predicting biomass-to-hydrogen conversion[J]. CIESC Journal, 2022, 73(11): 5230-5239.
4. Ma X W, Mu X S, Long Z, et al. Prediction of heat transfer coefficient of horizontal tube falling film evaporation based on GA-BP neural network[J]. CIESC Journal, 2023, 74(12): 4840-4851.
5. Fang H F, Liu Y Y, Zhang W B. Biomass moisture content prediction in fluidized bed dryer based on LSTM neural network[J]. CIESC Journal, 2020, 71(S1): 307-314.
6. Yuan Z, Ling Y Q, Yang Z, et al. Critical parameters prediction based on TA-ConvBiLSTM for chemical process[J]. CIESC Journal, 2022, 73(1): 342-351.
7. Chiranjeevi M, Karlamangal S, Moger T, et al. Solar irradiation prediction hybrid framework using regularized convolutional BiLSTM-based autoencoder approach[J]. IEEE Access, 2023, 11: 131362-131375.
8. Yang Y X, Gao P, Sun Z T, et al. Multistep ahead prediction of temperature and humidity in solar greenhouse based on FAM-LSTM model[J]. Computers and Electronics in Agriculture, 2023, 213: 108261.
9. Zaboli A, Tuyet-Doan V N, Kim Y H, et al. An LSTM-SAE-based behind-the-meter load forecasting method[J]. IEEE Access, 2023, 11: 49378-49392.
10. Semmelmann L, Henni S, Weinhardt C. Load forecasting for energy communities: a novel LSTM-XGBoost hybrid model based on smart meter data[J]. Energy Informatics, 2022, 5(1): 24.
11. Akilan T, Baalamurugan K M. Automated weather forecasting and field monitoring using GRU-CNN model along with IoT to support precision agriculture[J]. Expert Systems with Applications, 2024, 249: 123468.
12. Zubair M, Salim M S, Rahman M M, et al. Agricultural recommendation system based on deep learning: a multivariate weather forecasting approach[EB/OL]. 2024: 2401.11410.
13. Gajamannage K, Park Y, Jayathilake D I. Real-time forecasting of time series in financial markets using sequentially trained dual-LSTMs[J]. Expert Systems with Applications, 2023, 223: 119879.
14. Lazcano A, Herrera P J, Monge M. A combined model based on recurrent neural networks and graph convolutional networks for financial time series forecasting[J]. Mathematics, 2023, 11(1): 224.
15. Shi J T, Gao J C, Chen L, et al. Online prediction based on the Copula function for the intrinsic viscosity of polyester fibre[J]. The Canadian Journal of Chemical Engineering, 2023, 101(3): 1440-1454.
16. Yin Z Q, Hao K R, Chen L, et al. Forecasting the intrinsic viscosity of polyester based on improved extreme learning machine[C]//2019 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA). Dalian, China: IEEE, 2019: 241-246.
17. Peng H Y, Chen L, Hao K R. Deep transfer model with source domain segmentation for polyester esterification processes[C]//2022 34th Chinese Control and Decision Conference (CCDC). Hefei, China: IEEE, 2022: 293-298.
18. Goldsteen A, Ezov G, Shmelkin R, et al. Data minimization for GDPR compliance in machine learning models[J]. AI and Ethics, 2022, 2(3): 477-491.
19. Li Y, Wang R N, Li Y Z, et al. Wind power forecasting considering data privacy protection: a federated deep reinforcement learning approach[J]. Applied Energy, 2023, 329: 120291.
20. McMahan H B, Moore E, Ramage D, et al. Communication-efficient learning of deep networks from decentralized data[EB/OL]. 2016: 1602.05629.
21. Zhang W S, Chen X, He K, et al. Semi-asynchronous personalized federated learning for short-term photovoltaic power forecasting[J]. Digital Communications and Networks, 2023, 9(5): 1221-1229.
22. He Y, Luo F J, Sun M Y, et al. Privacy-preserving and hierarchically federated framework for short-term residential load forecasting[J]. IEEE Transactions on Smart Grid, 2023, 14(6): 4409-4423.
23. Perifanis V, Pavlidis N, Koutsiamanis R A, et al. Federated learning for 5G base station traffic forecasting[J]. Computer Networks, 2023, 235: 109950.
24. Thwal C M, Tun Y L, Kim K, et al. Transformers with attentive federated aggregation for time series stock forecasting[C]//2023 International Conference on Information Networking (ICOIN). Bangkok, Thailand: IEEE, 2023: 499-504.
25. Zhu H Y, Xu J J, Liu S Q, et al. Federated learning on non-IID data: a survey[J]. Neurocomputing, 2021, 465: 371-390.
26. Yu T, Bagdasaryan E, Shmatikov V. Salvaging federated learning by local adaptation[EB/OL]. 2020.
27. Li T, Sahu A K, Zaheer M, et al. Federated optimization in heterogeneous networks[J]. Proceedings of Machine Learning and Systems, 2020, 2: 429-450.
28. Collins L, Hassani H, Mokhtari A, et al. Exploiting shared representations for personalized federated learning[C]//Proceedings of the 38th International Conference on Machine Learning. PMLR, 2021, 139: 2089-2099.
29. Deng Y Y, Kamani M M, Mahdavi M. Adaptive personalized federated learning[EB/OL]. 2020.
30. Shen T, Zhang J, Jia X K, et al. Federated mutual learning[EB/OL]. 2020: 2006.16765.
31. Luo J, Wu S D. Adapt to adaptation: learning personalization for cross-silo federated learning[C]//Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI). 2022: 2166-2173.
32. Guyon I, Weston J, Barnhill S, et al. Gene selection for cancer classification using support vector machines[J]. Machine Learning, 2002, 46(1): 389-422.