Journal of System Simulation, 2024, Vol. 36, Issue 7: 1573-1585. doi: 10.16182/j.issn1004731x.joss.22-1265
Guo Liqiang1, Ma Liang1, Zhang Hui1, Yang Jing1, Li Lianfeng2, Zhai Yaqi1
Received: 2022-10-24
Revised: 2022-11-22
Online: 2024-07-15
Published: 2024-07-12
Guo Liqiang, Ma Liang, Zhang Hui, Yang Jing, Li Lianfeng, Zhai Yaqi. Effective Position Intelligent Decision Method Based on Model Fusion and Generative Network[J]. Journal of System Simulation, 2024, 36(7): 1573-1585.
Table 3  Principle analysis of classification algorithms

Algorithm | Principle | Advantages | Disadvantages
---|---|---|---
LR | Maps the objective function value onto the Sigmoid function and classifies via maximum likelihood estimation | Good interpretability, fast training | Weak at handling nonlinear and imbalanced samples
KNN | Classifies samples by computing the distances between them | Simple model structure, low computational complexity[22] | Poor at handling high-dimensional features
SVM | Maps samples into a high-dimensional space via a kernel function and classifies with a hyperplane | Strong learning ability on high-dimensional, small-sample data | Accuracy depends heavily on the kernel function; computation is expensive
MLP | Artificial neural network built on a feedforward structure using backpropagation and activation functions | Handles nonlinear problems, strong robustness[23] | Strongly affected by discrete values, low interpretability
DT | Tree-structured model that classifies top-down along the direction of information gain | Classification rules are interpretable | Weak at handling missing data, prone to overfitting
RF | Ensemble model based on Bagging theory and CART trees[24] | Strong on high-dimensional data; tolerant to noise and outliers | Weak on small-sample and low-dimensional data, poor interpretability
AdaBoost | Implements Boosting by adaptively increasing the weights of misclassified samples | Simple structure, flexible to use, strong resistance to overfitting[25] | Sensitive to noise, long training time
GBDT | Optimizes residuals using the idea of gradient descent[26] | Flexibly handles many feature types | Hard to parallelize, high computational complexity
XGBoost | Extends GBDT with a second-order Taylor expansion of the loss function plus a regularization term | Supports parallel computation, built-in cross-validation, strong generalization | Difficult to tune, high memory consumption[27]
LightGBM | Uses histograms, gradient-based one-side sampling, and a depth-limited leaf-wise growth strategy[28] | High training efficiency, low memory consumption | Relatively sensitive to noise
CatBoost | Uses symmetric binary trees as base models with an ordered boosting method that prevents gradient estimation bias[29] | Handles categorical features, low tuning cost, high accuracy, fast prediction | Processing categorical features consumes substantial memory and time
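The trade-offs summarized in Table 3 can be checked empirically. The following is a minimal sketch, assuming scikit-learn is available; the synthetic imbalanced dataset and hyperparameters are illustrative stand-ins, not the paper's data or configuration. It scores a few of the listed baselines with the same five metrics used in Tables 4-6:

```python
# Illustrative comparison of baseline classifiers from Table 3
# on a synthetic imbalanced binary-classification problem.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, recall_score,
                             precision_score, f1_score, roc_auc_score)

X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "DT": DecisionTreeClassifier(max_depth=5, random_state=0),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    proba = clf.predict_proba(X_te)[:, 1]   # scores for AUC
    print(name,
          f"acc={accuracy_score(y_te, pred):.3f}",
          f"rec={recall_score(y_te, pred):.3f}",
          f"prec={precision_score(y_te, pred):.3f}",
          f"f1={f1_score(y_te, pred):.3f}",
          f"auc={roc_auc_score(y_te, proba):.3f}")
```

On imbalanced data like this, accuracy alone is misleading, which is why the tables also report recall, precision, F1, and AUC.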
Table 4  Classification model performance evaluation results

Model | Dataset | Accuracy | Recall | Precision | F1 | AUC
---|---|---|---|---|---|---
LR | Training | 0.823 | 0.500 | 0.651 | 0.566 | 0.859
LR | Test | 0.825 | 0.503 | 0.659 | 0.570 | 0.856
MLP | Training | 0.839 | 0.545 | 0.695 | 0.611 | 0.884
MLP | Test | 0.839 | 0.546 | 0.694 | 0.611 | 0.882
DT | Training | 0.816 | 0.536 | 0.686 | 0.490 | 0.831
DT | Test | 0.814 | 0.531 | 0.612 | 0.568 | 0.825
RF | Training | 0.828 | 0.626 | 0.629 | 0.627 | 0.872
RF | Test | 0.825 | 0.655 | 0.613 | 0.633 | 0.870
AdaBoost | Training | 0.836 | 0.537 | 0.685 | 0.602 | 0.880
AdaBoost | Test | 0.835 | 0.532 | 0.686 | 0.599 | 0.878
GBDT | Training | 0.841 | 0.528 | 0.771 | 0.606 | 0.877
GBDT | Test | 0.840 | 0.524 | 0.708 | 0.602 | 0.883
XGBoost | Training | 0.805 | 0.803 | 0.553 | 0.655 | 0.892
XGBoost | Test | 0.803 | 0.787 | 0.552 | 0.649 | 0.889
LightGBM | Training | 0.798 | 0.821 | 0.541 | 0.653 | 0.893
LightGBM | Test | 0.794 | 0.810 | 0.537 | 0.646 | 0.889
CatBoost | Training | 0.832 | 0.696 | 0.623 | 0.657 | 0.893
CatBoost | Test | 0.829 | 0.693 | 0.616 | 0.652 | 0.890
Table 5  Performance evaluation results of fusion models

Model | Dataset | Accuracy | Recall | Precision | F1 | AUC
---|---|---|---|---|---|---
Voting | Training | 0.824 | 0.736 | 0.596 | 0.658 | 0.892
Voting | Test | 0.821 | 0.732 | 0.591 | 0.654 | 0.888
Stacking | Training | 0.843 | 0.601 | 0.684 | 0.640 | 0.894
Stacking | Test | 0.840 | 0.595 | 0.674 | 0.632 | 0.891
Blending | Training | 0.845 | 0.598 | 0.690 | 0.641 | 0.894
Blending | Test | 0.837 | 0.579 | 0.672 | 0.622 | 0.888
gcForest | Training | 0.836 | 0.509 | 0.698 | 0.588 | 0.873
gcForest | Test | 0.837 | 0.515 | 0.701 | 0.594 | 0.874
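As background for the stacking fusion evaluated in Table 5, the technique can be sketched with scikit-learn's `StackingClassifier`: base learners produce out-of-fold predictions that train a meta-learner. The base and meta models below are illustrative choices, not necessarily the paper's configuration:

```python
# Illustrative stacking fusion: two tree ensembles as base learners,
# logistic regression as the meta-learner trained on out-of-fold predictions.
from sklearn.datasets import make_classification
from sklearn.ensemble import (RandomForestClassifier,
                              GradientBoostingClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=1500, n_features=20,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                ("gbdt", GradientBoostingClassifier(random_state=0))],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5)  # 5-fold out-of-fold predictions feed the meta-learner
stack.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, stack.predict_proba(X_te)[:, 1])
print(f"stacking AUC = {auc:.3f}")
```

The `cv` parameter is what distinguishes stacking from blending: stacking reuses the whole training set via cross-validated predictions, while blending holds out a fixed split for the meta-learner.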
Table 6  Performance evaluation results of the imbalanced classification architecture

Model | Dataset | Accuracy | Recall | Precision | F1 | AUC
---|---|---|---|---|---|---
Stacking | Training | 0.843 | 0.603 | 0.680 | 0.640 | 0.894
Stacking | Test | 0.839 | 0.603 | 0.674 | 0.631 | 0.891
SMOTE-Stacking | Training | 0.891 | 0.895 | 0.889 | 0.892 | 0.964
SMOTE-Stacking | Test | 0.830 | 0.666 | 0.625 | 0.645 | 0.886
CGAN-Stacking | Training | 0.877 | 0.827 | 0.860 | 0.843 | 0.952
CGAN-Stacking | Test | 0.831 | 0.638 | 0.651 | 0.644 | 0.891
ICGAN-Stacking | Training | 0.872 | 0.847 | 0.835 | 0.841 | 0.951
ICGAN-Stacking | Test | 0.833 | 0.644 | 0.678 | 0.646 | 0.909
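Table 6 compares oversampling strategies for the minority class. The core SMOTE step — synthesize new minority samples by interpolating between a minority point and one of its k nearest minority-class neighbors — can be sketched as follows. This is a hand-rolled illustration in NumPy, not the imbalanced-learn implementation typically used in practice:

```python
import numpy as np

def smote_sketch(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating
    each chosen point toward a random one of its k nearest
    minority-class neighbors (the core SMOTE step)."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-distance
    neighbors = np.argsort(d, axis=1)[:, :k]    # k nearest neighbors per point
    base = rng.integers(0, n, size=n_new)       # points to interpolate from
    nbr = neighbors[base, rng.integers(0, k, size=n_new)]
    lam = rng.random((n_new, 1))                # interpolation factors in [0, 1)
    return X_min[base] + lam * (X_min[nbr] - X_min[base])

X_min = np.random.default_rng(0).normal(size=(30, 4))
X_syn = smote_sketch(X_min, n_new=60, k=5, rng=1)
print(X_syn.shape)  # (60, 4)
```

Because SMOTE only interpolates along segments between existing minority points, it cannot create samples outside the minority region — which is the limitation the CGAN/ICGAN generators in Table 6 aim to address by learning the minority distribution itself.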
[1] Si Guangyu, Miao Yan, Li Guanfang. Underwater Tridimensional Attack-defense System Technology[J]. Command Control and Simulation, 2018, 40(1): 1-8.
[2] He Yuqing, Qin Tianyi, Wang Nan. Cross-domain Collaboration: New Trends in the Development and Application of Unmanned Systems Technology[J]. Unmanned Systems Technology, 2021, 4(4): 1-13.
[3] Wang Yalin, Yang Yiran, Wang Tong, et al. Summary of the Development of Unmanned Systems in 2019[J]. Unmanned Systems Technology, 2019, 2(6): 53-57.
[4] Li Lei, Wang Tong, Jiang Qi. Key Technology Development Trends of Unmanned Systems Viewed from Unmanned Systems Integrated Roadmap 2017-2042[J]. Unmanned Systems Technology, 2018, 1(4): 79-84.
[5] Zhou Guangxia, Zhou Fang. A Preliminary Study of the US Military's ALPHA AI Air Combat System[C]//Proceedings of the 6th China Command and Control Conference (Volume I). Beijing: Publishing House of Electronics Industry, 2018: 66-70.
[6] Wang Jianli, Zhang Weiyu. Statistics[M]. Beijing: Tsinghua University Press, 2010: 215-220.
[7] Zhou Zhihua. Machine Learning[M]. Beijing: Tsinghua University Press, 2016.
[8] Quinlan J R. Induction of Decision Trees[J]. Machine Learning, 1986, 1(1): 81-106.
[9] Olson R S, Moore J H. Identifying and Harnessing the Building Blocks of Machine Learning Pipelines for Sensible Initialization of a Data Science Automation Tool[M]//Riolo R, Worzel B, Goldman B, et al. Genetic Programming Theory and Practice XIV. Cham: Springer International Publishing, 2018: 211-223.
[10] Feng Guoshuang. Statistics in Plain Language[M]. Beijing: Publishing House of Electronics Industry, 2018.
[11] Peng Hanchuan, Long Fuhui, Ding C. Feature Selection Based on Mutual Information: Criteria of Max-dependency, Max-relevance, and Min-redundancy[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(8): 1226-1238.
[12] Cortes C, Vapnik V. Support-vector Networks[J]. Machine Learning, 1995, 20(3): 273-297.
[13] Narasimhamurthy A M. A Framework for the Analysis of Majority Voting[C]//Image Analysis. Berlin: Springer Berlin Heidelberg, 2003: 268-274.
[14] Wolpert D H. Stacked Generalization[J]. Neural Networks, 1992, 5(2): 241-259.
[15] Kuo C C J. Understanding Convolutional Neural Networks with a Mathematical Model[J]. Journal of Visual Communication and Image Representation, 2016, 41: 406-413.
[16] Xia Heng, Tang Jian, Qiao Junfei. Review of Deep Forest[J]. Journal of Beijing University of Technology, 2022, 48(2): 182-196.
[17] Goodfellow I, Pouget-Abadie J, Mirza M, et al. Generative Adversarial Networks[J]. Communications of the ACM, 2020, 63(11): 139-144.
[18] Kingma D P, Welling M. Auto-encoding Variational Bayes[C]//ICLR 2014. New York, USA: ICLR, 2014.
[19] Radford A, Metz L, Chintala S. Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks[EB/OL]. (2016-01-07)[2022-07-08].
[20] Dumoulin V, Visin F. A Guide to Convolution Arithmetic for Deep Learning[EB/OL]. (2018-01-11)[2022-07-22].
[21] Mirza M, Osindero S. Conditional Generative Adversarial Nets[EB/OL]. (2014-11-06)[2022-07-15].
[22] Cover T M, Hart P E. Nearest Neighbor Pattern Classification[J]. IEEE Transactions on Information Theory, 1967, 13(1): 21-27.
[23] Goodfellow I, Bengio Y, Courville A. Deep Learning[M]. Cambridge: MIT Press, 2016: 106-140.
[24] Breiman L. Random Forests[J]. Machine Learning, 2001, 45(1): 5-32.
[25] Freund Y, Schapire R E. A Decision-theoretic Generalization of On-line Learning and an Application to Boosting[J]. Journal of Computer and System Sciences, 1997, 55(1): 119-139.
[26] Friedman J H. Greedy Function Approximation: A Gradient Boosting Machine[J]. Annals of Statistics, 2001, 29(5): 1189-1232.
[27] Chen Tianqi, Guestrin C. XGBoost: A Scalable Tree Boosting System[C]//Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: Association for Computing Machinery, 2016: 785-794.
[28] Ke Guolin, Meng Qi, Finley T, et al. LightGBM: A Highly Efficient Gradient Boosting Decision Tree[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY, USA: Curran Associates Inc., 2017: 3149-3157.
[29] Prokhorenkova L, Gusev G, Vorobev A, et al. CatBoost: Unbiased Boosting with Categorical Features[C]//Proceedings of the 32nd International Conference on Neural Information Processing Systems. Red Hook, NY, USA: Curran Associates Inc., 2018: 6639-6649.
[30] Akiba T, Sano S, Yanase T, et al. Optuna: A Next-generation Hyperparameter Optimization Framework[C]//Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. New York, NY, USA: Association for Computing Machinery, 2019: 2623-2631.
[31] Ozaki Y, Tanigaki Y, Watanabe S, et al. Multiobjective Tree-structured Parzen Estimator for Computationally Expensive Optimization Problems[C]//GECCO 2020: Proceedings of the 2020 Genetic and Evolutionary Computation Conference. New York, NY, USA: Association for Computing Machinery, 2020: 533-541.
[32] Simonyan K, Zisserman A. Very Deep Convolutional Networks for Large-scale Image Recognition[EB/OL]. (2015-04-10)[2022-08-03].
[33] Gu Jiuxiang, Wang Zhenhua, Kuen J, et al. Recent Advances in Convolutional Neural Networks[J]. Pattern Recognition, 2018, 77: 354-377.
[34] Ioffe S, Szegedy C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift[C]//Proceedings of the 32nd International Conference on Machine Learning, Volume 37. Cambridge: JMLR, 2015: 448-456.
[35] Santurkar S, Tsipras D, Ilyas A, et al. How Does Batch Normalization Help Optimization?[C]//Proceedings of the 32nd International Conference on Neural Information Processing Systems. Red Hook, NY, USA: Curran Associates Inc., 2018: 2488-2498.
[36] Kingma D P, Ba J L. Adam: A Method for Stochastic Optimization[EB/OL]. (2017-01-30)[2022-07-29].
[37] Shi Hongbo, Chen Yuwen, Chen Xin. Summary of Research on SMOTE Oversampling and Its Improved Algorithms[J]. CAAI Transactions on Intelligent Systems, 2019, 14(6): 1073-1083.