Identification Method of Pollen Typhae Processed Products Based on Convolutional Neural Network and Voting Mechanism
CHEN Cheng-wu1, WANG Tian-shu1*, HU Kong-fa1, BAO Bei-hua2, YAN Hui2, YANG Xi-chen3
1. College of Artificial Intelligence and Information Technology, Nanjing University of Chinese Medicine, Nanjing 210029, China
2. College of Pharmacy, Nanjing University of Chinese Medicine, Nanjing 210029, China
3. School of Computer and Electronic Information/School of Artificial Intelligence, Nanjing Normal University, Nanjing 210023, China
Abstract: Carbonized Typhae Pollen (CTP) is a processed product of Pollen Typhae. It has hemostatic, stasis-removing and stranguria-relieving effects and is widely used clinically for anti-thrombosis and the treatment of wounds and bleeding. However, during processing the carbonization is often too light or too heavy, so CTP occurs in different degrees of carbonization, mainly light, standard and heavy. The coagulation effect of CTP varies with the degree of carbonization, and standard CTP has the best effect. At present, the identification of CTP relies mainly on visual inspection and experience; this manual method is inefficient, subjective and unstable, which makes standard CTP difficult to distinguish. Therefore, to identify CTP with different degrees of carbonization effectively, a near-infrared (NIR) identification method based on a Convolutional Neural Network (CNN) and a voting mechanism is proposed. The method innovatively combines deep learning and machine learning algorithms, exploits the powerful feature-extraction ability of the CNN, and applies voting decisions to improve the generalization ability and robustness of the prediction model. First, the NIR spectra of CTP are acquired. The spectra are then treated with four different pre-processing methods, and the CNN extracts high-order features from each pre-processed spectrum to produce four prediction results. Weights are allocated to the four pre-processing branches according to their accuracy and loss to obtain the prediction model. Finally, the model combines the four prediction results with these weights to identify CTP with different degrees of carbonization. The experimental results show that the proposed method can effectively distinguish CTP with different degrees of carbonization. When the training set accounts for 80% of the samples, the test accuracy reaches 95.4%. Compared with CNN, Linear Discriminant Analysis (LDA) and Standard Normal Variate (SNV)-LDA, the proposed method improves the prediction accuracy by 8.6%, 4.3% and 2.6%, respectively. The method is also robust: when the training set accounts for more than 70% of the samples, the test accuracy is above 90%, and even when the training set accounts for only 10%, the prediction accuracy still reaches about 80%.
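To make the voting step concrete, the fusion of the four pre-processing branches can be sketched as follows. This is a minimal NumPy sketch, not the authors' implementation: the abstract only states that weights are allocated according to accuracy and loss, so the exact weighting formula (here, accuracy divided by loss, normalized) and all numeric values are illustrative assumptions, and the function names are hypothetical.

```python
import numpy as np

def branch_weights(accuracies, losses, eps=1e-8):
    """Assign a weight to each pre-processing branch from its
    validation accuracy and loss: higher accuracy and lower loss
    give a larger weight (one plausible choice, not the paper's
    exact formula)."""
    scores = np.asarray(accuracies) / (np.asarray(losses) + eps)
    return scores / scores.sum()  # normalize so the weights sum to 1

def weighted_vote(branch_probs, weights):
    """Fuse the per-branch class-probability vectors
    (shape: n_branches x n_classes) by a weighted sum and
    return the index of the winning class."""
    fused = np.average(branch_probs, axis=0, weights=weights)
    return int(np.argmax(fused))

# Hypothetical example: four pre-processing branches, three classes
# (light / standard / heavy carbonization).
acc = [0.91, 0.93, 0.95, 0.90]   # assumed validation accuracies per branch
loss = [0.30, 0.25, 0.20, 0.35]  # assumed validation losses per branch
w = branch_weights(acc, loss)

branch_probs = np.array([        # assumed CNN softmax outputs per branch
    [0.20, 0.70, 0.10],
    [0.10, 0.80, 0.10],
    [0.15, 0.75, 0.10],
    [0.30, 0.50, 0.20],
])
print(weighted_vote(branch_probs, w))  # -> 1 (standard carbonization)
```

In this toy run, the branch with the best accuracy-to-loss ratio contributes most to the fused probability vector, which is the intuition behind weighting the vote rather than averaging the four predictions equally.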
CHEN Cheng-wu, WANG Tian-shu, HU Kong-fa, BAO Bei-hua, YAN Hui, YANG Xi-chen. Identification Method of Pollen Typhae Processed Products Based on Convolutional Neural Network and Voting Mechanism. SPECTROSCOPY AND SPECTRAL ANALYSIS, 2022, 42(11): 3361-3367.