Deep Learning Modelling and Model Transfer for Near-Infrared Spectroscopy Quantitative Analysis
FU Peng-you1, 2, WEN Yue2, ZHANG Yu-ke3, LI Ling-qiao1*, YANG Hui-hua1, 2*
1. School of Computer Science and Information Security, Guilin University of Electronic Technology, Guilin 541004, China
2. School of Artificial Intelligence, Beijing University of Posts and Telecommunications, Beijing 100876, China
3. School of International, Beijing University of Posts and Telecommunications, Beijing 100876, China
Abstract: Near-infrared spectroscopy analysis technology relies on chemometric methods that characterize the relationship between the spectral matrix and chemical or physical properties. However, sample spectra are composed of signal and various kinds of noise, so it is difficult for traditional chemometric methods to extract effective spectral features and establish a calibration model with strong generalization performance for a complex assay. Furthermore, a calibration model established on one instrument cannot achieve the same quantitative analysis results when applied to another because of differences between the instruments. Hence, this paper presents a quantitative analysis modeling and model transfer framework based on convolutional neural networks and transfer learning to improve model prediction performance both on one instrument and across instruments. An advanced model named MSRCNN is presented based on a convolutional neural network; it integrates multi-scale feature fusion with a residual structure and shows outstanding generalization performance on the master instrument. Four transfer learning methods based on fine-tuning are then proposed to transfer the MSRCNN established on the master instrument to the slave instrument. Experimental results on open-access drug and wheat datasets show that the RMSE and R2 of MSRCNN on the master instrument are 2.587, 0.981 and 0.309, 0.977, respectively, outperforming PLS, SVM, and CNN. Using 30 slave-instrument samples, transferring the convolutional and fully connected layers of the MSRCNN model is the most effective of the four fine-tuning methods, with RMSE and R2 of 2.289, 0.982 and 0.379, 0.965, respectively. Performance can be further improved by increasing the number of slave-instrument samples that participate in model transfer.
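As an illustration only (the paper's exact layer configuration is not reproduced here), a minimal PyTorch sketch of the two ideas named in the abstract, multi-scale feature fusion and a residual structure over 1-D spectra, might look like this; all layer sizes and names are assumptions:

```python
import torch
import torch.nn as nn


class MultiScaleResidualBlock(nn.Module):
    """Hypothetical block: parallel 1-D convolutions with different kernel
    sizes are concatenated (multi-scale feature fusion), fused back to the
    input width, and added to a shortcut branch (residual structure)."""

    def __init__(self, channels: int):
        super().__init__()
        # Three parallel branches with different receptive fields.
        self.branches = nn.ModuleList([
            nn.Conv1d(channels, channels, kernel_size=k, padding=k // 2)
            for k in (3, 5, 7)
        ])
        # 1x1 convolution fuses the concatenated multi-scale features back
        # to the input channel count so the residual addition is valid.
        self.fuse = nn.Conv1d(3 * channels, channels, kernel_size=1)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        multi_scale = torch.cat([b(x) for b in self.branches], dim=1)
        return self.act(self.fuse(multi_scale) + x)  # residual shortcut


class MSRCNNSketch(nn.Module):
    """Toy quantitative-analysis regressor: stem convolution, one
    multi-scale residual block, global pooling, fully connected head."""

    def __init__(self, out_dim: int = 1):
        super().__init__()
        self.stem = nn.Conv1d(1, 16, kernel_size=7, padding=3)
        self.block = MultiScaleResidualBlock(16)
        self.head = nn.Linear(16, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_wavelengths)
        h = self.block(torch.relu(self.stem(x)))
        return self.head(h.mean(dim=-1))  # global average pooling
```

The 1x1 fusion convolution keeps the residual addition shape-compatible regardless of how many parallel branches are used.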
Key words: Near-infrared spectroscopy; Deep learning; Transfer learning; Multi-scale fusion; Residual convolution network; Model transfer
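The fine-tuning transfer described in the abstract (re-train only selected layers of the master-instrument model on a handful of slave-instrument spectra) can be sketched as below. The layer names `stem`/`mid`/`head`, the choice of Adam, and the learning rate are illustrative assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn


def transfer_fine_tune(model: nn.Module, tune_prefixes=("stem", "head")):
    """Freeze every parameter, then unfreeze only the layers selected for
    transfer (e.g. convolutional stem and fully connected head), and return
    an optimizer over that trainable subset."""
    for name, param in model.named_parameters():
        param.requires_grad = any(name.startswith(p) for p in tune_prefixes)
    trainable = [p for p in model.parameters() if p.requires_grad]
    # Only the unfrozen subset is updated on the few slave-instrument samples.
    return torch.optim.Adam(trainable, lr=1e-4)


class TinyNet(nn.Module):
    """Stand-in for a master-instrument calibration model."""

    def __init__(self):
        super().__init__()
        self.stem = nn.Conv1d(1, 8, kernel_size=5, padding=2)
        self.mid = nn.Conv1d(8, 8, kernel_size=3, padding=1)
        self.head = nn.Linear(8, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.mid(torch.relu(self.stem(x))))
        return self.head(h.mean(dim=-1))


# Usage: keep the middle layer frozen, adapt stem and head to the slave data.
net = TinyNet()
optimizer = transfer_fine_tune(net, tune_prefixes=("stem", "head"))
```

Varying `tune_prefixes` reproduces the abstract's comparison of transferring different layer subsets, such as convolutional layers only versus convolutional plus fully connected layers.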
Funding: supported by the National Natural Science Foundation of China (62262010, 61906050), the Guangxi Technology R&D Program (2018AD11018), and the Innovation Project of GUET Graduate Education (2018YJCX44)
Biography: FU Peng-you (1998—), Master's degree candidate, School of Computer Science and Information Security, Guilin University of Electronic Technology; e-mail: 20032201004@mails.guet.edu.cn
Cite this article:
FU Peng-you, WEN Yue, ZHANG Yu-ke, LI Ling-qiao, YANG Hui-hua. Deep Learning Modelling and Model Transfer for Near-Infrared Spectroscopy Quantitative Analysis. SPECTROSCOPY AND SPECTRAL ANALYSIS, 2023, 43(01): 310-319.