Wood Species Classification With Microscopic Hyper-Spectral Imaging Based on I-BGLAM Texture and Spectral Fusion
ZHAO Peng1,2*, HAN Jin-cheng1, WANG Cheng-kun1
1. School of Information and Computer Engineering, Northeast Forestry University, Harbin 150040, China
2. School of Computer Science and Communication Engineering, Guangxi University of Science and Technology, Liuzhou 545006, China
Abstract: To improve the accuracy of wood species classification, this paper proposes a method based on the fusion of I-BGLAM (Improved Basic Gray Level Aura Matrix) texture features and spectral features. The experimental data are hyper-spectral images in the visible and near-infrared band (372.53–1 038.57 nm) acquired with an SOC710VP hyper-spectral imaging system. First, a feature band selection method based on the OIF (Optimum Index Factor) was used to reduce the dimensionality of the hyper-spectral images and select the bands carrying the most information. Second, the NSCT (Nonsubsampled Contourlet Transform) and its inverse transform were applied to the selected band images to obtain a fused image, from which I-BGLAM texture features were extracted. At the same time, the average spectrum over the full band range was computed and smoothed with an S-G (Savitzky-Golay) filter to obtain the spectral features. Finally, the texture features and spectral features were fused and fed into an ELM (Extreme Learning Machine) for classification. The proposed method was compared with the traditional wood identification method based on the GLCM (Gray Level Co-occurrence Matrix) and with mainstream wood species identification methods proposed in recent years. This paper makes two main contributions: one is the use of the strong texture extractor I-BGLAM to extract texture features from hyper-spectral images; the other is a new feature fusion model for hyper-spectral image classification. Experimental results on 8 tree species show that classification with I-BGLAM texture features reached an accuracy of 88.54%, while classification with GLCM texture features reached only 76.04%.
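The OIF band-selection step described above can be sketched as follows. This is a minimal illustration under the standard OIF definition (sum of band standard deviations divided by the sum of absolute pairwise correlations for a candidate band subset); the function and variable names are ours, not the authors', and the synthetic cube is a placeholder for real hyper-spectral data.

```python
import numpy as np
from itertools import combinations

def oif(cube, bands):
    """Optimum Index Factor for a candidate band subset.

    cube: (H, W, B) hyper-spectral image; bands: list of band indices.
    OIF = (sum of band standard deviations) / (sum of |pairwise correlations|).
    A higher OIF means more variance with less redundancy between bands.
    """
    X = cube[:, :, bands].reshape(-1, len(bands))   # pixels x bands
    stds = X.std(axis=0)
    corr = np.corrcoef(X, rowvar=False)
    iu = np.triu_indices(len(bands), k=1)           # unique band pairs
    return stds.sum() / np.abs(corr[iu]).sum()

# Exhaustively score all 3-band subsets of a small synthetic cube
cube = np.random.default_rng(1).random((20, 20, 6))
best = max(combinations(range(6), 3), key=lambda b: oif(cube, list(b)))
```

In practice the search would run over the 128 bands of the SOC710VP cube, typically with a restricted candidate set rather than full enumeration.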
These results show that I-BGLAM outperforms GLCM for texture feature extraction, which lays a good foundation for the fusion model established later. Classification with the average spectral features alone reached 92.71%, whereas the proposed feature fusion method reached up to 100%, demonstrating that the fusion model outperforms classification with either single feature type. In addition, the classification accuracy of the proposed method is higher than that of the other two mainstream recognition methods in this field. Therefore, the proposed method based on the fusion of I-BGLAM texture features and spectral features can improve the accuracy of wood species classification and has practical value for wood species identification.
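The final stage of the pipeline, S-G smoothing of the average spectra, concatenation with texture features, and ELM classification, can be sketched as below. The ELM follows its standard formulation (random hidden layer, Moore-Penrose least-squares output weights); the feature dimensions, sample counts, and random placeholder data are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden, n_classes=8):
    """Extreme Learning Machine: random hidden weights, analytic output layer."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden activations
    T = np.eye(n_classes)[y]                 # one-hot targets
    beta = np.linalg.pinv(H) @ T             # least-squares output weights
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return (H @ beta).argmax(axis=1)

# Placeholder inputs: 40 samples (8 species x 5), 101-band mean spectra,
# 16 texture features standing in for the I-BGLAM descriptor.
spectra = rng.random((40, 101))
smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)
texture = rng.random((40, 16))
features = np.hstack([smoothed, texture])    # simple concatenation fusion
labels = np.repeat(np.arange(8), 5)

model = elm_train(features, labels, n_hidden=60)
pred = elm_predict(features, model)
```

Here the fusion is plain concatenation; the paper's fusion model may weight or transform the two feature groups differently.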
Key words: Hyper-spectral imaging; I-BGLAM; Texture feature; Spectral feature; Feature fusion; Classification of wood species
ZHAO Peng, HAN Jin-cheng, WANG Cheng-kun. Wood Species Classification With Microscopic Hyper-Spectral Imaging Based on I-BGLAM Texture and Spectral Fusion. SPECTROSCOPY AND SPECTRAL ANALYSIS, 2021, 41(02): 599-605.