Research on Chalky Rice Detection Based on Visible Spectrogram and Deep Neural Network Technology
LIN Ping1, ZHANG Hua-zhe1, HE Jian-qiang1, ZOU Zhi-yong2, CHEN Yong-ming1*
1. College of Electrical Engineering, Yancheng Institute of Technology, Yancheng 224051, China
2. College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625014, China
Abstract: Traditional chalky-rice detection suffers from subjective randomness, poor repeatability, long detection time and low accuracy. To meet the demand for rapid and accurate measurement of rice quality parameters in modern agricultural production, a new method based on visible spectrograms combined with a deep learning algorithm is proposed. In the experiment, a CCD color camera was used to acquire visible spectral images of chalky rice and normal rice. Random image transformations, including rotation, flipping and contrast adjustment, were applied to augment the training data set and to prevent over-fitting of the deep detection model during learning. A seven-layer deep convolutional neural network, consisting of convolution layers, pooling layers, fully connected layers and input-output layers, was constructed. The network convolves and pools the visible spectral images of rice, and the characteristic parameters of the convolution layers are learned by iterative training. The non-linear ReLU activation function is used to accelerate convergence during the extraction of effective abstract features of rice; the pooling layers then yield discriminative semantic features that distinguish normal rice from chalky rice; finally, the data are passed to the fully connected layer, where classification identifies chalky rice accurately. Compared with traditional methods, chalkiness detection based on a convolutional neural network eliminates the complicated steps of manual feature extraction. Because the features learned by the convolutional network represent specific targets more robustly, the algorithm achieves higher accuracy with lower complexity and generalizes better than traditional methods based on visible spectrograms. The recognition accuracy reaches 90%.
By contrast, the recognition accuracies of SIFT+SVM, PHOG+SVM and GIST+SVM are 70.83%, 77.08% and 79.16%, respectively. The proposed method provides a theoretical basis and an effective technical means for automatic and accurate detection of rice quality in modern agricultural production, and is therefore of theoretical value and practical significance for realizing artificial-intelligence-based detection of rice quality.
Key words: Visible spectrogram; Rice; Chalkiness; Deep learning; Artificial intelligence
林 萍,张华哲,何坚强,邹志勇,陈永明. 可见光谱图与深度神经网络的垩白大米检测方法[J]. 光谱学与光谱分析, 2020, 40(01): 233-238.
LIN Ping, ZHANG Hua-zhe, HE Jian-qiang, ZOU Zhi-yong, CHEN Yong-ming. Research on Chalky Rice Detection Based on Visible Spectrogram and Deep Neural Network Technology. SPECTROSCOPY AND SPECTRAL ANALYSIS, 2020, 40(01): 233-238.
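The augmentation and feature-extraction pipeline described in the abstract (random rotation/flip/contrast, then convolution, ReLU activation and pooling) can be sketched as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation: the function names (`augment`, `conv2d_valid`, `max_pool`), the contrast range and the 32×32 image size are all hypothetical choices made for the example.

```python
import numpy as np

def augment(img, rng):
    """Random augmentation: 90-degree rotation, horizontal flip,
    and contrast adjustment (scaling deviations from the mean)."""
    img = np.rot90(img, k=int(rng.integers(0, 4)))
    if rng.random() < 0.5:
        img = np.fliplr(img)
    factor = rng.uniform(0.8, 1.2)          # hypothetical contrast range
    return np.clip((img - img.mean()) * factor + img.mean(), 0.0, 1.0)

def conv2d_valid(img, kernel):
    """Naive single-channel 'valid' convolution (cross-correlation form)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Non-linear ReLU activation."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling over size x size blocks."""
    h, w = x.shape[0] - x.shape[0] % size, x.shape[1] - x.shape[1] % size
    blocks = x[:h, :w].reshape(h // size, size, w // size, size)
    return blocks.max(axis=(1, 3))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((32, 32))              # stand-in for a rice-grain image
    aug = augment(img, rng)
    feat = max_pool(relu(conv2d_valid(aug, rng.standard_normal((3, 3)))))
    print(aug.shape, feat.shape)            # 32x32 input -> 15x15 feature map
```

In the paper's pipeline these feature maps would be flattened and fed to the fully connected layer for the normal/chalky classification; here only one conv-ReLU-pool stage is shown.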