Detection of Wheat Single Seed Vigor Using Hyperspectral Imaging and Spectrum Fusion Strategy
SHI Rui1, 2, ZHANG Han2, WANG Cheng1, 2, KANG Kai2, LUO Bin1, 2*
1. College of Agricultural Engineering, Jiangsu University, Zhenjiang 212000, China
2. Research Center of Intelligent Equipment, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
Abstract Wheat is a primary staple crop in China and is pivotal in the nation's economic development. Seeds form the foundation of all agricultural activities, with seed vigor being one of the most crucial evaluation indicators. Seeds with high vigor exhibit superior field performance and storage resilience. Thus, accurately identifying the vigor of wheat seeds is paramount to China's agricultural production. Traditional seed vigor detection techniques are time-consuming, demand expertise, and can irreversibly damage the seeds. Previous attempts to detect seed vigor using hyperspectral imaging technology typically focused on batch testing of seeds, utilizing either image data or spectral data, but rarely combining both for single seed vigor detection. This study explores the potential of hyperspectral imaging technology for rapid, non-destructive detection of individual wheat seeds. A total of 210 manually aged wheat seeds (105 viable, 105 non-viable) were studied. Hyperspectral data within the seeds' 400~1 050 nm band were collected, followed by a standard germination test to ensure a one-to-one correspondence between the hyperspectral data and germination results. The dataset was divided into training, testing, and real datasets in a 4∶2∶1 ratio. The Competitive Adaptive Reweighted Sampling (CARS) algorithm was employed to select feature bands, resulting in 30 feature bands corresponding to seed nutrients such as proteins, starch, and lipids that influence seed vigor. To identify the optimal classification model, prediction models for wheat seed vigor were established using support vector machine (SVM), k-nearest neighbor (KNN), one-dimensional convolutional neural network (1DCNN), and improved ECA-CNN machine learning algorithms, based on both full-band and feature-band spectral data from the training and testing sets. The results indicated that models built using feature-band data outperformed those using full-band data. The ECA-CNN model, constructed with feature-band data, exhibited the best performance, achieving an overall accuracy of 99.17% for the training set and 80% for the testing set. The overall-method and pixel-method classification strategies were then compared on the real dataset, so that the comparison would not be influenced by the modeling process. The findings revealed that the pixel method surpassed the overall method in detection efficacy, with an overall accuracy of 86.67%, a precision of 92.31%, and a recall of 80%. This research offers theoretical support for the rapid, non-destructive detection of individual wheat seed vigor.
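The channel-attention idea behind the improved ECA-CNN can be illustrated with a minimal NumPy sketch. The function name, tensor shapes, and the fixed averaging kernel below are illustrative assumptions only; the abstract does not specify the network architecture, and in an actual ECA module the cross-channel convolution weights are learned during training.

```python
import numpy as np

def eca_attention(x, k=3):
    """Efficient Channel Attention (ECA) over a batch of 1-D feature maps.

    x: array of shape (batch, channels, length), e.g. features extracted
    from single-seed spectra. Returns the input rescaled channel-wise.
    """
    # 1. Global average pooling over the spectral/length axis
    w = x.mean(axis=2)                          # (batch, channels)
    # 2. Lightweight 1-D convolution across channels (kernel size k),
    #    so each channel's weight depends only on its k neighbours.
    #    A fixed averaging kernel stands in for the learned weights here.
    pad = k // 2
    wp = np.pad(w, ((0, 0), (pad, pad)), mode="edge")
    kernel = np.full(k, 1.0 / k)
    conv = np.stack([np.convolve(row, kernel, mode="valid") for row in wp])
    # 3. Sigmoid gate, then rescale each channel of the input
    gate = 1.0 / (1.0 + np.exp(-conv))          # (batch, channels)
    return x * gate[:, :, None]
```

Because the attention weights come from a small convolution over pooled channel statistics rather than fully connected layers, ECA adds very few parameters to a base 1DCNN, which is attractive for small spectral datasets like 210 seeds.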
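The difference between the two classification strategies compared on the real dataset can be sketched as follows. The threshold classifier and the synthetic spectra are toy stand-ins, not the paper's trained models; the sketch only shows the aggregation logic: the overall method classifies one mean spectrum per seed, while the pixel method classifies every pixel spectrum and takes a majority vote.

```python
import numpy as np

def overall_method(seed_pixels, classify):
    """Overall method: average all pixel spectra of a seed, classify once."""
    return classify(seed_pixels.mean(axis=0))

def pixel_method(seed_pixels, classify):
    """Pixel method: classify every pixel spectrum, then majority-vote."""
    votes = [classify(spec) for spec in seed_pixels]
    return int(sum(votes) * 2 >= len(votes))  # 1 = viable, 0 = non-viable

# Toy stand-in for a trained model: mean reflectance above a threshold.
classify = lambda spectrum: int(spectrum.mean() > 0.5)

# Three single-band pixel spectra from one hypothetical seed.
seed = np.array([[0.6], [0.6], [0.2]])
```

With these toy values the two strategies disagree: the averaged spectrum falls below the threshold, so the overall method returns 0, while two of the three pixels vote viable, so the pixel method returns 1. This kind of case is where per-pixel voting can outperform averaging, since one atypical region of a seed no longer dominates the decision.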
Received: 2023-09-08
Accepted: 2024-03-01
Corresponding Authors:
LUO Bin
E-mail: luob@nercita.org.cn