Estimation of Potato Plant Nitrogen Content Using UAV Multi-Source Sensor Information
FAN Yi-guang1, 3, 5, FENG Hai-kuan1, 2, 3*, LIU Yang1, 3, 4, BIAN Ming-bo1, 3, ZHAO Yu1, 3, YANG Gui-jun1, 3, QIAN Jian-guo5
1. Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
2. Nanjing Agricultural University, Nanjing 210095, China
3. National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
4. Key Laboratory of Modern Precision Agriculture System Integration Research, Ministry of Education, China Agricultural University, Beijing 100083, China
5. School of Geomatics, Liaoning Technical University, Fuxin 123000, China
Abstract: Acquiring crop plant nitrogen content (PNC) information quickly and accurately is key to precision agricultural management and a research hotspot in the development of digital agriculture. In recent years, with the development of UAV and sensor technology, the use of multi-sensor information to monitor crop physical and chemical parameters has gradually attracted the attention of scholars at home and abroad. This study took potato as the research object. Firstly, UAV-based hyperspectral images and digital images were acquired at the potato budding, tuber formation, tuber growth, starch accumulation and maturity stages. At the same time, a digital camera was used to synchronously obtain ground digital images for the five growth stages, and the three-dimensional spatial coordinates of eleven ground control points (GCPs), as well as plant height (H) and PNC, were measured. Secondly, a digital surface model (DSM) of the test area was generated from the UAV digital images combined with the GCPs, and the accuracy of the vegetation coverage (VCuav) and plant height (Hdsm) extracted from the UAV digital images was verified against the coverage (VC) calculated from the ground digital images and the measured H. Then, green edge parameters (GEPs) were calculated from the hyperspectral images, and four fusion feature parameters (FFPs), GEPs×Hdsm×VCuav, GEPs/(1+VCuav), (GEPs+VCuav)×Hdsm and GEPs/(1+Hdsm), were constructed to fuse the hyperspectral image information with the digital image information. Finally, the correlations between the GEPs and FFPs of each growth stage and PNC were analyzed, and linear PNC estimation models for the five growth stages were constructed based on the optimal GEP and the optimal FFP, respectively. Using the GEPs and FFPs with high correlation, multi-parameter PNC estimation models were then constructed with partial least squares regression (PLSR) and artificial neural network (ANN) methods.
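The four fusion feature parameters can be sketched as simple element-wise combinations of a green edge parameter with the UAV-derived plant height and coverage. The function below is an illustrative reconstruction of the formulas named in the abstract; the variable names (`gep`, `h_dsm`, `vc_uav`) and the dictionary keys are ours, not the paper's.

```python
import numpy as np

def fusion_features(gep, h_dsm, vc_uav):
    """Compute the four fusion feature parameters (FFPs) from the abstract,
    combining a green edge parameter (GEP) with DSM-derived plant height
    (Hdsm) and digital-image vegetation coverage (VCuav).
    All inputs may be scalars or equally shaped arrays (one value per plot).
    """
    gep, h_dsm, vc_uav = map(np.asarray, (gep, h_dsm, vc_uav))
    return {
        "GEP*Hdsm*VCuav": gep * h_dsm * vc_uav,
        "GEP/(1+VCuav)": gep / (1.0 + vc_uav),
        "(GEP+VCuav)*Hdsm": (gep + vc_uav) * h_dsm,
        "GEP/(1+Hdsm)": gep / (1.0 + h_dsm),
    }
```

The `1 +` terms in the denominators keep the ratios well defined when coverage or height approaches zero early in the season.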
The results show that: (1) The Hdsm and VCuav extracted from the UAV digital images have high accuracy and can replace the measured H and VC for estimating physical and chemical parameters. (2) Compared with the GEPs, most of the constructed FFPs have stronger correlations with PNC in the first four growth stages and better reflect the nitrogen nutrition status of potato. (3) Of the linear PNC estimation models constructed from the optimal green edge parameter (OGEP) and the optimal fusion feature parameter (OFFP), respectively, the OFFP-based models estimated PNC better. (4) Compared with the single-parameter models, the accuracy and stability of the PLSR and ANN models built on the GEPs and FFPs are significantly improved, and among them the models using the FFPs as input factors perform best. (5) The ANN method outperforms the PLSR method in estimating PNC at each growth stage. Therefore, fusing the hyperspectral green edge parameters with the plant height and coverage information extracted from the high-definition digital camera sensor can improve the estimation accuracy of PNC, providing a reference for the non-destructive dynamic monitoring of potato nitrogen nutrition status and for the application of multi-source sensor information.