Combining UAV Digital Imagery With PROSAIL Modeling for LAI Inversion in Summer Maize
NIU Qing-lin1, 4, ZHANG He-bing1*, DENG Jiong2, FENG Hai-kuan3, LI Chang-chun1, YANG Gui-jun3, CHEN Zhi-chao1
1. School of Surveying and Land Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China
2. Henan Provincial Institute of Land and Space Investigation and Planning, Zhengzhou 450000, China
3. Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
4. Farmland Irrigation Research Institute, Chinese Academy of Agricultural Sciences, Xinxiang 453002, China
Abstract: The leaf area index (LAI) is an important growth indicator that reflects various maize characteristics and can effectively assist in the selection and breeding of new maize varieties, so rapid, non-destructive, and accurate determination of maize LAI is essential in maize breeding. Unmanned aerial vehicle (UAV) visible-light remote sensing has developed rapidly for acquiring crop phenotypic information such as LAI because it captures spatial information about field crops quickly, non-destructively, and with high throughput. However, spectral saturation, together with the lack of information on the response mechanism between spectral parameters and phenotypic information, limits further improvement in the accuracy of phenotypic estimation models. The PROSAIL radiative transfer model, by contrast, can simulate the response mechanism between crop physicochemical parameters and spectral index parameters, which effectively enhances the potential of crop physicochemical parameter inversion. This study therefore combined UAV digital imagery with the PROSAIL model to invert summer maize LAI and further improve the accuracy of the LAI inversion model. Taking summer maize in a maize breeding experimental field as the research object, a UAV remote sensing system was used to acquire high-resolution digital images at the jointing, trumpet, and tassel emergence stages, and these were combined with the PROSAIL model to construct summer maize LAI inversion models using partial least squares regression (PLSR), random forest regression (RFR), and convolutional neural network (CNN) regression.
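As a hedged illustration of the modeling approach described above (not the paper's actual implementation), the sketch below fits a random forest regression of LAI on visible-band vegetation indices computed from per-plot RGB digital numbers. The index choices (ExG, VARI), the synthetic data, and the LAI relationship are all assumptions for demonstration only.

```python
# Minimal sketch: LAI estimation from UAV RGB imagery via visible-band
# vegetation indices and random forest regression (RFR).
# All data here are synthetic; real inputs would be per-plot image statistics.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic per-plot mean digital numbers (r, g, b), normalized to [0, 1].
n_plots = 200
rgb = rng.uniform(0.1, 0.9, size=(n_plots, 3))
r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]

# Two classic visible-band indices used as model inputs.
exg = 2 * g - r - b                       # Excess Green (ExG)
vari = (g - r) / (g + r - b + 1e-9)       # Visible Atmospherically Resistant Index
X = np.column_stack([exg, vari, r, g, b])

# Hypothetical "measured" LAI loosely tied to greenness, plus noise.
lai = 3.0 + 2.0 * exg + rng.normal(0.0, 0.2, n_plots)

# Split into estimation (training) and validation sets, then fit RFR.
X_tr, X_te, y_tr, y_te = train_test_split(X, lai, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
```

The same feature matrix could instead feed a PLSR or CNN model; in the study's hybrid setup, PROSAIL-simulated samples would augment or replace the training set.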
The results show that: (1) based on UAV high-resolution digital imagery, the model constructed with PLSR had the best accuracy, with R2, RMSE, and nRMSE of 0.69, 0.37, and 24.28% for the estimation model and 0.73, 0.35, and 23.26% for the validation model; (2) based on the PROSAIL model, the model constructed with RFR had the best accuracy, with R2, RMSE, and nRMSE of 0.98, 0.28, and 6.88% for the estimation model and 0.87, 0.64, and 15.97% for the validation model; (3) combining UAV high-resolution digital imagery with the PROSAIL model, the model constructed with RFR had the best accuracy, with R2, RMSE, and nRMSE of 0.98, 0.27, and 7.07% for the estimation model and 0.87, 0.65, and 16.35% for the validation model. Compared with using UAV high-resolution digital imagery alone, the nRMSE of the optimal estimation and validation models was reduced by 17.21% and 6.91%, respectively. The study shows that combining UAV digital imagery with the PROSAIL model effectively improves the accuracy and stability of the LAI inversion model for summer maize and provides theoretical guidance for the selection and breeding of new maize varieties.
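The accuracy metrics reported above can be sketched as follows. This is an illustration under the common assumption that nRMSE is RMSE normalized by the mean of the measured values and expressed as a percentage; the sample values are invented.

```python
# Sketch of the reported accuracy metrics: R2, RMSE, and nRMSE.
import numpy as np

def r2(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - np.mean(obs)) ** 2)
    return float(1.0 - ss_res / ss_tot)

def rmse(obs, pred):
    """Root mean square error."""
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def nrmse(obs, pred):
    """RMSE normalized by the mean of the observations, in percent (assumed definition)."""
    return 100.0 * rmse(obs, pred) / float(np.mean(obs))

# Invented measured vs. predicted LAI values for demonstration.
obs = np.array([2.1, 3.4, 4.0, 2.8, 3.6])
pred = np.array([2.0, 3.5, 3.8, 3.0, 3.4])
```

With these definitions, a validation set is scored by calling the three functions on its measured and predicted LAI arrays.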
Key words: Unmanned aerial vehicle; Summer maize; Digital imagery; PROSAIL model; Leaf area index; Random forest
牛庆林,张合兵,邓 炯,冯海宽,李长春,杨贵军,陈志超. 结合无人机数码影像与PROSAIL模型的夏玉米LAI反演[J]. 光谱学与光谱分析, 2025, 45(08): 2335-2347.
NIU Qing-lin, ZHANG He-bing, DENG Jiong, FENG Hai-kuan, LI Chang-chun, YANG Gui-jun, CHEN Zhi-chao. Combining UAV Digital Imagery With PROSAIL Modeling for LAI Inversion in Summer Maize[J]. SPECTROSCOPY AND SPECTRAL ANALYSIS, 2025, 45(08): 2335-2347.
[1] Feng L, Chen S S, Zhang C, et al. Computers and Electronics in Agriculture, 2021, 182: 106033.
[2] ZHANG Zhen-bo, JIA Chun-lan, REN Bai-zhao, et al(张振博, 贾春兰, 任佰朝, 等). Acta Agronomica Sinica(作物学报), 2023, 49(6): 1616.
[3] LIU Yang, FENG Hai-kuan, SUN Qian, et al(刘 杨, 冯海宽, 孙 乾, 等). Spectroscopy and Spectral Analysis(光谱学与光谱分析), 2021, 41(5): 1470.
[4] NIU Qing-lin, FENG Hai-kuan, YANG Gui-jun, et al(牛庆林, 冯海宽, 杨贵军, 等). Transactions of the Chinese Society of Agricultural Engineering(农业工程学报), 2018, 34(5): 73.
[5] CAO Zhong-sheng, LI Yan-da, HUANG Jun-bao, et al(曹中盛, 李艳大, 黄俊宝, 等). Chinese Journal of Rice Science(中国水稻科学), 2022, 36(3): 308.
[6] JIANG Jie, ZHANG Ze-yu, CAO Qiang, et al(江 杰, 张泽宇, 曹 强, 等). Journal of Nanjing Agricultural University(南京农业大学学报), 2019, 42(4): 622.
[7] Jacquemoud S, Verhoef W, Baret F, et al. Remote Sensing of Environment, 2009, 113: S56.
[8] Adeluyi O, Harris A, Verrelst J, et al. International Journal of Applied Earth Observation and Geoinformation, 2021, 102: 102454.
[9] Jiang H Y, Wei X Q, Chen Z L, et al. Computers and Electronics in Agriculture, 2023, 212: 108165.
[10] Tucker C J. Remote Sensing of Environment, 1979, 8(2): 127.
[11] Meyer G E, Hindman T, Laksmi K. SPIE, 1999, 3543: 327.
[12] Woebbecke D M, Meyer G E, Bargen K V, et al. Transactions of the ASAE, 1995, 38(1): 259.
[13] Meyer G E, Neto J C. Computers and Electronics in Agriculture, 2008, 63(2): 282.
[14] Hague T, Tillett N D, Wheeler H. Precision Agriculture, 2006, 7: 21.
[15] Kataoka T, Kaneko T, Okamoto H, et al. Crop Growth Estimation System Using Machine Vision, Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2003.
[16] Guijarro M, Pajares G, Riomoros I, et al. Computers and Electronics in Agriculture, 2011, 75(1): 75.
[17] Guerrero J M, Pajares G, Montalvo M, et al. Expert Systems with Applications, 2012, 39(12): 11149.
[18] Louhaichi M, Borman M M, Johnson D E. Geocarto International, 2001, 16(1): 65.
[19] Gitelson A A, Kaufman Y J, Stark R, et al. Remote Sensing of Environment, 2002, 80(1): 76.
[20] Bendig J, Yu K, Aasen H, et al. International Journal of Applied Earth Observation and Geoinformation, 2015, 39: 79.
[21] Guo Y H, Xiao Y, Li M W, et al. International Journal of Applied Earth Observation and Geoinformation, 2022, 115: 103121.
[22] Guo Y H, Wang H X, Wu Z F, et al. Sensors, 2020, 20(18): 5055.
[23] Kazmi W, Garcia-Ruiz F J, Nielsen J, et al. Computers and Electronics in Agriculture, 2015, 112: 10.
[24] Kawashima S, Nakatani M. Annals of Botany, 1998, 81: 49.
[25] Metternicht G. International Journal of Remote Sensing, 2003, 24(14): 2855.
[26] Gamon J A, Surfus J S. New Phytologist, 1999, 143(1): 105.
[27] Roosjen P P J, Brede B, Suomalainen J M, et al. International Journal of Applied Earth Observation and Geoinformation, 2018, 66: 14.
[28] de Wit C T. Photosynthesis of Leaf Canopies, Agricultural Research Reports, 1965.
[29] Su W, Huang J X, Liu D S, et al. Remote Sensing, 2019, 11(5): 572.
[30] Wan L, Zhu J P, Du X Y, et al. Journal of Experimental Botany, 2021, 72(13): 4691.
[31] Lehnert L W, Meyer H, Obermeier W A, et al. Journal of Statistical Software, 2019, 89(12): 1.
[32] Yue J B, Yang H, Feng H K, et al. Computers and Electronics in Agriculture, 2023, 211: 108011.
[33] ZHANG Dong-yan, HAN Xuan-xuan, LIN Fen-fang, et al(张东彦, 韩宣宣, 林芬芳, 等). Transactions of the Chinese Society of Agricultural Engineering(农业工程学报), 2022, 38(9): 171.
[34] Sun B, Wang C F, Yang C H, et al. International Journal of Applied Earth Observation and Geoinformation, 2021, 102: 102373.