Open Access

ARTICLE


Modeling and Estimating Soybean Leaf Area Index and Biomass Using Machine Learning Based on Unmanned Aerial Vehicle-Captured Multispectral Images

Sadia Alam Shammi1,2, Yanbo Huang1,*, Weiwei Xie1,2, Gary Feng1, Haile Tewolde1, Xin Zhang3, Johnie Jenkins1, Mark Shankle4

1 USDA-ARS Genetics and Sustainable Agriculture Research Unit, Mississippi State, MS 39762, USA
2 Oak Ridge Institute for Science and Education, Oak Ridge, TN 37831, USA
3 Department of Agricultural and Biological Engineering, Mississippi State University, Mississippi State, MS 39762, USA
4 Pontotoc Ridge-Flatwoods Branch Experiment Station, Mississippi State University, Pontotoc, MS 38863, USA

* Corresponding Author: Yanbo Huang

(This article belongs to the Special Issue: Application of Digital Agriculture and Machine Learning Technologies in Crop Production)

Phyton-International Journal of Experimental Botany 2025, 94(9), 2745-2766. https://doi.org/10.32604/phyton.2025.068955

Abstract

Crop leaf area index (LAI) and biomass are two major biophysical parameters for assessing crop growth and health. Direct measurement of LAI and biomass in field experiments is destructive and labor-intensive. We therefore focused on the application of unmanned aerial vehicles (UAVs) in agriculture, a cost- and labor-efficient alternative in which UAV-captured multispectral images are used to monitor crop growth and characterize plant biophysical conditions. In this study, we monitored soybean with UAV imaging and field experiments conducted at the MAFES (Mississippi Agricultural and Forestry Experiment Station) Pontotoc Ridge-Flatwoods Branch Experiment Station. The experiment followed a randomized block design with five fall-planted cover crop treatments (Cereal Rye, Vetch, Wheat, MC: mixed Mustard and Cereal Rye, and native vegetation) combined in a full factorial arrangement with three fertilizer treatments applied before soybean planting (Synthetic Fertilizer, Poultry Litter, and none). We monitored the soybean reproductive phases R3 (initial pod development), R5 (initial seed development), R6 (full seed development), and R7 (initial maturity) and used UAV multispectral remote sensing to estimate soybean LAI and biomass. The major goal of this study was to assess LAI and biomass estimation from UAV multispectral images at the reproductive stages, when leaf and biomass development had stabilized. We derived fourteen vegetation indices (VIs) from the UAV multispectral images at these stages. LAI and biomass were modeled from these remotely sensed VIs and ground-truth measurements using machine learning methods, including linear regression, Random Forest (RF), and support vector regression (SVR), and the fitted models were then applied to estimate LAI and biomass. According to the model results, LAI was better estimated at the R6 stage and biomass at the R3 stage. The RF models outperformed the other models, with an R2 of about 0.58–0.68 and a root mean square error (RMSE) of 0.52–0.60 m2/m2 for LAI estimation, and an R2 of about 0.44–0.64 and an RMSE of 21–26 g dry weight/5 plants for biomass estimation. We also performed leave-one-out cross-validation, which confirmed that the R6 stage was best for estimating LAI and the R3 stage for estimating crop biomass. The cross-validated RF models achieved an R2 of about 0.25–0.44 and an RMSE of 0.65–0.85 m2/m2 for LAI estimation, and an R2 of about 0.1–0.31 and an RMSE of about 28–35 g dry weight/5 plants for crop biomass estimation. These results support the use of non-destructive remote sensing methods to determine crop LAI and biomass status, which may lead to more efficient crop production and management.
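As an illustration of the modeling workflow described in the abstract (not the authors' code), the following Python sketch fits the three regressors named above to per-plot vegetation index features and evaluates them with leave-one-out cross-validation. The CSV file name, VI column names, target column, and hyperparameters are hypothetical placeholders.

# Minimal sketch: estimating LAI (or biomass) from per-plot VI features
# with linear regression, RF, and SVR, scored by leave-one-out
# cross-validation. All file/column names below are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical table: one row per plot at one growth stage (e.g., R6 for LAI),
# with VI features and a ground-truth target column.
data = pd.read_csv("plots_r6.csv")
vi_cols = ["NDVI", "GNDVI", "NDRE", "SAVI"]   # illustrative subset of the 14 VIs
X, y = data[vi_cols].to_numpy(), data["LAI"].to_numpy()

models = {
    "Linear": LinearRegression(),
    "RF": RandomForestRegressor(n_estimators=500, random_state=0),
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
}

loo = LeaveOneOut()
for name, model in models.items():
    # Each plot is predicted by a model trained on all remaining plots,
    # mirroring the leave-one-out validation reported in the abstract.
    y_hat = cross_val_predict(model, X, y, cv=loo)
    rmse = mean_squared_error(y, y_hat) ** 0.5
    print(f"{name}: R2={r2_score(y, y_hat):.2f}, RMSE={rmse:.2f}")

The same script applied to a biomass target column at the R3 stage would reproduce the second half of the workflow; feature scaling is included for SVR only because it is sensitive to the differing ranges of the VIs.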

Keywords

Soybean; LAI; biomass; reproductive growth stage; UAV multispectral imaging; machine learning

Cite This Article

APA Style
Shammi, S.A., Huang, Y., Xie, W., Feng, G., Tewolde, H. et al. (2025). Modeling and Estimating Soybean Leaf Area Index and Biomass Using Machine Learning Based on Unmanned Aerial Vehicle-Captured Multispectral Images. Phyton-International Journal of Experimental Botany, 94(9), 2745–2766. https://doi.org/10.32604/phyton.2025.068955
Vancouver Style
Shammi SA, Huang Y, Xie W, Feng G, Tewolde H, Zhang X, et al. Modeling and Estimating Soybean Leaf Area Index and Biomass Using Machine Learning Based on Unmanned Aerial Vehicle-Captured Multispectral Images. Phyton-Int J Exp Bot. 2025;94(9):2745–2766. https://doi.org/10.32604/phyton.2025.068955
IEEE Style
S. A. Shammi et al., “Modeling and Estimating Soybean Leaf Area Index and Biomass Using Machine Learning Based on Unmanned Aerial Vehicle-Captured Multispectral Images,” Phyton-Int. J. Exp. Bot., vol. 94, no. 9, pp. 2745–2766, 2025. https://doi.org/10.32604/phyton.2025.068955



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.