Researchers are using advanced remote sensing and machine-learning algorithms to rapidly monitor crop nitrogen levels, information central to sustainable agriculture.
Powerful airborne sensors could be key to helping farmers sustainably manage maize across the US Corn Belt, according to a University of Illinois research team. The study, which pairs remote sensing with newly developed deep learning models, delivers accurate and rapid predictions of crop nitrogen, chlorophyll, and photosynthetic capacity.
Published in the International Journal of Applied Earth Observation and Geoinformation, the work could guide farmer management practices, helping reduce fertilizer use, boost food production, and alleviate environmental damage across the region.
“Compared to the conventional approaches of leaf tissue analysis, remote sensing provides much faster and more cost-effective approaches to monitor crop nutrients. The timely and high-resolution crop nitrogen information will be very helpful to growers to diagnose crop growth and guide adaptive management,” said lead author Sheng Wang, a research scientist and assistant professor at the University of Illinois Urbana-Champaign.
Producing about 75% of corn in the US and 30% globally, the Corn Belt plays a major role in food production. Extending from Indiana to Nebraska, the region yields 20 times more than it did in the 1880s, thanks to improved farming, corn breeding, new technologies, and fertilizers.
Farmers rely on nitrogen-based fertilizers to boost photosynthesis, crop yields, and biomass for bioenergy crops. However, excessive application degrades soil, pollutes water sources, and contributes to global warming—nitrogen represents one of the largest sources of greenhouse gas emissions in agriculture.
Accurately measuring nitrogen levels in crops could help farmers avoid overapplication, but manual field surveys are time-consuming and labor-intensive.
“Precision agriculture that relies on advanced sensing technologies and airborne satellite platforms to monitor crops could be the solution,” said project lead Kaiyu Guan, the Blue Waters Associate Professor at the University of Illinois Urbana-Champaign.
Until now, there has been no reliable method for quickly monitoring leaf nitrogen levels over the course of a growing season. The team proposed a hybrid approach that combines hyperspectral imaging with machine-learning models to address these limitations.
Hyperspectral imaging, an expanding area of remote sensing, uses an imaging spectrometer that records each pixel across hundreds of narrow wavelength bands, capturing far more information than a conventional camera.
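To make that idea concrete, here is a minimal sketch in Python of how hyperspectral data is typically organized as a three-dimensional cube in which every pixel carries a full reflectance spectrum; the scene size, band count, and wavelength range are illustrative assumptions, not the specifications of the sensor used in the study.

```python
import numpy as np

# Illustrative only: a synthetic hyperspectral cube, not the study's data.
# A hyperspectral image is a 3-D array: two spatial dimensions plus a
# spectral dimension with hundreds of narrow wavelength bands.
n_rows, n_cols, n_bands = 512, 512, 240           # hypothetical scene size
wavelengths_nm = np.linspace(400, 1000, n_bands)  # assumed spectral range

cube = np.random.rand(n_rows, n_cols, n_bands).astype(np.float32)

# Each pixel carries a full reflectance spectrum rather than 3 RGB values.
pixel_spectrum = cube[256, 256, :]                # shape: (n_bands,)
print(pixel_spectrum.shape, wavelengths_nm[0], wavelengths_nm[-1])
```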
Equipped with a highly sensitive hyperspectral sensor, the researchers flew plane surveys over an experimental field in Illinois, collecting crop reflectance data. Plant chemical composition, including nitrogen and chlorophyll, influences reflectance, and the sensor resolves spectral bands as narrow as 3 to 5 nanometers, allowing it to pick up these subtle differences.
Combining radiative transfer modeling with a data-driven partial least squares regression (PLSR) approach, the team developed deep learning models to predict crop traits from the airborne reflectance data. According to the study, PLSR requires only a relatively small amount of labeled data for model training.
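As an illustration of the PLSR idea, the sketch below fits a partial least squares regression from reflectance spectra to a leaf trait using scikit-learn and synthetic placeholder data; the band count, number of components, and target variable are assumptions for demonstration and do not reproduce the study's pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Placeholder data standing in for airborne reflectance spectra (X) and
# field-measured leaf nitrogen (y); the real study uses measured samples.
rng = np.random.default_rng(0)
n_samples, n_bands = 300, 240
X = rng.random((n_samples, n_bands))
y = 2.0 * X[:, 50] - 1.5 * X[:, 120] + rng.normal(0, 0.05, n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# PLSR projects the high-dimensional spectra onto a few latent components,
# which is why it can work with relatively little labeled training data.
pls = PLSRegression(n_components=10)
pls.fit(X_train, y_train)

y_pred = pls.predict(X_test).ravel()
print(f"R^2 on held-out samples: {r2_score(y_test, y_pred):.2f}")
```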
The researchers trained their deep learning models using cuDNN and NVIDIA V100 GPUs to predict crop nitrogen, chlorophyll, and photosynthetic capacity at both leaf and canopy levels.
When tested against ground-truth measurements, the models were about 85% accurate. The technique is also fast, scanning fields in just a few seconds per acre. According to Wang, such technology can be very helpful for diagnosing crop nitrogen status and yield potential.
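For readers curious about what that workflow can look like, the sketch below trains a small GPU-enabled regression network with PyTorch on synthetic spectra and then scores it against synthetic ground truth using an R² statistic, the kind of agreement metric behind an accuracy figure like the one reported; the architecture, data, and hyperparameters are illustrative assumptions, not the models described in the study.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: a small fully connected regression network,
# not the architecture used in the study. Synthetic spectra stand in for
# airborne reflectance; synthetic targets stand in for measured nitrogen.
device = "cuda" if torch.cuda.is_available() else "cpu"  # any CUDA GPU

n_samples, n_bands = 1000, 240
X = torch.rand(n_samples, n_bands)
y = (2.0 * X[:, 50] - 1.5 * X[:, 120]).unsqueeze(1) + 0.05 * torch.randn(n_samples, 1)

model = nn.Sequential(
    nn.Linear(n_bands, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 1),
).to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
X, y = X.to(device), y.to(device)

for epoch in range(200):                 # short training loop for the sketch
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# Evaluate against the (synthetic) ground truth with the R^2 statistic.
with torch.no_grad():
    pred = model(X)
    ss_res = ((y - pred) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    print(f"R^2: {1 - ss_res / ss_tot:.2f}")
```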
The ultimate goal of the work is to use satellite imagery for large-scale nitrogen monitoring across every field in the US Corn Belt and beyond.
“We hope this technology can provide stakeholders timely information and advance growers’ management practices for sustainable agricultural practices,” Guan said.
Read the study in the International Journal of Applied Earth Observation and Geoinformation.