AI-Powered Hyperspectral Imaging Enhances Sorghum Nutrient Detection with Machine Learning

Researchers have developed a non-destructive method to detect the nutritional composition of sorghum, including crude protein, tannin, and crude fat, using visible near-infrared (VIS-NIR) hyperspectral imaging (HSI).

Study: Using visible and NIR hyperspectral imaging and machine learning for nondestructive detection of nutrient contents in sorghum. Image: a sorghum field in evening sunlight at the end of summer. Image Credit: Callahan/Shutterstock.com

Published in Scientific Reports, the study employed competitive adaptive reweighted sampling (CARS), bootstrapping soft shrinkage (BOSS), and iteratively retaining informative variables (IRIV) algorithms to select key wavelength variables. Detection models were then built using partial least squares (PLS), backpropagation (BP), and extreme learning machine (ELM) methods. The findings demonstrate that VIS-NIR spectroscopy is an effective tool for assessing sorghum’s nutritional content, with significant potential applications in the food and agriculture industries.

Background

Sorghum is a key cereal crop widely cultivated for food, animal feed, and brewing due to its rich nutritional profile, including protein, tannin, and fat. Accurately measuring these components is essential for quality control and utilization. Traditional chemical analysis methods, while accurate, are time-consuming, costly, and destructive, making them impractical for large-scale assessment. This has driven the need for more efficient, non-destructive alternatives.

Hyperspectral imaging (HSI) has emerged as a valuable tool for rapid, multi-component analysis in grains, including maize, rice, and wheat. While previous studies have explored near-infrared and mid-infrared spectroscopy for grain quality assessment, efficient detection methods for sorghum’s crude protein, tannin, and fat specifically remain limited.

This study addresses that gap by leveraging VIS-NIR hyperspectral imaging alongside advanced machine learning algorithms to develop predictive models for sorghum’s nutritional content. By refining spectral feature selection, the researchers improved detection accuracy, providing a fast, scalable, and non-destructive solution for sorghum quality assessment.

Materials and Methodology

The study analyzed 93 sorghum varieties from the Sorghum Research Institute of Shanxi Agricultural University to evaluate their nutritional composition. Samples were collected, sealed for storage, and subjected to hyperspectral imaging and chemical analysis.

A VIS-NIR HSI platform (Headwall Photonics, USA) was used to capture images over a spectral range of 430–900 nanometers (nm), chosen to minimize measurement errors. A custom preprocessing platform developed in Visual Basic facilitated efficient spectral data extraction and processing. To enhance data reliability, preprocessing techniques such as standard normal variate (SNV), detrending, and multiplicative scatter correction (MSC) were applied.
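Of these preprocessing steps, SNV is the simplest to illustrate: each spectrum is centered and scaled by its own mean and standard deviation to suppress scatter effects. The sketch below is illustrative only (NumPy assumed; the toy data are not from the study):

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row)
    by its own mean and standard deviation to reduce scatter effects."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

# Toy reflectance "spectra": 2 samples over 5 wavelengths
X = np.array([[0.2, 0.4, 0.6, 0.8, 1.0],
              [0.3, 0.5, 0.7, 0.9, 1.1]])
X_snv = snv(X)
# After SNV, each row has mean 0 and standard deviation 1
```

Detrending and MSC follow the same pattern of per-spectrum correction, but fit a polynomial baseline or regress against a reference spectrum instead.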

After spectral data collection, the samples were pulverized and analyzed using established chemical methods. Crude protein was determined using the Kjeldahl method, tannin content via ferric ammonium citrate colorimetry, and crude fat using a Soxhlet extractor. All measurements were performed in triplicate to ensure accuracy.

To build robust detection models, the dataset was split into calibration and prediction sets using the SPXY algorithm (3:1 ratio). Three feature selection algorithms—CARS, BOSS, and IRIV—were employed to refine spectral features. The performance of the PLS, BP neural network, and ELM models was evaluated using the coefficient of determination (R²), root mean squared error (RMSE), and relative prediction deviation (RPD). Multiple iterations ensured reliable results.
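The three evaluation metrics have standard definitions: R² measures explained variance, RMSE the prediction error in the nutrient's own units, and RPD the ratio of the reference values' standard deviation to the RMSE (higher is better). A minimal sketch, assuming NumPy; the example values are hypothetical, not the study's results:

```python
import numpy as np

def evaluation_metrics(y_true, y_pred):
    """Return R², RMSE, and RPD (SD of reference values / RMSE)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    rpd = y_true.std(ddof=1) / rmse                  # sample SD over RMSE
    return r2, rmse, rpd

# Hypothetical crude-protein reference values (%) vs. model predictions
y_ref = [9.1, 9.5, 9.8, 10.2, 8.9]
y_hat = [9.0, 9.6, 9.7, 10.1, 9.1]
r2, rmse, rpd = evaluation_metrics(y_ref, y_hat)
```

As a rule of thumb in NIR chemometrics, an RPD above about 2 is usually taken to indicate a model usable for quantitative prediction.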

Findings and Analysis

A total of 279 sorghum samples from 93 varieties were analyzed for crude protein, tannin, and crude fat content. The average values were 9.45%, 1.15%, and 3.88%, respectively, with significant variations among varieties. VIS-NIR spectral analysis distinguished sorghum varieties based on reflectance differences, particularly in the visible spectrum, which is influenced by seed color.

Spectral preprocessing techniques, including SNV, detrending, and MSC, effectively reduced noise and improved model accuracy. To refine wavelength selection, CARS and BOSS algorithms identified key spectral ranges, which were further optimized using the IRIV algorithm. This approach eliminated non-informative variables while retaining critical characteristic wavelengths.
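The CARS, BOSS, and IRIV algorithms used in the study are iterative, model-based procedures, but the underlying idea—rank wavelengths by how informative they are for the target value and discard the rest—can be shown with a much simpler correlation filter. This is a stand-in for illustration only, not the authors' method:

```python
import numpy as np

def top_k_wavelengths(X, y, wavelengths, k=5):
    """Rank wavelengths by absolute Pearson correlation with the
    reference value and keep the k most informative ones. A simple
    filter; CARS, BOSS, and IRIV use more elaborate, model-based
    selection criteria."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = (Xc * yc[:, None]).sum(axis=0) / np.sqrt(
        (Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    keep = np.argsort(-np.abs(corr))[:k]
    return np.sort(np.asarray(wavelengths)[keep])

# Synthetic example: the reference value depends mainly on one band
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))            # 60 samples, 10 wavelengths
wavelengths = np.arange(430, 530, 10)    # 430, 440, ..., 520 nm
y = 2.0 * X[:, 3] + 0.1 * rng.normal(size=60)
selected = top_k_wavelengths(X, y, wavelengths, k=3)  # includes 460 nm
```

In the study, this kind of ranking is done iteratively: CARS and BOSS shrink the candidate set competitively over many sampling runs, and IRIV then re-examines the survivors to distinguish strongly, weakly, and un-informative variables.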

Key spectral ranges were identified for each nutrient component:

  • Crude protein: 434–899 nm, linked to amino acid interactions and molecular vibrations.
  • Tannin: 475–897 nm, associated with molecular transitions and structural interactions.
  • Crude fat: 447–900 nm, corresponding to fatty acid bonds and vibrational modes.

By optimizing spectral data and reducing unnecessary variables, the methodology preserved essential features for improved modeling and chemical composition prediction. The study highlighted the importance of weak characteristic wavelengths in building accurate detection models.

Conclusion

This research successfully employed VIS-NIR HSI and machine learning to non-destructively detect crude protein, tannin, and crude fat in sorghum. Using CARS, BOSS, and IRIV for wavelength selection, the study developed PLS, BP, and ELM models, with the BOSS-IRIV-ELM model delivering the best performance. The findings contribute to real-time quality assessment applications in food and agriculture.

Future work will focus on integrating artificial intelligence (AI) and big data techniques to develop a comprehensive detection system, enabling simultaneous analysis of multiple nutritional components in grains during industrial processing.

Journal Reference

Wu, K., Zhang, Z., He, X., Li, G., Zheng, D., & Li, Z. (2025). Using visible and NIR hyperspectral imaging and machine learning for nondestructive detection of nutrient contents in sorghum. Scientific Reports, 15(1). DOI: 10.1038/s41598-025-90892-6. https://www.nature.com/articles/s41598-025-90892-6


Citations

Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Nandi, Soham. (2025, February 25). AI-Powered Hyperspectral Imaging Enhances Sorghum Nutrient Detection with Machine Learning. AZoRobotics. Retrieved on February 25, 2025 from https://www.azorobotics.com/News.aspx?newsID=15745.

