
CES-YOLOv8 Revolutionizes Strawberry Ripeness Detection

In a recent article published in the journal Agronomy, researchers introduced CES-YOLOv8, an improved variant of the you only look once version 8 (YOLOv8) deep learning algorithm designed to accurately detect strawberry ripeness (maturity). Their goal was to enhance the accuracy and reliability of ripeness recognition while enabling real-time processing.

Study: CES-YOLOv8: Strawberry Maturity Detection Based on the Improved YOLOv8. Image Credit: Taras Garkusha/Shutterstock.com

Background

The global population is steadily increasing, while arable land is becoming increasingly scarce. This necessitates a significant increase in agricultural production, which can be achieved through smart agriculture. Smart agriculture employs digital information technology and intelligent equipment to optimize farming practices, promoting efficient and sustainable development.

Automated harvesting robots are a key component of smart agriculture. They can replace manual labor, significantly increasing harvesting efficiency, especially in regions facing high labor costs or shortages. Fruit maturity detection plays a crucial role in the efficient and accurate operation of automated harvesting robots.

Traditional methods for detecting fruit maturity typically rely on color and size recognition. These methods often prove ineffective due to variations in growing conditions and the complex color changes during fruit ripening.

Previous research has explored various solutions to improve fruit maturity detection. However, many of these approaches face challenges with environmental dependency and struggle to balance accuracy and efficiency. Therefore, there is a need for a robust and reliable algorithm capable of extracting deep features from fruit images and classifying them based on their ripeness level.

About the Research

In this paper, the authors focused on strawberries as their research subject and developed CES-YOLOv8, an improved version of the YOLOv8 algorithm. While YOLOv8 is known for its accuracy and speed in object detection, it struggles to capture detailed features of fruits at different ripeness levels. To address this, the researchers made three key modifications.

First, some of the C2f (cross-stage partial bottleneck with two convolutions) modules in the YOLOv8 backbone were replaced with ConvNeXt version 2 (ConvNeXt V2) modules to improve feature diversity and computational efficiency. Next, an efficient channel attention (ECA) mechanism was added on top of the spatial pyramid pooling-fast (SPPF) layer to enhance the network's feature representation and attention learning. Finally, the original bounding-box regression loss was replaced with the SIoU loss function to improve the overlap (IoU) between predicted and ground-truth boxes and the accuracy of box localization.
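
To illustrate the kind of change involved, the sketch below gives a minimal PyTorch implementation of an ECA-style channel attention block of the sort inserted near the SPPF layer. It follows the original ECA-Net design (global average pooling followed by a 1D convolution across channels); the class name, kernel-size heuristic, and placement are assumptions for illustration and are not the authors' code.

```python
# Minimal ECA-style channel attention sketch (assumed implementation, not the paper's code).
import math
import torch
import torch.nn as nn

class ECA(nn.Module):
    def __init__(self, channels: int, gamma: int = 2, b: int = 1):
        super().__init__()
        # Adaptive 1D kernel size derived from the channel count (ECA-Net heuristic).
        t = int(abs((math.log2(channels) + b) / gamma))
        k = t if t % 2 else t + 1
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) -> per-channel descriptor (B, C, 1, 1)
        y = self.pool(x)
        # 1D convolution across the channel dimension captures local
        # cross-channel interaction without dimensionality reduction.
        y = self.conv(y.squeeze(-1).transpose(-1, -2)).transpose(-1, -2).unsqueeze(-1)
        return x * self.sigmoid(y)

if __name__ == "__main__":
    feats = torch.randn(2, 256, 20, 20)   # e.g. a backbone feature map
    print(ECA(256)(feats).shape)          # torch.Size([2, 256, 20, 20])
```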

The researchers collected images of strawberries under various lighting conditions, levels of occlusion, and viewing angles. They augmented these images to increase the dataset's size and diversity. They then annotated the strawberries with bounding boxes and ripeness labels, classified into four levels based on the fruit's color change. The dataset was divided into training and test sets at a 4:1 ratio.
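
As an illustration of this data-preparation step, the sketch below shows how a YOLO-format ripeness dataset could be organised and split 4:1. The directory layout, file names, class names, and random seed are assumptions for illustration; only the four ripeness levels and the 4:1 split come from the study.

```python
# Hedged sketch of a YOLO-format dataset split (assumed layout, not the authors' pipeline).
import random
from pathlib import Path

RIPENESS_CLASSES = ["green", "white", "turning", "ripe"]  # assumed names for the four levels

def split_dataset(image_dir: str, train_ratio: float = 0.8, seed: int = 0):
    """Return (train, test) image lists split 4:1."""
    images = sorted(Path(image_dir).glob("*.jpg"))
    random.Random(seed).shuffle(images)
    cut = int(len(images) * train_ratio)
    return images[:cut], images[cut:]

# Each image has a matching .txt label file; one line per strawberry:
#   <class_id> <x_center> <y_center> <width> <height>   (all normalised to [0, 1])
# e.g. "3 0.512 0.604 0.180 0.220" would mark a fully ripe berry.

if __name__ == "__main__":
    train, test = split_dataset("strawberries/images")  # assumed directory
    print(f"{len(train)} training images, {len(test)} test images")
```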

The improved CES-YOLOv8 model was then trained and evaluated on a computer equipped with a graphics processing unit (GPU). Performance metrics such as precision, recall, mean average precision at an IoU threshold of 0.5 (mAP50), F1 score, and frames per second (FPS) were used to assess the model's effectiveness.
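
For reference, the snippet below sketches how these detection metrics and throughput can be computed. The IoU-based matching that yields true/false positives for mAP50 is assumed to happen in a separate routine; this is not the authors' evaluation code.

```python
# Hedged sketch of precision/recall/F1 and FPS measurement (not the paper's evaluation code).
import time

def precision_recall_f1(tp: int, fp: int, fn: int):
    """Compute precision, recall, and F1 from matched detection counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

def measure_fps(model, frames):
    """Average frames per second over a list of pre-loaded images; `model` is any callable detector."""
    start = time.perf_counter()
    for frame in frames:
        model(frame)                      # one forward pass per frame
    return len(frames) / (time.perf_counter() - start)
```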

Research Findings

The experimental outcomes demonstrated that the improved CES-YOLOv8 model significantly enhanced strawberry ripeness detection. The model achieved a precision of 88.20%, a recall of 89.80%, a mAP50 of 92.10%, and an F1 score of 88.99%. These metrics represented improvements of 4.8%, 2.9%, 2.05%, and 3.88%, respectively, over the original YOLOv8 model.
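
As a quick consistency check (not taken from the paper's code), the reported F1 score follows directly from the reported precision and recall via F1 = 2PR / (P + R):

```python
# Arithmetic check: F1 from the reported precision and recall.
p, r = 0.8820, 0.8980
f1 = 2 * p * r / (p + r)
print(f"F1 = {f1:.4f}")   # ~0.8899, i.e. the reported 88.99 %
```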

Additionally, CES-YOLOv8 achieved a high FPS of 184.52, indicating real-time image processing capabilities without compromising accuracy. It successfully detected strawberries of different ripeness levels, even in complex scenarios with multiple targets, occlusions, small objects, and severe obstructions.

Furthermore, the authors conducted ablation studies to assess the impact of each modification. They also compared CES-YOLOv8 with other popular models, including the faster region-based convolutional neural network (Faster R-CNN), RetinaNet, YOLO version 5 (YOLOv5), YOLO version 7 (YOLOv7), and the original YOLOv8. The improved model outperformed these algorithms in terms of precision, recall, mAP50, and F1 score. It also reduced missed and duplicate detections, confirming the effectiveness of the implemented enhancements.

Applications

This research offers efficient and accurate ripeness detection technology for automated harvesting robots in smart agriculture. It can significantly boost harvesting efficiency, decrease labor costs, and maintain the quality and market value of strawberries. The CES-YOLOv8 model can also be adapted for other fruit crops like tomatoes, bananas, and coconuts. Furthermore, it can be integrated with other smart agricultural systems, including precision farming, automated harvesting, and sorting technologies, enhancing the overall productivity and sustainability of agricultural operations.

Conclusion

The novel CES-YOLOv8 approach proved highly effective for smart farming, particularly in detecting strawberries at various ripeness levels. This technology can support automated harvesting robots and potentially be extended to other fruit crops and agricultural applications. The researchers acknowledged some limitations: they suggested verifying the model's effectiveness on other fruits and crops and recommended exploring its adaptability and robustness under different climatic conditions and pest pressures.

Journal Reference

Chen, Y.; Xu, H.; Chang, P.; Huang, Y.; Zhong, F.; Jia, Q.; Chen, L.; Zhong, H.; Liu, S. CES-YOLOv8: Strawberry Maturity Detection Based on the Improved YOLOv8. Agronomy 2024, 14, 1353. DOI: 10.3390/agronomy14071353


Written by

Muhammad Osama

Muhammad Osama is a full-time data analytics consultant and freelance technical writer based in Delhi, India. He specializes in transforming complex technical concepts into accessible content. He has a Bachelor of Technology in Mechanical Engineering with specialization in AI & Robotics from Galgotias University, India, and he has extensive experience in technical content writing, data science and analytics, and artificial intelligence.


