2026
1.

Ranjan Sapkota; Zhichao Meng; Martin Churuvija; Xiaoqiang Du; Zenghong Ma; Manoj Karkee
Comprehensive performance evaluation of YOLOv12, YOLO11, YOLOv10, YOLOv9 and YOLOv8 on detecting and counting fruitlet in complex orchard environments
In: Agriculture Communications, vol. 4, no. 1, pp. 100125, 2026, ISSN: 2949-7981.
@article{sapkota_comprehensive_2026,
title = {Comprehensive performance evaluation of YOLOv12, YOLO11, YOLOv10, YOLOv9 and YOLOv8 on detecting and counting fruitlet in complex orchard environments},
author = {Ranjan Sapkota and Zhichao Meng and Martin Churuvija and Xiaoqiang Du and Zenghong Ma and Manoj Karkee},
url = {https://www.sciencedirect.com/science/article/pii/S2949798126000050},
doi = {10.1016/j.agrcom.2026.100125},
issn = {2949-7981},
year = {2026},
date = {2026-01-01},
urldate = {2026-01-01},
journal = {Agriculture Communications},
volume = {4},
number = {1},
pages = {100125},
abstract = {This study systematically conducted an extensive real-world evaluation of all configurations of You Only Look Once (YOLO)-based object detection algorithms, including YOLOv8, YOLOv9, YOLOv10, YOLO11, and YOLOv12. Models were assessed using precision, recall, mean Average Precision at 50 % Intersection over Union (mAP@50), and computational efficiency across pre-processing, inference, and post-processing stages for detecting immature green fruitlets in commercial orchards. Field-level fruitlet counting was also validated using images captured with both Intel RealSense and iPhone 14 Pro Max sensors. YOLOv12l achieved the highest recall (0.900), while YOLOv10x and YOLOv9 GELAN-c reported the top precision scores of 0.908 and 0.903, respectively. YOLOv9 GELAN-base and GELAN-e achieved the highest mAP@50 (0.935), followed by YOLO11s (0.933) and YOLOv12l (0.931). In counting validation, YOLO11n demonstrated superior accuracy, with RMSE values of 4.51\textendash4.96 and MAE values of 3.85\textendash7.73 across four apple varieties. Sensor-specific training on Intel RealSense further improved detection performance. YOLO11n also recorded the fastest inference speed (2.4 ms), outperforming YOLOv8n, YOLOv9 GELAN-s, YOLOv10n, and YOLOv12n, affirming its suitability for real-time orchard applications.},
keywords = {agricultural automation, Fruitlet detection, Object detection, YOLO comparison, You Only Look Once (YOLO)},
pubstate = {published},
tppubtype = {article}
}
