Topic

Technical equipment testing

Volume

Volume 74 / No. 3 / 2024

Pages: 856-866

POTATO APPEARANCE DETECTION ALGORITHM BASED ON IMPROVED YOLOV8
DOI : https://doi.org/10.35633/inmateh-74-76

Authors

(*) Zhiguo PAN

Qingdao Agricultural University

Huan ZHANG

Qingdao Agricultural University

Zhen LIU

Qingdao Agricultural University

Ranbing YANG

Hainan University

Zhaoming SU

Qingdao Agricultural University

Xinlin LI

Qingdao Agricultural University

Zeyang LIU

Qingdao Agricultural University

Chuanmiao SHI

Qingdao Agricultural University

Shuai WANG

Qingdao Agricultural University

Hongzhu WU

Qingdao Hongzhu Agricultural Machinery Co., Ltd.

(*) Corresponding author:

Zhiguo PAN | [email protected]

Abstract

To meet the demands for rapid and accurate appearance inspection in potato sorting, this study proposes a potato appearance detection algorithm based on an improved YOLOv8. MobileNetV4 replaces the YOLOv8 backbone network, a triple attention mechanism is introduced into the neck network, and the Inner-CIoU loss function is adopted to accelerate convergence and improve the accuracy of potato appearance detection. Experimental results show that the proposed model achieves a precision, recall, and mean average precision of 91.4%, 87.7%, and 93.7%, respectively, on the test set. Compared with YOLOv5s, YOLOv7-tiny, and the original base network, it has the smallest memory footprint while improving mAP@0.5 by 1.1, 0.9, and 0.3 percentage points, respectively, providing a reference for potato quality inspection.
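As a rough illustration of the loss-function change mentioned in the abstract, the sketch below implements one common Inner-CIoU formulation, in which the IoU term of CIoU is replaced by the IoU of boxes scaled about their centres by a ratio factor. The box format (centre x, centre y, width, height), the function names, and the default ratio of 0.75 are illustrative assumptions and are not taken from the paper's implementation.

```python
import math

def iou(b1, b2):
    """Plain IoU for boxes given as (cx, cy, w, h)."""
    x1a, y1a = b1[0] - b1[2] / 2, b1[1] - b1[3] / 2
    x2a, y2a = b1[0] + b1[2] / 2, b1[1] + b1[3] / 2
    x1b, y1b = b2[0] - b2[2] / 2, b2[1] - b2[3] / 2
    x2b, y2b = b2[0] + b2[2] / 2, b2[1] + b2[3] / 2
    iw = max(0.0, min(x2a, x2b) - max(x1a, x1b))
    ih = max(0.0, min(y2a, y2b) - max(y1a, y1b))
    inter = iw * ih
    union = b1[2] * b1[3] + b2[2] * b2[3] - inter
    return inter / union if union > 0 else 0.0

def inner_ciou_loss(pred, target, ratio=0.75, eps=1e-7):
    """Inner-CIoU sketch: CIoU penalty terms combined with the IoU of
    boxes shrunk (or enlarged) around their centres by `ratio`."""
    # Inner boxes: same centres, scaled width/height.
    inner_pred = (pred[0], pred[1], pred[2] * ratio, pred[3] * ratio)
    inner_tgt = (target[0], target[1], target[2] * ratio, target[3] * ratio)
    inner_iou = iou(inner_pred, inner_tgt)

    # Standard CIoU terms on the original boxes.
    plain_iou = iou(pred, target)
    # Squared centre distance between the two boxes.
    rho2 = (pred[0] - target[0]) ** 2 + (pred[1] - target[1]) ** 2
    # Diagonal length (squared) of the smallest enclosing box.
    cw = max(pred[0] + pred[2] / 2, target[0] + target[2] / 2) - \
         min(pred[0] - pred[2] / 2, target[0] - target[2] / 2)
    ch = max(pred[1] + pred[3] / 2, target[1] + target[3] / 2) - \
         min(pred[1] - pred[3] / 2, target[1] - target[3] / 2)
    c2 = cw ** 2 + ch ** 2 + eps
    # Aspect-ratio consistency term of CIoU.
    v = (4 / math.pi ** 2) * (math.atan(target[2] / (target[3] + eps))
                              - math.atan(pred[2] / (pred[3] + eps))) ** 2
    alpha = v / (1 - plain_iou + v + eps)
    # The IoU term of CIoU is replaced by the inner IoU.
    return 1 - inner_iou + rho2 / c2 + alpha * v

# Example: a slightly offset prediction against a ground-truth box.
print(inner_ciou_loss((50, 50, 20, 30), (52, 49, 22, 28)))
```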

Indexed in

Clarivate Analytics Emerging Sources Citation Index
Scopus/Elsevier
Google Scholar
Crossref
ROAD