Topic

Technologies and technical equipment for agriculture and food industry

Volume

Volume 67 / No. 2 / 2022

Pages: 364-373

RECOGNITION METHOD FOR SEED POTATO BUDS BASED ON IMPROVED YOLOV3-TINY

DOI: https://doi.org/10.35633/inmateh-67-37

Authors

(*) Wanzhi ZHANG

Shandong Agricultural University, College of Mechanical and Electronic Engineering

Yuelin HAN

Shandong Agricultural University, College of Mechanical and Electronic Engineering

Chen HUANG

Shandong Agricultural University, College of Mechanical and Electronic Engineering

Zhiwei CHEN

Shandong Agricultural University, College of Mechanical and Electronic Engineering

(*) Corresponding author:

Wanzhi ZHANG: [email protected]

Abstract

This paper proposes a seed potato bud recognition method based on an improved YOLOv3-tiny. IoU-based K-means clustering is used to obtain anchor boxes that match the size of the buds. Mosaic online data augmentation is applied to increase image diversity and the model's generalization ability. The CIoU bounding-box regression loss function is introduced to improve the regression performance of bud recognition. The results show that the precision (P), recall (R), average precision (AP), and F1 score of the model for seed potato bud recognition are 88.33%, 85.97%, 91.18%, and 87.13%, respectively. Real-time recognition of seed potato buds on the embedded NVIDIA Jetson Nano platform reaches 40 FPS. The proposed method meets the needs of real-time seed potato bud recognition on embedded platforms.
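
The abstract names the anchor-selection step only briefly; as a minimal sketch, the IoU-based K-means clustering it describes (the standard YOLO anchor-selection recipe) could look like the following. The `iou_wh` and `kmeans_iou` names, the NumPy implementation, and the choice of six anchors for YOLOv3-tiny's two detection scales are illustrative assumptions, not the authors' code.

```python
import numpy as np

def iou_wh(boxes, centroids):
    """IoU between (w, h) pairs, assuming all boxes share a common top-left corner."""
    inter_w = np.minimum(boxes[:, None, 0], centroids[None, :, 0])
    inter_h = np.minimum(boxes[:, None, 1], centroids[None, :, 1])
    inter = inter_w * inter_h                      # (N, K) intersection areas
    areas_b = (boxes[:, 0] * boxes[:, 1])[:, None]
    areas_c = (centroids[:, 0] * centroids[:, 1])[None, :]
    return inter / (areas_b + areas_c - inter)

def kmeans_iou(boxes, k=6, iters=100, seed=0):
    """Cluster (w, h) box sizes with distance = 1 - IoU instead of Euclidean distance."""
    rng = np.random.default_rng(seed)
    centroids = boxes[rng.choice(len(boxes), size=k, replace=False)]
    for _ in range(iters):
        assign = np.argmax(iou_wh(boxes, centroids), axis=1)   # nearest = highest IoU
        new = np.array([boxes[assign == j].mean(axis=0) if np.any(assign == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids[np.argsort(centroids.prod(axis=1))]       # sorted by area

# wh = np.loadtxt("bud_boxes_wh.txt")   # hypothetical (N, 2) ground-truth bud sizes
# anchors = kmeans_iou(wh, k=6)         # 3 anchors per YOLOv3-tiny detection scale
```

Clustering on 1 - IoU rather than Euclidean distance in (w, h) keeps large boxes from dominating the objective, which is why the YOLO family uses it for anchor selection.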

Abstract in Chinese (translated)

This paper proposes a method for detecting the bud eyes of seed potatoes based on the YOLOv3-tiny network. Taking YOLOv3-tiny as the base network, an IoU-based K-means clustering method is used to obtain anchor boxes that match the bud eye sizes, reducing the error introduced by the priors; Mosaic online data augmentation is used to increase image diversity and the model's generalization ability; and the CIoU bounding-box regression loss function is introduced to improve the regression performance of bud eye detection. The results show that the precision (P), recall (R), average precision (AP), and F1 score of the model for seed potato bud eye detection are 88.33%, 85.97%, 91.18%, and 87.13%, respectively, and the real-time detection speed of seed potato bud eyes on the embedded NVIDIA Jetson Nano platform reaches 40 FPS. The proposed method meets the requirements of real-time seed potato bud eye detection on embedded platforms and can provide technical support for subsequent bud eye detection in automated seed potato cutting.
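
Likewise, a minimal PyTorch sketch of the CIoU bounding-box regression loss cited in both abstracts, assuming the standard formulation (IoU penalized by the normalized distance between box centers plus an aspect-ratio consistency term). The `ciou_loss` name, the (x1, y1, x2, y2) corner layout, and the `eps` guard are our assumptions rather than the paper's code.

```python
import math
import torch

def ciou_loss(pred, target, eps=1e-7):
    """CIoU loss for (N, 4) boxes given as (x1, y1, x2, y2) corner coordinates."""
    # Plain IoU from the intersection and union areas
    inter_w = (torch.min(pred[:, 2], target[:, 2]) - torch.max(pred[:, 0], target[:, 0])).clamp(min=0)
    inter_h = (torch.min(pred[:, 3], target[:, 3]) - torch.max(pred[:, 1], target[:, 1])).clamp(min=0)
    inter = inter_w * inter_h
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    iou = inter / (area_p + area_t - inter + eps)

    # Squared center distance over the squared diagonal of the smallest enclosing box
    rho2 = ((pred[:, 0] + pred[:, 2] - target[:, 0] - target[:, 2]) ** 2
            + (pred[:, 1] + pred[:, 3] - target[:, 1] - target[:, 3]) ** 2) / 4
    cw = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0])
    ch = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1])
    c2 = cw ** 2 + ch ** 2 + eps

    # Aspect-ratio consistency term, weighted so it matters more as IoU improves
    w_p = pred[:, 2] - pred[:, 0]
    h_p = (pred[:, 3] - pred[:, 1]).clamp(min=eps)
    w_t = target[:, 2] - target[:, 0]
    h_t = (target[:, 3] - target[:, 1]).clamp(min=eps)
    v = (4 / math.pi ** 2) * (torch.atan(w_t / h_t) - torch.atan(w_p / h_p)) ** 2
    with torch.no_grad():
        alpha = v / (1 - iou + v + eps)

    return (1 - iou + rho2 / c2 + alpha * v).mean()
```

Compared with a plain IoU loss, the extra penalty terms provide a useful gradient even when the predicted and ground-truth boxes do not overlap, which is generally why CIoU improves box regression for small targets such as bud eyes.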

Indexed in

Clarivate Analytics Emerging Sources Citation Index
Scopus/Elsevier
Google Scholar
Crossref
ROAD