Extraction and Area Calculation of Chicken Comb and Wattle Based on YOLO v7 and Optimized U-Net Network
Funding: National Natural Science Foundation of China (32172779), China Agriculture Research System of the Ministry of Finance and the Ministry of Agriculture and Rural Affairs (CARS-40), Special Fund for the Construction of the Hebei Modern Agricultural Industry Technology System (HBCT2023210201), and Baoding Science and Technology Plan Project (2211N014)


Abstract:

Traditional manual measurement of the comb and wattle areas of laying hens carries contact-induced stress risks, zoonotic disease hazards, and substantial measurement error. To address these problems, an automatic comb and wattle segmentation and area-calculation method based on YOLO v7 and an improved U-Net was proposed. A two-stage detection framework was built: YOLO v7 performed head-pose screening and ROI extraction, effectively eliminating interference from non-frontal images. A CoT-UNet model incorporating the Contextual Transformer was proposed: CoT blocks were embedded in the U-Net encoder to fuse dynamic and static contextual features, and a DyC-UP upsampling module with dynamically adjustable convolution kernels was constructed to strengthen the extraction of irregular edge features, markedly improving segmentation across different comb shapes. A pixel-to-area conversion algorithm was established, using a calibration coefficient to map precisely from image space to physical space. Experimental results showed that, compared with the baseline model, the improved CoT-UNet increased IoU by 4.77 and 8.75 percentage points and precision by 5.31 and 5.06 percentage points for comb and wattle segmentation, respectively. In area calculation, the absolute errors for comb area (0.62~3.50 cm²) and wattle area (0.10~2.93 cm²) were clearly superior to traditional manual measurement (3.58~7.27 cm²). Multi-scenario validation across three postures, two shooting angles, and two shooting distances yielded relative errors of 2.41%~13.62% for comb area and 1.00%~29.21% for wattle area. The method enables non-contact, stress-free measurement of poultry biometric traits and provides reliable technical support for intelligent breeder selection.
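To make the two-stage framework concrete, the following is a minimal Python sketch under stated assumptions: `detect_head_roi` stands in for a YOLO v7 detector that returns a frontal-view head bounding box (or None for off-angle frames), and `segment_comb_wattle` stands in for the CoT-UNet segmenter returning binary comb and wattle masks. Both wrappers and the calibration value are hypothetical placeholders, not the authors' implementation.

```python
from typing import Optional, Tuple
import numpy as np

# Hypothetical model wrappers standing in for YOLO v7 and CoT-UNet.
def detect_head_roi(image: np.ndarray) -> Optional[Tuple[int, int, int, int]]:
    """Return (x, y, w, h) of a frontal-view chicken head, or None if the
    pose check fails (off-angle images are discarded)."""
    raise NotImplementedError

def segment_comb_wattle(roi: np.ndarray) -> Tuple[np.ndarray, np.ndarray]:
    """Return binary masks (comb_mask, wattle_mask) for the cropped head ROI."""
    raise NotImplementedError

def measure_areas(image: np.ndarray, px_per_cm: float):
    """Two-stage pipeline: detect the head ROI, segment comb/wattle,
    then convert mask pixel counts to cm^2 with a calibration coefficient."""
    box = detect_head_roi(image)
    if box is None:          # non-frontal pose: skip this frame
        return None
    x, y, w, h = box
    roi = image[y:y + h, x:x + w]
    comb_mask, wattle_mask = segment_comb_wattle(roi)
    to_cm2 = lambda m: float(np.count_nonzero(m)) / (px_per_cm ** 2)
    return {"comb_cm2": to_cm2(comb_mask), "wattle_cm2": to_cm2(wattle_mask)}
```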
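For the pixel-to-area conversion step, a minimal sketch is given below, assuming a reference marker of known physical area is segmented in the same image to derive the calibration coefficient; function and variable names are illustrative, not the paper's code.

```python
import numpy as np

def pixels_per_cm(marker_mask: np.ndarray, marker_area_cm2: float) -> float:
    """Derive a linear calibration coefficient (pixels per cm) from a
    reference marker of known physical area."""
    marker_pixels = int(np.count_nonzero(marker_mask))
    # pixels per cm^2 -> take the square root to get pixels per cm
    return float(np.sqrt(marker_pixels / marker_area_cm2))

def mask_area_cm2(mask: np.ndarray, px_per_cm: float) -> float:
    """Convert a binary segmentation mask (comb or wattle) to area in cm^2."""
    return float(np.count_nonzero(mask)) / (px_per_cm ** 2)

def relative_error(measured_cm2: float, reference_cm2: float) -> float:
    """Relative error (%) against a reference (e.g. manual) measurement."""
    return abs(measured_cm2 - reference_cm2) / reference_cm2 * 100.0
```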
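The segmentation gains reported above are expressed in IoU and precision; a minimal sketch of how these metrics are commonly computed for binary masks (illustrative only, not the paper's evaluation code) follows.

```python
import numpy as np

def iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """Intersection over Union between binary prediction and ground-truth masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return float(inter) / float(union) if union else 0.0

def precision(pred: np.ndarray, gt: np.ndarray) -> float:
    """Precision: fraction of predicted foreground pixels that are correct."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    predicted = pred.sum()
    return float(tp) / float(predicted) if predicted else 0.0
```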

Cite this article:

YANG Duanli, SHEN Hongshuo, CHEN Hui, GAO Yuan. Extraction and Area Calculation of Chicken Comb and Wattle Based on YOLO v7 and Optimized U-Net Network[J]. Transactions of the Chinese Society for Agricultural Machinery, 2025, 56(4): 415-426.

History
  • Received: 2024-03-04
  • Published online: 2025-04-10