Behavior Recognition Model of Stacked-cage Layers Based on Knowledge Distillation
Author:
Affiliation:

Author biography:

Corresponding author:

CLC number:

Fund project:

National Key Research and Development Program of China (2017YFE0122200)




Abstract:

Animal behavior is closely related to animal welfare and health, and recognizing and detecting animal behavior is an important means of evaluating welfare and health status. To achieve recognition and detection of layer behavior in stacked-cage systems, a knowledge distillation behavior recognition model based on the fusion of multiple teacher models was constructed, in which several teacher models jointly guided the training of a student network. Knowledge distillation was applied to the feature extraction network of a layer behavior recognition model built on the Faster R-CNN framework: the structurally complex ResNeXt, Res2Net and HRNet networks served as teacher networks, the simpler ResNet 34 network served as the student network, and the behavior recognition model was trained through knowledge distillation. Experimental results showed that after knowledge distillation of the feature extraction network, the performance of the behavior recognition model was significantly improved. Compared with the model whose feature extraction network had not undergone knowledge distillation, the accuracy, average precision and recall of the model improved from 93.6%, 78.7% and 86.2% to 96.6%, 89.9% and 94.6%, respectively. After knowledge distillation, the student model essentially reached the performance level of the teacher models, while its parameter count and computational cost were 32% and 33% lower than those of the teacher models, and its inference time was reduced by 66%. The proposed knowledge distillation model achieves a high-precision recognition model with a relatively simple network structure, making it feasible to deploy the layer behavior recognition model on small embedded edge devices.
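The multi-teacher distillation described in the abstract can be sketched as a loss in which the temperature-softened output distributions of several teachers are averaged into one soft target that supervises the student. The sketch below is illustrative only: the temperature value (T = 4), the simple uniform averaging of teachers, and the function names are assumptions, since the abstract does not specify how the teacher outputs are fused.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a list of raw scores.
    m = max(x / T for x in logits)
    exps = [math.exp(x / T - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def multi_teacher_distillation_loss(student_logits, teacher_logits_list, T=4.0):
    """KL(avg_teacher || student) at temperature T, scaled by T^2.

    The averaged soft targets of several teachers supervise the student,
    mirroring the multi-teacher fusion idea in the paper (fusion scheme
    assumed here to be a uniform average).
    """
    student_p = softmax(student_logits, T)
    # Average the temperature-softened teacher distributions.
    n = len(teacher_logits_list)
    teacher_p = [0.0] * len(student_logits)
    for logits in teacher_logits_list:
        for i, p_i in enumerate(softmax(logits, T)):
            teacher_p[i] += p_i / n
    # KL divergence; the T^2 factor keeps gradient magnitudes comparable
    # to a hard-label loss, as is conventional in distillation.
    kl = sum(t * math.log(t / s) for t, s in zip(teacher_p, student_p) if t > 0)
    return T * T * kl
```

In a full training loop this term would be weighted against the ordinary detection loss of the Faster R-CNN head; only the feature-extraction backbone (ResNet 34 as the student) receives the distilled supervision.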

Cite this article:

FANG Peng, HAO Hongyun, WANG Hongying. Behavior Recognition Model of Stacked-cage Layers Based on Knowledge Distillation[J]. Transactions of the Chinese Society for Agricultural Machinery, 2021, 52(10): 300-306.

History
  • Received: 2021-05-16
  • Published online: 2021-07-20