Behavior Recognition Model of Stacked-cage Layers Based on Knowledge Distillation
Abstract:

Animal behavior is closely related to animal welfare and health, and identifying and detecting animal behavior is an important means of evaluating welfare and health status. To achieve recognition and detection of layer behavior in the stacked-cage system, a knowledge distillation behavior recognition model based on an ensemble of multiple teacher models was constructed, in which several teacher models jointly guided the training of the student network. The knowledge distillation method was applied to the feature extraction network of a layer behavior recognition model built on the Faster R-CNN framework. Taking ResNeXt, Res2Net and HRNet as teacher networks and ResNet-34 as the student network, the layer behavior recognition model was trained through knowledge distillation. The experimental results showed that the performance of the layer behavior recognition model was significantly improved. Compared with the recognition model without knowledge distillation, the accuracy, average precision and recall of the model were improved from 93.6%, 78.7% and 86.2% to 96.6%, 89.9% and 94.6%, respectively. After knowledge distillation, the student model essentially reached the performance level of the teacher model, while its number of parameters and computation were 32% and 33% lower than those of the teacher model, and its inference time was reduced by 66%. The proposed knowledge distillation approach obtained a high-precision recognition model with a relatively simple network structure, making it feasible to deploy the layer behavior recognition model on edge embedded devices.
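
The abstract does not specify the exact form of the distillation loss or how the multiple teachers are combined. The following Python (PyTorch) sketch shows one plausible way to distill backbone feature maps from an ensemble of teachers into a single student: each teacher's features are projected to the student's channel width with a 1x1 adapter, fused by simple averaging, and compared with the student features via an MSE term added to the Faster R-CNN detection loss. The averaging scheme, the adapter design, and all names are illustrative assumptions, not the paper's implementation.

# Minimal sketch of multi-teacher feature distillation for a detection backbone.
# The fusion-by-averaging and MSE terms below are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTeacherFeatureKD(nn.Module):
    """Distills backbone feature maps from several frozen teachers into one student."""

    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # 1x1 adapters project each teacher's feature map to the student's channel width
        self.adapters = nn.ModuleList(
            nn.Conv2d(c, student_channels, kernel_size=1) for c in teacher_channels
        )

    def forward(self, student_feat, teacher_feats):
        aligned = []
        for adapter, feat in zip(self.adapters, teacher_feats):
            feat = adapter(feat)
            # Resize spatially in case teacher and student output strides differ
            feat = F.interpolate(feat, size=student_feat.shape[-2:],
                                 mode="bilinear", align_corners=False)
            aligned.append(feat)
        # Ensemble the teachers by simple averaging (an assumption, not from the paper)
        ensemble = torch.stack(aligned, dim=0).mean(dim=0)
        return F.mse_loss(student_feat, ensemble)

# Usage sketch: total loss = Faster R-CNN detection loss + weighted distillation loss.
# kd = MultiTeacherFeatureKD(student_channels=512, teacher_channels=[2048, 2048, 480])
# loss = det_loss + 0.5 * kd(student_backbone_feat, [t1_feat, t2_feat, t3_feat])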

History
  • Received: May 16, 2021
  • Online: July 20, 2021