Abstract: Abnormal plant removal is a critical step in ensuring seed purity during hybrid rice seed production. To prevent abnormal plants from producing abnormal pollen that could compromise hybrid vigor, current removal operations require repeated manual effort, consuming significant time and labor. Automating the identification of abnormal plants in the field is therefore fundamental to achieving mechanized, automated removal. To enable automated and precise detection of abnormal plants in hybrid rice seed production, UAV aerial images of seed production fields containing abnormal plants were collected, and high-quality, distortion-free images were obtained through center cropping. The abnormal plants in the images were annotated, and data augmentation was performed through geometric and color transformations to build a dataset of abnormal plants in hybrid rice seed production fields. To address the high similarity between abnormal and normal plants in this dataset, a novel abnormal plant detection network, T-CenterNet2, was proposed. This model enhanced the CenterNet2 network by incorporating a texture-aware module within the feature pyramid network, which reorganized channel information to extract texture features from the feature maps, thereby increasing the feature distinction between abnormal plants and the background. Additionally, a combination of loss functions was designed, including a texture loss that measured the difference between the extracted texture features and the label ground truth to supervise the texture-aware module. DIoU was introduced as the bounding-box loss to improve the accuracy of target center-point predictions, in line with the practical requirements of abnormal plant removal operations.
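The DIoU term mentioned above augments plain IoU with a normalized center-distance penalty, which is why it suits a task where the predicted center point guides the removal operation. The following is a minimal plain-Python sketch of that loss for a single box pair; the function name and box convention are illustrative and not taken from the paper's implementation, which operates inside the CenterNet2 training pipeline.

```python
def diou_loss(box_a, box_b):
    """DIoU loss = 1 - (IoU - d^2 / c^2), where d is the distance between
    the two box centers and c is the diagonal of the smallest enclosing box.
    Boxes are (x1, y1, x2, y2) tuples with x1 < x2 and y1 < y2."""
    # Intersection area
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    # Union area and IoU
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    iou = inter / (area_a + area_b - inter)

    # Squared distance between box centers: the term that directly
    # penalizes center-point error, even when the boxes do not overlap
    dx = (box_a[0] + box_a[2]) / 2 - (box_b[0] + box_b[2]) / 2
    dy = (box_a[1] + box_a[3]) / 2 - (box_b[1] + box_b[3]) / 2
    d2 = dx * dx + dy * dy

    # Squared diagonal of the smallest box enclosing both inputs
    ex1, ey1 = min(box_a[0], box_b[0]), min(box_a[1], box_b[1])
    ex2, ey2 = max(box_a[2], box_b[2]), max(box_a[3], box_b[3])
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2

    return 1.0 - (iou - d2 / c2)
```

Perfectly overlapping boxes give a loss of 0, while disjoint boxes give a loss above 1; unlike a pure IoU loss, the center-distance term still supplies a gradient when the boxes do not overlap at all.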
The effects of different loss function combinations on model convergence speed and detection accuracy were compared, with the combination of weighted texture loss and DIoU yielding the best results, demonstrating the effectiveness of the redesigned loss function for abnormal plant detection. Using mAP and recall as evaluation metrics, the improved model was compared with the original CenterNet2 model and four typical models: Faster R-CNN, FCOS, YOLOX, and DETR. Experimental results showed that the improved T-CenterNet2 model achieved an mAP of 86.4%, an increase of 11 percentage points over the original model, and a recall of 82.5%, an increase of 11.6 percentage points over the original model. The highest mAP and recall among the typical models were only 73.1% and 66.2%, respectively. The enhanced model exhibited high detection accuracy and robustness, effectively achieving reliable abnormal plant detection.