Abstract: To address the low accuracy and poor adaptability of navigation line extraction algorithms in complex agricultural environments, particularly maize-soybean intercropping scenarios, an improved YOLO v8-based method for extracting crop row navigation lines was proposed to enhance the navigation precision of autonomous mobile platforms during field operations. For the specialized task of segmenting maize and soybean crop rows, a StarNet-YOLO backbone network was constructed by integrating the StarNet network with YOLO v8 and optimizing the detection head. The network was further enhanced with a custom-designed ASPPFE module, depthwise separable convolution, and CSE structure optimization, and was made lightweight using the LAMP pruning algorithm. In addition, the Douglas-Peucker algorithm was introduced to approximate crop row contours, and a scoring mechanism was developed to determine the midpoints of the contour start and end segments, enabling precise fitting of crop row navigation lines. Ablation experiments showed that ASPPFE achieved a mean average precision for instance segmentation at 0.5 IoU (mAP50seg) of 99.5%, and its mAP over IoU thresholds 0.5-0.95 (mAP50-95seg) exceeded that of SPPELAN, SPPF, and ASPPF by 1.0, 1.0, and 0.4 percentage points, respectively. After 25% pruning, the StarNet-YOLO network's mAP50-95seg decreased by only 0.02 percentage points, while inference speed increased from 390 f/s to 563 f/s and floating-point operations were reduced from 7.2×10⁹ to 4.7×10⁹. Comparative tests on the same dataset showed that the mAP50-95seg of StarNet-YOLO exceeded that of YOLO v5, YOLO v7, and the baseline YOLO v8 by 5.5, 4.8, and 2.8 percentage points, respectively. Validation of the crop row navigation line fitting yielded an average angular error of 2.01° and an average distance error of 23.17 pixels. The proposed navigation line extraction algorithm performed well in complex agricultural environments, balancing detection speed and accuracy, and provides a technical approach for the visual navigation of autonomous robots operating in maize, soybean, and other crop fields.
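
The navigation-line fitting step summarized above (Douglas-Peucker contour approximation followed by midpoint selection on the contour's start and end segments) can be illustrated with a minimal sketch. This is not the paper's implementation: the epsilon ratio, the use of OpenCV's cv2.approxPolyDP, and the simple "topmost/bottommost edge midpoint" rule standing in for the paper's scoring mechanism are all assumptions made for illustration.

```python
# Hedged sketch: approximate a crop-row mask contour with Douglas-Peucker,
# take the midpoints of the top and bottom polygon edges, and use them as the
# two endpoints of the navigation line. Epsilon and the edge-selection rule
# are illustrative assumptions, not the paper's exact scoring mechanism.
import cv2
import numpy as np


def fit_navigation_line(row_mask: np.ndarray, epsilon_ratio: float = 0.01):
    """Fit a navigation line from a binary crop-row mask (uint8, values 0/255)."""
    contours, _ = cv2.findContours(row_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)  # largest crop-row region

    # Douglas-Peucker polygonal approximation of the contour.
    epsilon = epsilon_ratio * cv2.arcLength(contour, True)
    poly = cv2.approxPolyDP(contour, epsilon, True).reshape(-1, 2)

    # Midpoints of all polygon edges; here the topmost midpoint is treated as
    # the "start" point and the bottommost as the "end" point.
    edges = list(zip(poly, np.roll(poly, -1, axis=0)))
    midpoints = np.array([(p + q) / 2.0 for p, q in edges])
    start_pt = midpoints[np.argmin(midpoints[:, 1])]  # smallest y = top of image
    end_pt = midpoints[np.argmax(midpoints[:, 1])]    # largest y = bottom of image
    return start_pt, end_pt


if __name__ == "__main__":
    # Toy example: a slanted crop-row strip on a 480x640 mask.
    mask = np.zeros((480, 640), dtype=np.uint8)
    strip = np.array([[300, 0], [340, 0], [380, 479], [340, 479]], dtype=np.int32)
    cv2.fillPoly(mask, [strip], 255)
    print("navigation line endpoints:", fit_navigation_line(mask))
```

In practice the binary mask would come from the StarNet-YOLO instance segmentation output for each detected crop row, and the two returned endpoints define the fitted navigation line for that row.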