Abstract: To address the challenge that canopy occlusion and repetitive features pose to unmanned aerial vehicle (UAV) pose estimation in orchard environments, a multi-feature fusion pose estimation method for orchard UAVs was designed by introducing visual-inertial odometry and incorporating geometric constraints from both point and line features. First, the EDLines algorithm replaced the traditional LSD algorithm for line feature extraction, optical flow enabled rapid tracking and matching of point and line features across consecutive frames, and feature positions were recovered through 3D reconstruction. Second, a tightly coupled pose estimation model was constructed to fuse inertial and visual information: within a local sliding-window framework, a jointly minimized global cost function was established, and accurate position and attitude estimates for the orchard UAV were obtained by solving this cost function with optimization methods. Finally, comparative experiments were conducted in a fruit-bearing apple orchard and a grape greenhouse, with absolute trajectory error and relative trajectory error as evaluation metrics to validate the method's effectiveness. Experimental results showed that, compared with a traditional pose estimation method using LSD-extracted line features, the proposed method reduced the average absolute trajectory error by 10% and the average relative trajectory error by 27%. The approach effectively improves the accuracy and robustness of orchard UAV navigation systems, providing reliable support for safe orchard UAV operation.
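The abstract does not spell out the jointly minimized cost function. As a hedged sketch only: in typical tightly coupled point–line visual-inertial odometry systems built on sliding-window optimization (e.g. VINS-style formulations), the objective commonly takes a form such as the following, where the specific residual definitions and weightings used in this paper may differ:

```latex
% Sketch of a typical sliding-window point-line VIO cost function
% (assumed standard form, not taken from this paper).
% \mathcal{X}: states in the window (poses, velocities, IMU biases, feature depths)
% r_p, H_p: marginalization prior residual and its Jacobian
% r_B: IMU preintegration residual between consecutive body frames b_k, b_{k+1}
% r_P, r_L: point reprojection and line reprojection residuals
% \rho(\cdot): robust loss (e.g. Huber) to suppress outlier matches
\min_{\mathcal{X}} \Bigg\{
  \left\| r_p - H_p \mathcal{X} \right\|^2
  + \sum_{k \in \mathcal{B}}
      \left\| r_{\mathcal{B}}\big(\hat{z}_{b_k b_{k+1}}, \mathcal{X}\big) \right\|^2
  + \sum_{(i,j) \in \mathcal{P}}
      \rho\!\left( \left\| r_{\mathcal{P}}\big(\hat{z}_{p_i}^{c_j}, \mathcal{X}\big) \right\|^2 \right)
  + \sum_{(i,j) \in \mathcal{L}}
      \rho\!\left( \left\| r_{\mathcal{L}}\big(\hat{z}_{l_i}^{c_j}, \mathcal{X}\big) \right\|^2 \right)
\Bigg\}
```

Summing IMU, point, and line residuals into one objective is what makes the coupling "tight": all sensor constraints are optimized jointly over the window rather than fused after separate estimation.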