Extraction of Crop Height and Cut-edge Information Based on Binocular Vision
Abstract:

Crop height and the cut edge are important inputs to an unmanned rice-wheat combine harvester: the reel height is adjusted according to the crop height, and the cut edge provides navigation information. Field crop height and cut-edge information were therefore extracted based on binocular vision. 3D point-cloud data and an RGB image were acquired by a binocular camera. The 3D data of the flat ground were filtered by voxel-grid and pass-through filters, and the filtered data were fitted to an initial plane by the RANSAC algorithm. The real-time plane during harvesting operation was calculated from the posture changes of the harvester reported by IMU data, and the 3D data were converted into actual heights according to the point-to-plane distance. An improved method combining density peak clustering and K-means clustering was proposed to classify the height data. In parallel, the RGB image was normalized and then segmented by the Otsu algorithm to extract the upper crop region. The intersection of the cluster with the largest cluster-center value and the upper crop region was obtained, and the mean of the height data in this common region gave the crop height. Cut-edge points were extracted from the cross-correlation between the height data series and a model function, and were fitted to a cut-edge line by the least-squares method. From the current boundary line, the candidate range of cut-edge points in the next frame was predicted. The heading deviation and lateral deviation were calculated from the cut-edge line. Experiments showed that this method could effectively extract crop height and cut-edge information: the mean absolute error of height was 0.043 m and the boundary recognition rate was 93.30% in complex harvest scenes including sparse crops, missed cutting and ruts. The average angle error of heading deviation was 1.04°, and the average absolute error of lateral deviation was 0.084 m. The method therefore has application value for the unmanned self-adaptive control of combine harvesters.
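The ground-plane fitting and point-to-plane height conversion described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the iteration count and inlier threshold are assumed values, and the IMU-based plane update is omitted.

```python
import numpy as np

def fit_plane_ransac(points, n_iters=200, dist_thresh=0.02, rng=None):
    """Fit a plane n.p + d = 0 to an (N, 3) point cloud with RANSAC."""
    rng = np.random.default_rng(rng)
    best_inliers, best_model = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        # Plane normal from the three sampled points.
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:            # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        dist = np.abs(points @ normal + d)   # point-to-plane distance
        inliers = dist < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model, best_inliers

def point_heights(points, plane):
    """Distance from each point to the plane, i.e. the crop height."""
    normal, d = plane
    return np.abs(points @ normal + d)
```

Once the initial ground plane is found, the same point-to-plane distance gives the height of every crop point in later frames (after the plane is re-posed with the IMU).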
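The Otsu step used to segment the normalized RGB image into upper-crop and background regions maximizes the between-class variance of the gray-level histogram. A minimal sketch on an 8-bit single channel (the normalization that produces this channel is not shown):

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu threshold of an 8-bit image: the gray level that
    maximises the between-class variance of the histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability
    mu = np.cumsum(p * np.arange(256))      # cumulative mean
    mu_t = mu[-1]                           # global mean
    # Between-class variance for every candidate threshold.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)
    return int(np.argmax(sigma_b))
```

Pixels above the returned threshold form the binary crop mask that is then intersected with the largest-center height cluster.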
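The final step, deriving heading and lateral deviation from the least-squares cut-edge line, can be sketched as below. The coordinate convention (x lateral, y forward, origin at the harvester, in metres) is an assumption for illustration, not taken from the paper.

```python
import numpy as np

def cut_edge_deviations(edge_points):
    """Fit cut-edge points to the line x = k*y + b by least squares,
    then return (heading deviation in degrees, lateral deviation in
    metres). Heading deviation is the angle between the fitted line
    and the forward axis; lateral deviation is the line's signed
    offset at the harvester position y = 0."""
    x, y = edge_points[:, 0], edge_points[:, 1]
    k, b = np.polyfit(y, x, 1)               # least-squares line fit
    heading_dev = np.degrees(np.arctan(k))   # 0 deg = edge parallel
    lateral_dev = b
    return heading_dev, lateral_dev
```

Both deviations would then feed the steering controller, and the fitted line bounds the candidate search region for cut-edge points in the next frame.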

History
  • Received: March 24, 2021
  • Online: March 10, 2022