Crop Identification of Drone Remote Sensing Based on Convolutional Neural Network
Author:
Affiliation:

Author biography:

Corresponding author:

CLC number:

Fund project: National Key Research and Development Program of China (2017YFB0504203) and the Xinjiang Production and Construction Corps Spatial Information Innovation Team Project (2016AB001)


    Abstract:

    To address the low accuracy of fine crop classification based on long time-series satellite imagery combined with phenological features, deep learning was applied to UAV remote sensing crop identification, and a fine crop classification method based on a convolutional neural network (CNN) was proposed. The CNN was used to extract crop features from high-resolution remote sensing imagery, and the network structure was further optimized by adjusting the network parameters and the spectral combination of the samples, yielding a crop identification model. The results showed that the CNN could effectively extract crop information from the imagery and achieve fine crop classification. Apart from a few misclassifications at plot edges, where crops are sparsely planted and mixed, good classification results were obtained in all other areas. After training and optimization, the model reached an overall classification accuracy of 97.75% for the three crops, outperforming classification algorithms such as SVM and BP neural networks.
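    For illustration only (the paper publishes no code), the band-combination samples mentioned above could be assembled along the following lines; the patch size, band order and normalization are assumptions, not details from the paper.

    # Hypothetical construction of single-, dual- and triple-band samples
    # from a UAV orthomosaic loaded as an (H, W, 3) array in [R, G, B] order.
    import numpy as np

    def extract_patches(image, centers, bands, patch=32):
        """Cut square samples around labeled pixel centers, keeping only
        the requested visible bands, e.g. [2] for blue, [0, 2] for
        red+blue, [0, 1, 2] for all three bands."""
        half = patch // 2
        samples = []
        for row, col in centers:
            window = image[row - half:row + half, col - half:col + half, bands]
            if window.shape[:2] == (patch, patch):
                samples.append(window.astype(np.float32) / 255.0)  # scale to [0, 1]
        return np.stack(samples)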

    Extended Abstract:

    Crop identification and classification, as a fundamental part of modern agriculture, is an important prerequisite for ensuring regional food security and sustainable agricultural development. With the development of remote sensing technology, high-resolution visible-light remote sensing imagery has become a convenient and reliable source of remote sensing data, and using such data for fine classification of typical crops has great practical significance. However, these images lack sufficient spectral information and therefore cannot support high-accuracy crop recognition on their own, so deep learning techniques were applied to crop classification to address this problem. Remote sensing data covering more than 867 hm² of cotton, corn and squash were acquired with a composite-wing unmanned aerial vehicle equipped with a remote sensing imaging device. According to the characteristics of the data, a convolutional neural network (CNN) was designed to extract crop classification information. By adjusting the network parameters and the spectral combination of the samples, parameter optimization during training was carried out at two levels: at the network-parameter level, the learning rate, batch size, convolution kernel scale and network depth were adjusted; at the spectral-feature level, three types of samples (single-band, dual-band and triple-band) were constructed as inputs, and the model was trained on each in turn. The experimental results showed that the CNN could effectively extract crop information from the imagery and realize fine classification of crops. Most areas were classified well, except for plot edges planted with sparse and mixed crops. The optimized CNN achieved an overall classification accuracy of 97.75%. Compared with a support vector machine based on a radial basis kernel function and a back-propagation neural network, the optimized CNN had the best effect and the highest classification accuracy. It is worth noting that the adjustment of network parameters affected the final training result: because the typical crops in the remote sensing images are densely planted, with small features and rich textures, a CNN with a large learning rate (0.1), a small convolution kernel (7×7) and an appropriate depth (7) was advised. This ensured that small features in the samples were not missed while the training accuracy converged faster. Samples with different spectral features also affected model training: the blue band of the visible spectrum was more suitable than the green and red bands for training the CNN for crop recognition, and combining the three bands improved recognition accuracy and stability, although more training time was required because more input information was given. The experiments demonstrated the effectiveness and reliability of the proposed CNN for fine crop classification, and the method can serve as a reference for applying deep learning in agricultural remote sensing.
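    As a minimal sketch only (the paper does not provide code), a Keras model consistent with the settings reported above: SGD with learning rate 0.1, 7×7 convolution kernels, a depth of about seven layers, and an input channel count matching the chosen band combination. The 32×32 patch size and the filter counts are assumptions.

    # Hypothetical CNN for the three-crop classification task; only the
    # learning rate (0.1), kernel size (7x7) and approximate depth (~7)
    # come from the abstract, the rest are illustrative assumptions.
    from tensorflow.keras import layers, models, optimizers

    def build_crop_cnn(n_bands=3, n_classes=3, patch=32):
        model = models.Sequential([
            layers.Input(shape=(patch, patch, n_bands)),  # 1, 2 or 3 visible bands
            layers.Conv2D(32, (7, 7), padding="same", activation="relu"),
            layers.MaxPooling2D((2, 2)),
            layers.Conv2D(64, (7, 7), padding="same", activation="relu"),
            layers.MaxPooling2D((2, 2)),
            layers.Conv2D(128, (7, 7), padding="same", activation="relu"),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dense(n_classes, activation="softmax"),  # cotton, corn, squash
        ])
        model.compile(optimizer=optimizers.SGD(learning_rate=0.1),
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    Varying n_bands between 1, 2 and 3 reproduces the single-, dual- and triple-band input experiments described in the abstract.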

Cite this article

Wang Chuanjian, Zhao Qingzhan, Ma Yongjian, Ren Yuanyuan. Crop identification of drone remote sensing based on convolutional neural network[J]. Transactions of the Chinese Society for Agricultural Machinery, 2019, 50(11): 161-168.

History
  • Received: 2019-05-14
  • Revised:
  • Accepted:
  • Published online: 2019-11-10
  • Publication date: