Production Method of Land Cover Data Based on GEE Cloud Platform and Data Fusion
Authors: WANG Yu, LIN Zhongyun, ZHAO Shengnan, GUO Linghui, LI Yalong, REN Lipeng
Affiliation:

Author biography:

Corresponding author:

CLC number:

Fund projects:

Key Program of the National Natural Science Foundation of China (U22A20620), General Program of the National Natural Science Foundation of China (41971274), Science and Technology Research Project of Henan Province (212102310028), and Doctoral Fund of Henan Polytechnic University (B2021-16)

Abstract:

Land cover products are fundamental data for activities such as national geographic conditions monitoring, ecosystem assessment, and territorial spatial planning. Remote sensing cloud computing platforms such as GEE, PIE, and Microsoft Planetary Computer offer rich data sources and powerful computing capacity. Using the GEE cloud platform to fuse multiple public products into training samples can significantly reduce the cost and cycle of product updates, and is therefore of considerable research value. Taking the Huaihe River Basin as the study area, the 2020 land cover products at 10 m resolution published on the GEE platform by the European Space Agency (ESA) and the Environmental Systems Research Institute (ESRI) were used as training sample sources, Sentinel-1 radar and Sentinel-2 multispectral images were selected to construct the feature space, and a random forest classifier was used to produce a 10 m resolution land cover product. Two comparative experiments were conducted to validate the method. In Experiment 1, 1116 sample points on which the public products agreed were randomly selected as training samples, and the accuracies of the resulting product and of several public products were verified by visual interpretation. The results showed an overall accuracy of 80.35% for the proposed product, an improvement of 2.89 to 8.94 percentage points over the public products, with finer depiction of local detail. Adding radar imagery to the Sentinel-2 features raised the overall accuracy by 3.52 percentage points, demonstrating the clear benefit of radar imagery as auxiliary data. In Experiment 2, eight training sample sets of different sizes were constructed, and manual interpretation, ESA, ESRI, Dynamic World (DW), and GlobeLand30 were used in turn as reference data sources to study the effect of training sample size and reference data source on the overall accuracy of the classified product. The results showed that as the training sample size increased, the gains in overall accuracy for all five reference data sources gradually diminished and leveled off. The findings indicate that, by exploiting the public land cover products and massive remote sensing imagery available on the GEE platform, high-quality training samples can be extracted rapidly to produce a higher-quality 10 m resolution land cover product, and the method has significant value for practical application.
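
A minimal Earth Engine Python API sketch of the sample-extraction step is given below. It is an illustration rather than the authors' code: the bounding box standing in for the Huaihe River Basin, the ESRI community-catalog asset ID, and the four-class crosswalk between the two product legends are all assumptions.

    import ee

    ee.Initialize()

    # Rough bounding box standing in for the Huaihe River Basin (illustrative only).
    huaihe = ee.Geometry.Rectangle([111.9, 30.9, 121.0, 36.5])

    # ESA WorldCover 2020 (10 m), an official GEE asset.
    esa = ee.ImageCollection('ESA/WorldCover/v100').first().select('Map').clip(huaihe)

    # ESRI 2020 10 m land cover; assumed here to be the community-catalog copy.
    esri = (ee.ImageCollection('projects/sat-io/open-datasets/landcover/ESRI_Global-LULC_10m')
            .filterBounds(huaihe)
            .mosaic()
            .clip(huaihe))

    # Harmonize both legends to a common scheme; only four classes are shown here
    # (1 cropland, 2 forest, 3 water, 4 built-up); the paper's full crosswalk is not given.
    esa_common = esa.remap([40, 10, 80, 50], [1, 2, 3, 4], 0)
    esri_common = esri.remap([5, 2, 1, 7], [1, 2, 3, 4], 0)

    # Keep only pixels on which the two public products agree (and are not "other").
    agreement = esa_common.eq(esri_common).And(esa_common.gt(0))
    labels = esa_common.updateMask(agreement).rename('landcover')

    # Stratified random sample of agreed pixels to serve as training points.
    samples = labels.stratifiedSample(
        numPoints=300,        # per-class count; Experiment 2 varies the sample size
        classBand='landcover',
        region=huaihe,
        scale=10,
        seed=42,
        geometries=True)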

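Continuing from the sketch above (reusing the huaihe geometry and the samples collection), the following Earth Engine Python API fragment shows how a joint Sentinel-1/Sentinel-2 feature space and a random forest classifier might be assembled. The band selection, cloud filtering, annual-median compositing, 100-tree forest, and the 70/30 hold-out accuracy check are illustrative assumptions; the paper itself validates against independently interpreted reference points.

    # Sentinel-2 surface reflectance, 2020 annual median with a simple cloud filter.
    s2 = (ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED')
          .filterBounds(huaihe)
          .filterDate('2020-01-01', '2021-01-01')
          .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))
          .median()
          .select(['B2', 'B3', 'B4', 'B8', 'B11', 'B12']))

    # Sentinel-1 GRD, IW mode, VV/VH backscatter, 2020 annual median.
    s1 = (ee.ImageCollection('COPERNICUS/S1_GRD')
          .filterBounds(huaihe)
          .filterDate('2020-01-01', '2021-01-01')
          .filter(ee.Filter.eq('instrumentMode', 'IW'))
          .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VV'))
          .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VH'))
          .select(['VV', 'VH'])
          .median())

    # Stack optical and radar bands into one feature image.
    features = s2.addBands(s1).clip(huaihe)

    # Attach feature values to the training points derived from the fused products.
    points = features.sampleRegions(collection=samples,
                                    properties=['landcover'],
                                    scale=10)

    # Split the points, train a random forest, classify, and estimate overall accuracy.
    points = points.randomColumn('rand', 42)
    train_set = points.filter(ee.Filter.lt('rand', 0.7))
    test_set = points.filter(ee.Filter.gte('rand', 0.7))

    rf = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
        features=train_set,
        classProperty='landcover',
        inputProperties=features.bandNames())

    classified = features.classify(rf)
    overall_accuracy = (test_set.classify(rf)
                        .errorMatrix('landcover', 'classification')
                        .accuracy())
    print(overall_accuracy.getInfo())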

Cite this article:

WANG Yu, LIN Zhongyun, ZHAO Shengnan, GUO Linghui, LI Yalong, REN Lipeng. Production Method of Land Cover Data Based on GEE Cloud Platform and Data Fusion[J]. Transactions of the Chinese Society for Agricultural Machinery, 2023, 54(8): 211-217.

History
  • Received: 2023-05-22
  • Revised:
  • Accepted:
  • Published online: 2023-06-17
  • Published in print: