Article

SNLRUX++ for Building Extraction From High-Resolution Remote Sensing Images

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JSTARS.2021.3135705

Keywords

Remote sensing; Feature extraction; Semantics; Image segmentation; Buildings; Task analysis; Deep learning; Building extraction; convolution neural network; deep learning; high-resolution image; remote sensing

Funding

  1. National Natural Science Foundation of China [61906168, 61902351]
  2. Natural Science Foundation of Zhejiang Province [LY21F020023]


Building extraction plays an important role in high-resolution remote sensing image processing and serves as a basis for urban planning and demographic analysis. In recent years, many powerful general-purpose semantic segmentation models have emerged, but they often perform poorly when transferred to remote sensing images because of the distinctive characteristics of such imagery. To this end, we propose a new deep learning network, Selective Nonlocal ResUNeXt++ (SNLRUX++), for building extraction. First, a cascaded multiscale feature fusion scheme is proposed to transform the high-performance image classification network ResNeXt into the segmentation network ResUNeXt++. Second, a selective nonlocal operation is designed to establish long-range dependencies while avoiding excessive noise and computational cost. Finally, multiscale prediction is applied as deep supervision to accelerate training and convergence and to improve prediction performance on objects at different scales. Experimental results on two different remote sensing image datasets demonstrate the effectiveness and generalization ability of the proposed method.
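The nonlocal operation mentioned in the abstract aggregates information from all spatial positions of a feature map via pairwise affinities, rather than from a local neighborhood. The sketch below illustrates the generic (non-selective) embedded dot-product form with identity embeddings in plain NumPy; the paper's selective variant, which restricts which positions participate to limit noise and computation, is not reproduced here, and all names are illustrative rather than taken from the authors' implementation.

```python
import numpy as np

def nonlocal_block(x):
    """Generic nonlocal operation over a (C, H, W) feature map.

    Every spatial position attends to every other position:
    affinities are dot products between feature vectors, normalized
    with a softmax, and the aggregated response is added back as a
    residual, as in standard nonlocal blocks.
    """
    c, h, w = x.shape
    flat = x.reshape(c, h * w)                      # (C, N), N = H*W positions
    affinity = flat.T @ flat                        # (N, N) pairwise dot products
    affinity -= affinity.max(axis=1, keepdims=True) # stabilize softmax
    weights = np.exp(affinity)
    weights /= weights.sum(axis=1, keepdims=True)   # rows sum to 1
    out = flat @ weights.T                          # weighted sum over positions
    return x + out.reshape(c, h, w)                 # residual connection

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))   # toy feature map: 8 channels, 4x4 grid
y = nonlocal_block(x)
print(y.shape)  # (8, 4, 4)
```

Because the affinity matrix is N x N in the number of spatial positions, the full operation is expensive on high-resolution feature maps, which is the motivation the abstract gives for a selective variant.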

