Article

SPRNet: Single-Pixel Reconstruction for One-Stage Instance Segmentation

Journal

IEEE TRANSACTIONS ON CYBERNETICS
Volume 51, Issue 4, Pages 1731-1742

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCYB.2020.2969046

Keywords

Semantics; Detectors; Image segmentation; Object detection; Task analysis; Proposals; Image reconstruction; Computer vision; deep learning; instance segmentation; object detection; video analysis

Funding

  1. National Natural Science Foundation of China [61836002, 61972361, 61702143]
  2. Zhejiang Provincial Natural Science Foundation of China [LY17F020009]
  3. Australian Research Council [FL-170100117, DP-180103424, IH-180100002]

Abstract

Object instance segmentation is one of the most fundamental but challenging tasks in computer vision, and it requires pixel-level image understanding. Most existing approaches address this problem by adding a mask prediction branch to a two-stage object detector with a region proposal network (RPN). Although these two-stage approaches produce good segmentation results, their efficiency is far from satisfactory, restricting their applicability in practice. In this article, we propose a one-stage framework, single-pixel reconstruction net (SPRNet), which performs efficient instance segmentation by introducing a single-pixel reconstruction (SPR) branch to off-the-shelf one-stage detectors. The added SPR branch reconstructs the pixel-level mask from every single pixel in the convolution feature map directly. Using the same ResNet-50 backbone, SPRNet achieves comparable mask AP to Mask R-CNN at a higher inference speed and gains all-round improvements in box AP at every scale compared with RetinaNet.
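The core idea in the abstract, reconstructing a full instance mask from a single feature-map pixel, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the real SPR branch presumably uses learned convolutional/deconvolutional layers trained end-to-end, while here the mapping from each C-dimensional pixel vector to an m×m mask is a single hypothetical linear layer (`W`, `b`, and `mask_size` are illustrative names, not from the paper).

```python
import numpy as np

def spr_head(feature_map, W, b, mask_size=28):
    """Sketch of a single-pixel reconstruction head.

    Each spatial location of the feature map holds a C-dim vector;
    that one vector alone is decoded into an m x m instance mask.
    """
    H, Wd, C = feature_map.shape
    pixels = feature_map.reshape(-1, C)        # one vector per spatial location
    logits = pixels @ W + b                    # (H*W, mask_size*mask_size)
    masks = 1.0 / (1.0 + np.exp(-logits))      # per-pixel sigmoid mask probabilities
    return masks.reshape(H, Wd, mask_size, mask_size)

# Toy usage with random weights (a trained model would learn W and b).
rng = np.random.default_rng(0)
C, m = 256, 28
fmap = rng.standard_normal((4, 4, C))
W = rng.standard_normal((C, m * m)) * 0.01
b = np.zeros(m * m)
masks = spr_head(fmap, W, b, m)
print(masks.shape)  # one 28x28 mask per feature-map pixel: (4, 4, 28, 28)
```

The point of the sketch is the data flow: unlike a two-stage pipeline, no region proposals or RoI pooling are needed, because every feature-map pixel independently carries enough information to decode a mask.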
