Article

DRINet for Medical Image Segmentation

Journal

IEEE TRANSACTIONS ON MEDICAL IMAGING
Volume 37, Issue 11, Pages 2453-2462

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.
DOI: 10.1109/TMI.2018.2835303

Keywords

Convolutional neural network; medical image segmentation; brain atrophy; abdominal organ segmentation

Funding

  1. NIHR i4i grant: Decision-assist software for management of acute ischemic stroke using brain-imaging machine-learning [II-LA-0814-20007]
  2. JSPS Kakenhi [26108006, 17K20099]

Abstract

Convolutional neural networks (CNNs) have revolutionized medical image analysis over the past few years. The U-Net is one of the best-known CNN architectures for semantic segmentation and has achieved remarkable success in many medical image segmentation applications. The U-Net architecture consists of standard convolution layers, pooling layers, and upsampling layers. The convolution layers learn representative features of the input images, and the segmentation is constructed from these features. However, the features learned by standard convolution layers are not distinctive when the differences between categories are subtle in terms of intensity, location, shape, and size. In this paper, we propose a novel CNN architecture, called Dense-Res-Inception Net (DRINet), which addresses this challenging problem. DRINet consists of three blocks: a convolutional block with dense connections, a deconvolutional block with residual inception modules, and an unpooling block. The proposed architecture outperforms the U-Net in three challenging applications: multi-class segmentation of cerebrospinal fluid on brain CT images, multi-organ segmentation on abdominal CT images, and multi-class brain tumor segmentation on MR images.
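The abstract names DRINet's three building blocks but gives no implementation detail. The following is a minimal PyTorch sketch of what such blocks could look like; the class names (DenseConvBlock, ResidualInceptionBlock, UnpoolBlock) and all hyperparameters (growth rate, layer counts, branch kernel sizes) are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal, illustrative sketch of the three DRINet block types.
import torch
import torch.nn as nn


class DenseConvBlock(nn.Module):
    """Convolutional block with dense connections: every layer takes the
    concatenation of the block input and all previous layer outputs."""
    def __init__(self, in_channels, growth_rate=12, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
            ))
            channels += growth_rate
        self.out_channels = channels

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)


class ResidualInceptionBlock(nn.Module):
    """Residual inception module: parallel branches with different
    receptive fields, merged by a 1x1 convolution and added to a
    projected shortcut."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        b = max(out_channels // 3, 1)  # hypothetical per-branch width
        self.branch1 = nn.Conv2d(in_channels, b, kernel_size=1)
        self.branch3 = nn.Conv2d(in_channels, b, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(in_channels, b, kernel_size=5, padding=2)
        self.merge = nn.Conv2d(3 * b, out_channels, kernel_size=1)
        self.shortcut = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        branches = torch.cat(
            [self.branch1(x), self.branch3(x), self.branch5(x)], dim=1)
        return self.relu(self.merge(branches) + self.shortcut(x))


class UnpoolBlock(nn.Module):
    """Unpooling block: restores spatial resolution using the indices
    recorded by the matching max-pooling layer in the encoder."""
    def __init__(self):
        super().__init__()
        self.unpool = nn.MaxUnpool2d(kernel_size=2, stride=2)

    def forward(self, x, indices):
        return self.unpool(x, indices)


if __name__ == "__main__":
    # Toy forward pass: encode, pool (keeping indices), decode, unpool.
    pool = nn.MaxPool2d(kernel_size=2, stride=2, return_indices=True)
    x = torch.randn(1, 32, 64, 64)
    enc = DenseConvBlock(32)                    # 32 + 4 * 12 = 80 channels out
    feat = enc(x)
    pooled, idx = pool(feat)
    dec = ResidualInceptionBlock(enc.out_channels, enc.out_channels)
    out = UnpoolBlock()(dec(pooled), idx)
    print(out.shape)  # torch.Size([1, 80, 64, 64])
```

Pairing nn.MaxPool2d(return_indices=True) with nn.MaxUnpool2d is one standard way to realize an unpooling decoder stage; whether DRINet uses index-based unpooling or another upsampling scheme is not specified in this abstract.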

