Review

Transfer learning for medical image classification: a literature review

Journal

BMC MEDICAL IMAGING
Volume 22, Issue 1, Pages -

Publisher

BMC
DOI: 10.1186/s12880-022-00793-7

Keywords

Deep learning; Transfer learning; Fine-tuning; Convolutional neural network; Medical image analysis

Funding

  1. Projekt DEAL
  2. German Ministry for Education and Research (BMBF) [01ZZ1801E]


Transfer learning with convolutional neural networks has made significant contributions to medical image analysis by leveraging prior knowledge from similar tasks to improve performance on new tasks. This review paper provides guidance on selecting models and transfer learning approaches for medical image classification. The majority of studies evaluated multiple models empirically, with deep models such as Inception being the most commonly used. Deep models such as ResNet or Inception are recommended as feature extractors to save computational costs and time without compromising predictive power.
Background: Transfer learning (TL) with convolutional neural networks aims to improve performance on a new task by leveraging knowledge of similar tasks learned in advance. It has made a major contribution to medical image analysis, as it overcomes the data scarcity problem and saves time and hardware resources. However, transfer learning has been arbitrarily configured in the majority of studies. This review paper attempts to provide guidance for selecting a model and TL approach for the medical image classification task.

Methods: 425 peer-reviewed articles published in English up until December 31, 2020, were retrieved from two databases, PubMed and Web of Science. Articles were assessed by two independent reviewers, with the aid of a third reviewer in the case of discrepancies. We followed the PRISMA guidelines for paper selection, and 121 studies were regarded as eligible for the scope of this review. We investigated articles that focused on selecting backbone models and TL approaches, including feature extractor, feature extractor hybrid, fine-tuning, and fine-tuning from scratch.

Results: The majority of studies (n = 57) empirically evaluated multiple models, followed by deep models (n = 33) and shallow models (n = 24). Inception, one of the deep models, was the most frequently employed in the literature (n = 26). With respect to TL, the majority of studies (n = 46) empirically benchmarked multiple approaches to identify the optimal configuration. The remaining studies applied only a single approach, of which feature extractor (n = 38) and fine-tuning from scratch (n = 27) were the two most favored. Only a few studies applied feature extractor hybrid (n = 7) or fine-tuning (n = 3) with pretrained models.

Conclusion: The investigated studies demonstrated the efficacy of transfer learning despite data scarcity. We encourage data scientists and practitioners to use deep models (e.g. ResNet or Inception) as feature extractors, which can save computational costs and time without degrading predictive power.

