Article

Breast cancer histopathological image classification using attention high-order deep network

Publisher

WILEY
DOI: 10.1002/ima.22628

Keywords

breast cancer histopathological image classification; convolutional neural network; covariance pooling; efficient channel attention; second-order statistics

Funding

  1. Key R&D Program of Liaoning Province [2019JH2/10100030]
  2. Liaoning BaiQianWan Talents Program
  3. National Natural Science Foundation of China [61972062]
  4. Natural Science Foundation of Liaoning Province [2019-MS-011]
  5. University-Industry Collaborative Education Program [201902029013]
  6. Young and Middle-aged Talents Program of the National Civil Affairs Commission

Abstract

Computer-aided classification of breast cancer pathological images using deep learning methods, particularly the novel AHoNet, achieved high accuracy in patient-level classification on public datasets, demonstrating competitive performance in medical image applications.
Computer-aided classification of pathological images is of great significance for breast cancer diagnosis. In recent years, deep learning methods for breast cancer pathological image classification have made breakthrough progress and become the mainstream in this field. To capture more discriminative deep features from breast cancer pathological images, this work introduces a novel attention high-order deep network (AHoNet) that simultaneously embeds an attention mechanism and a high-order statistical representation into a residual convolutional network. AHoNet first employs an efficient channel attention module, with no dimensionality reduction and local cross-channel interaction, to extract locally salient deep features of breast cancer pathological images. Their second-order covariance statistics are then estimated through matrix power normalization, which provides a more robust global feature representation of breast cancer pathological images. We extensively evaluate AHoNet on the public BreakHis and BACH breast cancer pathology datasets. Experimental results show that AHoNet achieves optimal patient-level classification accuracies of 99.29% and 85% on the BreakHis and BACH databases, respectively, demonstrating performance competitive with state-of-the-art single models on this medical image application.
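The abstract's two building blocks, efficient channel attention (ECA) and covariance pooling with matrix power normalization, can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the fixed averaging kernel stands in for ECA's learned 1-D convolution weights, and the function names and the exponent `alpha = 0.5` are illustrative assumptions.

```python
import numpy as np

def eca_attention(feature_map, k=3):
    """ECA sketch: global average pooling per channel, a 1-D convolution
    of kernel size k across the channel dimension (local cross-channel
    interaction, no dimensionality reduction), then a sigmoid gate."""
    C, H, W = feature_map.shape
    desc = feature_map.mean(axis=(1, 2))          # (C,) channel descriptors
    pad = k // 2
    padded = np.pad(desc, pad)
    kernel = np.full(k, 1.0 / k)                  # fixed weights; learned in the real module
    conv = np.array([padded[i:i + k] @ kernel for i in range(C)])
    gate = 1.0 / (1.0 + np.exp(-conv))            # sigmoid, one weight per channel
    return feature_map * gate[:, None, None]

def covariance_pooling_mpn(feature_map, alpha=0.5, eps=1e-6):
    """Second-order pooling: treat each spatial position as a C-dim sample,
    form the C x C covariance, then apply matrix power normalization
    (raise the matrix to the power alpha via eigendecomposition)."""
    C, H, W = feature_map.shape
    X = feature_map.reshape(C, H * W)
    X = X - X.mean(axis=1, keepdims=True)
    cov = (X @ X.T) / (H * W - 1)                 # C x C covariance
    w, V = np.linalg.eigh(cov)                    # symmetric eigendecomposition
    w = np.clip(w, eps, None) ** alpha            # power-normalize the spectrum
    return (V * w) @ V.T                          # V diag(w^alpha) V^T

rng = np.random.default_rng(0)
fmap = rng.standard_normal((8, 4, 4))             # toy (C, H, W) feature map
attended = eca_attention(fmap)
pooled = covariance_pooling_mpn(attended)
print(attended.shape, pooled.shape)               # (8, 4, 4) (8, 8)
```

The pooled output is a symmetric positive semi-definite matrix; flattening it yields the global second-order feature that the network feeds to its classifier.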
