Article

Wavelet-based residual attention network for image super-resolution

Journal

NEUROCOMPUTING
Volume 382, Pages 116-126

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2019.11.044

Keywords

Super-resolution; Wavelet transform; Multi-kernel convolution; Channel attention; Spatial attention

Funding

  1. National Natural Science Foundation of China [81873894]

Image super-resolution (SR) is a fundamental technique in the field of image processing and computer vision. Recently, many super-resolution approaches have made remarkable progress through deep learning. However, we observe that most studies focus on designing deeper and wider architectures to improve the quality of image SR at the cost of computational burden and speed. Few studies adopt lightweight but effective modules to improve the efficiency of SR without compromising its performance. In this paper, we propose the Wavelet-based residual attention network (WRAN) for image SR. Specifically, the input and label of our network are the four coefficients generated by the two-dimensional (2D) Wavelet transform, which reduces the training difficulty of our network by explicitly separating low-frequency content and high-frequency details into four channels. We propose multi-kernel convolutional layers as the basic modules of our network, which can adaptively aggregate features from receptive fields of various sizes. We adopt a residual attention block (RAB) that contains channel attention and spatial attention modules. Thus, our method can focus on the more crucial underlying patterns in both the channel and spatial dimensions in a lightweight manner. Extensive experiments validate that our WRAN is computationally efficient and demonstrate competitive results against state-of-the-art SR methods. (C) 2019 Elsevier B.V. All rights reserved.
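The four-channel wavelet input and label described in the abstract can be illustrated with a single-level 2D Haar transform, which splits an even-sized grayscale image into one low-frequency subband (LL) and three high-frequency detail subbands (LH, HL, HH) at half resolution. This is only an illustrative sketch: the abstract does not specify the wavelet family or normalization, so the Haar filters and the function name below are assumptions.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2D Haar wavelet transform (illustrative sketch).

    Splits an even-sized grayscale image into four half-resolution
    subbands: LL (low-frequency approximation) plus LH, HL, HH
    (horizontal, vertical, and diagonal high-frequency details),
    mirroring the four-channel representation described above.
    """
    a = img[0::2, 0::2].astype(float)  # top-left pixel of each 2x2 block
    b = img[0::2, 1::2].astype(float)  # top-right
    c = img[1::2, 0::2].astype(float)  # bottom-left
    d = img[1::2, 1::2].astype(float)  # bottom-right
    ll = (a + b + c + d) / 2.0  # low-frequency approximation
    lh = (a - b + c - d) / 2.0  # horizontal detail
    hl = (a + b - c - d) / 2.0  # vertical detail
    hh = (a - b - c + d) / 2.0  # diagonal detail
    return ll, lh, hl, hh

img = np.arange(16.0).reshape(4, 4)
ll, lh, hl, hh = haar_dwt2(img)
print(ll.shape)  # each subband is half the spatial resolution: (2, 2)
```

Stacking the four subbands as channels gives the network an input in which smooth content (LL) and edge-like details (LH, HL, HH) are already separated, which is the training simplification the abstract attributes to the wavelet representation.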
