4.6 Article

Multiple dictionary pairs learning and sparse representation-based infrared image super-resolution with improved fuzzy clustering

Journal

SOFT COMPUTING
Volume 22, Issue 5, Pages 1385-1398

Publisher

SPRINGER
DOI: 10.1007/s00500-017-2812-3

Keywords

Multi-sensor; Super-resolution; Sparse representation; Infrared image; Dictionary learning; Multiview representation; Fuzzy clustering theory

Funding

  1. National Natural Science Foundation of China [61701327, 61711540303, 61473198]
  2. Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD) Fund
  3. Jiangsu Collaborative Innovation Center on Atmospheric Environment and Equipment Technology (CICAEET) Fund

Abstract

Exploring sparse representation to enhance the resolution of infrared images has attracted much attention in the last decade. However, conventional sparse representation-based super-resolution aims at learning a single universal and efficient dictionary pair for image representation. Considering that a large number of different structures exist in an image, it is insufficient and unreasonable to represent the various image structures with only one universal dictionary pair. In this paper, we propose an improved fuzzy clustering and weighted-scheme reconstruction framework to solve this problem. First, the training patches are divided into multiple clusters and multiple dictionary pairs are jointly learned with an improved fuzzy clustering method. The goal of joint learning is to obtain multiple dictionary pairs that collectively represent all the training patches with the smallest reconstruction error, so that the learned dictionary pairs are more precise and mutually complementary. Then, high-resolution (HR) patches are estimated from the several most accurate dictionary pairs. Finally, these estimated HR patches are combined into a final HR patch by a weighted scheme. Numerous experiments demonstrate that this framework outperforms several state-of-the-art super-resolution methods, both quantitatively and perceptually.
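
The abstract describes three computational steps: clustering of training patches with jointly learned dictionary pairs, sparse coding of each low-resolution (LR) patch against the most responsible dictionary pairs, and membership-weighted fusion of the resulting HR estimates. The Python sketch below is not the authors' implementation; it substitutes plain fuzzy c-means for the paper's improved clustering, a crude least-squares coupled dictionary pair per cluster for the joint learning, and scikit-learn's orthogonal matching pursuit for sparse coding. All function names, the cluster count, the sparsity level, and the patch dimensions are illustrative assumptions.

# Minimal sketch, assuming plain FCM and per-cluster coupled dictionary pairs,
# with membership-weighted fusion of HR patch estimates. Not the paper's method.
import numpy as np
from sklearn.linear_model import orthogonal_mp

def fuzzy_cmeans(X, n_clusters, m=2.0, n_iter=50, seed=0):
    # Standard fuzzy c-means on patch features X (n_samples, dim);
    # returns memberships U (n_samples, n_clusters) and cluster centers.
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers

def learn_cluster_dictionaries(Xl, Xh, U, n_atoms=128, sparsity=5):
    # For each cluster: Dl is a normalized subset of its LR patches (a crude
    # stand-in for joint dictionary learning); Dh is fitted by least squares
    # so that the same sparse codes reproduce the corresponding HR patches.
    pairs = []
    for k in range(U.shape[1]):
        idx = np.argsort(-U[:, k])[:4 * n_atoms]   # most responsible patches
        sel = idx[:n_atoms]
        Dl = Xl[sel].T
        Dl = Dl / (np.linalg.norm(Dl, axis=0, keepdims=True) + 1e-12)
        A = orthogonal_mp(Dl, Xl[idx].T, n_nonzero_coefs=sparsity)  # codes
        W, *_ = np.linalg.lstsq(A.T, Xh[idx], rcond=None)           # HR fit
        pairs.append((Dl, W.T))
    return pairs

def super_resolve_patch(yl, centers, pairs, m=2.0, top_k=3, sparsity=5):
    # Code the LR patch against the top-k most responsible dictionary pairs
    # and fuse the HR estimates with normalized membership weights.
    d = np.linalg.norm(centers - yl, axis=1) + 1e-12
    u = d ** (-2.0 / (m - 1.0))
    u /= u.sum()
    best = np.argsort(-u)[:top_k]
    est = np.zeros(pairs[best[0]][1].shape[0])
    for k in best:
        Dl, Dh = pairs[k]
        a = orthogonal_mp(Dl, yl, n_nonzero_coefs=sparsity)
        est += u[k] * (Dh @ a)
    return est / u[best].sum()

In a full pipeline, overlapping LR patches would be extracted from the input image, each patch resolved with super_resolve_patch, and the overlapping HR estimates averaged back into the output image.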
