Article

UIFGAN: An unsupervised continual-learning generative adversarial network for unified image fusion

Journal

INFORMATION FUSION
Volume 88, Pages 305-318

Publisher

ELSEVIER
DOI: 10.1016/j.inffus.2022.07.013

Keywords

Image fusion; Continual-learning; Generative adversarial network; Max-gradient loss; Unified model

Funding

  1. National Natural Science Foundation of China [62075169, 62061160370, 62003247]
  2. Hubei Province Key Research and Development Program [2021BBA235]

Abstract

This paper introduces UIFGAN, an unsupervised continual-learning generative adversarial network for unified image fusion that trains a single model through adversarial learning rather than multiple independent models. Experimental results demonstrate the advantages of this method over existing techniques.
In this paper, we propose a novel unsupervised continual-learning generative adversarial network for unified image fusion, termed UIFGAN. For multiple image fusion tasks, our model trains a single generative adversarial network with memory in a continual-learning manner, rather than training an individual model for each fusion task or jointly training all tasks. We use elastic weight consolidation to avoid forgetting what has been learned from previous tasks when training multiple tasks sequentially. In each task, the fused image is generated through adversarial learning between a generator and a discriminator. Meanwhile, a max-gradient loss function is adopted to force the fused image to retain the richer texture details of the corresponding regions in the two source images, which applies to most typical image fusion tasks. Extensive experiments on multi-exposure, multi-modal and multi-focus image fusion tasks demonstrate the advantages of our method over state-of-the-art approaches.
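
The abstract names two ingredients that can be sketched in code: the elastic weight consolidation (EWC) penalty that protects parameters important to earlier fusion tasks, and a max-gradient loss that steers the fused image toward the stronger texture of the two sources. The PyTorch-style sketch below is illustrative only and is not taken from the paper; the function names (`max_gradient_loss`, `ewc_penalty`), the Sobel gradient operator, the L1 distance, and the weight `lam` are assumptions, and the authors' exact formulation may differ.

```python
import torch
import torch.nn.functional as F


def _sobel_gradients(img):
    """Horizontal/vertical gradients of a (B, 1, H, W) grayscale batch
    via Sobel filters (one possible gradient operator; assumed here)."""
    kx = torch.tensor([[-1., 0., 1.],
                       [-2., 0., 2.],
                       [-1., 0., 1.]],
                      dtype=img.dtype, device=img.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    return F.conv2d(img, kx, padding=1), F.conv2d(img, ky, padding=1)


def max_gradient_loss(fused, src_a, src_b):
    """Match the fused image's gradients, per pixel, to whichever source
    gradient has the larger magnitude, so the sharper texture of either
    source is preserved (L1 distance assumed)."""
    fx, fy = _sobel_gradients(fused)
    ax, ay = _sobel_gradients(src_a)
    bx, by = _sobel_gradients(src_b)
    use_a = (ax ** 2 + ay ** 2 >= bx ** 2 + by ** 2).to(fused.dtype)
    tx = use_a * ax + (1.0 - use_a) * bx
    ty = use_a * ay + (1.0 - use_a) * by
    return F.l1_loss(fx, tx) + F.l1_loss(fy, ty)


def ewc_penalty(generator, fisher, anchor_params, lam=1.0):
    """EWC regularizer added to the current task's generator loss:
    parameters with high Fisher information on earlier fusion tasks
    are discouraged from drifting away from their anchored values."""
    penalty = 0.0
    for name, param in generator.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name]
                                 * (param - anchor_params[name]) ** 2).sum()
    return 0.5 * lam * penalty
```

In a training loop of this kind, the generator objective for the current fusion task would typically combine the adversarial loss against the discriminator, the max-gradient content term above, and the EWC penalty computed from the Fisher information and parameter snapshot saved after the previous task; the relative weights are hyperparameters not specified in the abstract.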
