VMDM-fusion: a saliency feature representation method for infrared and visible image fusion
Published 2021
Journal: Signal, Image and Video Processing
Publisher: Springer Science and Business Media LLC
Online: 2021-01-12
DOI: 10.1007/s11760-021-01852-2
References
Related references
Note: only some of the references are listed.
- Han Xu et al. (2020). U2Fusion: A Unified Unsupervised Image Fusion Network. IEEE Transactions on Pattern Analysis and Machine Intelligence.
- Yu Liu et al. (2019). Medical Image Fusion via Convolutional Sparsity Based Morphological Component Analysis. IEEE Signal Processing Letters.
- Jiayi Ma et al. (2019). Infrared and visible image fusion via detail preserving adversarial learning. Information Fusion.
- N. Kumar et al. (2019). Weakly supervised deep network for spatiotemporal localization and detection of human actions in wild conditions. The Visual Computer.
- Li Yu et al. (2018). Convolutional Neural Network for Intermediate View Enhancement in Multiview Streaming. IEEE Transactions on Multimedia.
- Yu Liu et al. (2018). Deep learning for pixel-level image fusion: Recent advances and future prospects. Information Fusion.
- Huafeng Li et al. (2018). Joint medical image fusion, denoising and enhancement via discriminative low-rank sparse dictionaries learning. Pattern Recognition.
- Jiayi Ma et al. (2018). Infrared and visible image fusion methods and applications: A survey. Information Fusion.
- Hui Li et al. (2018). DenseFuse: A Fusion Approach to Infrared and Visible Images. IEEE Transactions on Image Processing.
- Jiayi Ma et al. (2018). FusionGAN: A generative adversarial network for infrared and visible image fusion. Information Fusion.
- Rui Gao et al. (2017). Image Fusion With Cosparse Analysis Operator. IEEE Signal Processing Letters.
- Shutao Li et al. (2017). Pixel-level image fusion: A survey of the state of the art. Information Fusion.
- Yu Liu et al. (2017). Multi-focus image fusion with a deep convolutional neural network. Information Fusion.
- Jinlei Ma et al. (2017). Infrared and visible image fusion based on visual saliency map and weighted least square optimization. Infrared Physics & Technology.
- Yu Liu et al. (2016). Image Fusion With Convolutional Sparse Representation. IEEE Signal Processing Letters.
- Jiayi Ma et al. (2016). Infrared and visible image fusion via gradient transfer and total variation minimization. Information Fusion.
- Guangmang Cui et al. (2015). Detail preserved fusion of visible and infrared images using regional saliency extraction and multi-scale image decomposition. Optics Communications.
- Jun Wang et al. (2014). Fusion method for infrared and visible images by using non-negative sparse representation. Infrared Physics & Technology.
- Wei-Te Li et al. (2013). Exploring Visual and Motion Saliency for Automatic Video Object Extraction. IEEE Transactions on Image Processing.
- Gaurav Bhatnagar et al. (2013). Directive Contrast Based Multimodal Medical Image Fusion in NSCT Domain. IEEE Transactions on Multimedia.
- Shutao Li et al. (2012). Group-Sparse Representation With Dictionary Learning for Medical Image Denoising and Fusion. IEEE Transactions on Biomedical Engineering.
- Kaiming He et al. (2012). Guided Image Filtering. IEEE Transactions on Pattern Analysis and Machine Intelligence.
- Bin Yang et al. (2009). Multifocus Image Fusion and Restoration With Sparse Representation. IEEE Transactions on Instrumentation and Measurement.
- Jan Van Aardt (2008). Assessment of image fusion procedures using entropy, image quality, and multispectral classification. Journal of Applied Remote Sensing.