Article

Nonlinear Transform Coding

Journal

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JSTSP.2020.3034501

Keywords

Transforms; Entropy; Distortion; Image coding; Vector quantization; Stochastic processes; Artificial neural networks; data compression; machine learning; rate-distortion; source coding; transform coding; unsupervised learning

Summary

Research indicates that nonlinear transform coding (NTC) is competitive with linear transform codecs for images and outperforms them under established perceptual quality metrics. The authors evaluate the empirical rate-distortion performance of NTC on simple example sources and introduce a novel variant of entropy-constrained vector quantization.

Abstract

We review a class of methods that can be collected under the name nonlinear transform coding (NTC), which over the past few years have become competitive with the best linear transform codecs for images, and have superseded them in terms of rate-distortion performance under established perceptual quality metrics such as MS-SSIM. We assess the empirical rate-distortion performance of NTC with the help of simple example sources, for which the optimal performance of a vector quantizer is easier to estimate than with natural data sources. To this end, we introduce a novel variant of entropy-constrained vector quantization. We provide an analysis of various forms of stochastic optimization techniques for NTC models; review architectures of transforms based on artificial neural networks, as well as learned entropy models; and provide a direct comparison of a number of methods to parameterize the rate-distortion trade-off of nonlinear transforms, introducing a simplified one.
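To illustrate the entropy-constrained vector quantization idea mentioned in the abstract, the sketch below shows the core Lagrangian assignment step: each sample is mapped to the codeword minimizing distortion plus λ times an idealized code length. This is a minimal sketch of the classical ECVQ assignment rule, not the authors' novel variant; the function name and toy codebook are invented for the example.

```python
import math

def ecvq_assign(points, codebook, probs, lam):
    """Entropy-constrained VQ assignment step (illustrative sketch).

    Each point is mapped to the index of the codeword minimizing
    squared-error distortion + lam * (-log2 prob), where -log2 prob is
    the idealized code length of that codeword under the entropy model.
    """
    out = []
    for x in points:
        costs = [(x - c) ** 2 + lam * -math.log2(p)
                 for c, p in zip(codebook, probs)]
        out.append(costs.index(min(costs)))
    return out

# Toy 1-D example: codeword 0.0 is far more probable, hence cheaper to code.
codebook = [0.0, 1.0]
probs = [0.9, 0.1]
# At lam = 0 the assignment is plain nearest-neighbor quantization; as lam
# grows, the point near the cell boundary switches to the cheaper codeword.
print(ecvq_assign([0.55], codebook, probs, 0.0))  # -> [1]
print(ecvq_assign([0.55], codebook, probs, 1.0))  # -> [0]
```

Sweeping λ traces out a rate-distortion curve, which is the same Lagrangian trade-off that the paper's training objectives for nonlinear transforms parameterize.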

