Realistic acceleration of neural networks with fine-grained tensor decomposition
Published 2022
Journal
Neurocomputing
Volume 512, Pages 52-68
Publisher
Elsevier BV
Online
2022-09-14
DOI
10.1016/j.neucom.2022.09.057
Related references
Note: only part of the references are listed.
- Chenglin Wen et al. (2021). Algorithms of matrix recovery based on truncated Schatten p-norm. International Journal of Machine Learning and Cybernetics.
- Donghyun Lee et al. (2021). QTTNet: Quantized tensor train neural networks for 3D object and video recognition. Neural Networks.
- Xin Luo et al. (2021). Adjusting Learning Depth in Nonnegative Latent Factorization of Tensors for Accurately Modeling Temporal Patterns in Dynamic QoS Data. IEEE Transactions on Automation Science and Engineering.
- Dingheng Wang et al. (2021). Nonlinear tensor train format for deep neural network compression. Neural Networks.
- Yukuan Yang et al. (2020). Training high-performance and large-scale deep neural networks with full 8-bit integers. Neural Networks.
- Lei Deng et al. (2020). Model Compression and Hardware Acceleration for Neural Networks: A Comprehensive Survey. Proceedings of the IEEE.
- Deepak Kadetotad et al. (2020). An 8.93 TOPS/W LSTM Recurrent Neural Network Accelerator Featuring Hierarchical Coarse-Grain Sparsity for On-Device Speech Recognition. IEEE Journal of Solid-State Circuits.
- Dingheng Wang et al. (2020). Compressing 3DCNNs based on tensor train decomposition. Neural Networks.
- Bijiao Wu et al. (2020). Hybrid tensor decomposition in neural network compression. Neural Networks.
- Zengmin Xu et al. (2019). Semisupervised Discriminant Multimanifold Analysis for Action Recognition. IEEE Transactions on Neural Networks and Learning Systems.
- Shaohui Lin et al. (2019). Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning. IEEE Transactions on Neural Networks and Learning Systems.
- Anh-Huy Phan et al. (2019). Tensor Networks for Latent Variable Analysis: Higher Order Canonical Polyadic Decomposition. IEEE Transactions on Neural Networks and Learning Systems.
- Xin Luo et al. (2019). Non-Negative Latent Factor Model Based on β-Divergence for Recommender Systems. IEEE Transactions on Systems, Man, and Cybernetics: Systems.
- Xin Luo et al. (2019). An Instance-Frequency-Weighted Regularization Scheme for Non-Negative Latent Factor Analysis on High-Dimensional and Sparse Data. IEEE Transactions on Systems, Man, and Cybernetics: Systems.
- Gul Varol et al. (2018). Long-Term Temporal Convolutions for Action Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
- Miguel Angel Del-Agua et al. (2018). Speaker-Adapted Confidence Measures for ASR Using Deep Bidirectional Recurrent Neural Networks. IEEE/ACM Transactions on Audio, Speech, and Language Processing.
- Hantao Huang et al. (2018). LTNN: A Layerwise Tensorized Compression of Multilayer Neural Network. IEEE Transactions on Neural Networks and Learning Systems.
- Klaus Greff et al. (2017). LSTM: A Search Space Odyssey. IEEE Transactions on Neural Networks and Learning Systems.
- Sergey V. Dolgov et al. (2014). Alternating Minimal Energy Methods for Linear Systems in Higher Dimensions. SIAM Journal on Scientific Computing.
- I. V. Oseledets (2011). Tensor-Train Decomposition. SIAM Journal on Scientific Computing.
- Lars Grasedyck (2010). Hierarchical Singular Value Decomposition of Tensors. SIAM Journal on Matrix Analysis and Applications.
- Lieven De Lathauwer (2008). Decompositions of a Higher-Order Tensor in Block Terms, Part II: Definitions and Uniqueness. SIAM Journal on Matrix Analysis and Applications.