4.6 Review

Advanced metaheuristic optimization techniques in applications of deep neural networks: a review

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 33, Issue 21, Pages 14079-14099

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-021-05960-5

Keywords

Metaheuristic Optimization; Algorithms; Deep neural networks; Applications; A review

Funding

  1. Hubei Provincial Science and Technology Major Project of China [2020AEA011]
  2. Key Research & Development Plan of Hubei Province of China [2020BAB100]


This study explores the recent optimization methods used to enhance the performance of deep neural networks and emphasizes the importance of generating the optimal structures and parameters of DNNs when considering massive-scale data. It also identifies several potential directions that still need improvements and open problems in evolutionary DNNs.
Deep neural networks (DNNs) have evolved into a beneficial machine learning method that has been successfully used in various applications. Currently, DNNs are a superior technique for extracting information from massive data sets in a self-organized manner. DNNs have different structures and parameters, which are usually produced for particular applications. Nevertheless, the training procedure of a DNN can be protracted, depending on the given application and the size of the training set. Further, determining the most precise and practical structure of a deep learning method in a reasonable time is an open problem related to this procedure. Meta-heuristic techniques, such as swarm intelligence (SI) and evolutionary computing (EC), represent optimization frameworks with specific theories and objective functions. These methods are adaptable and have demonstrated their effectiveness in various applications; hence, they can be used to optimize DNN models. This paper presents a comprehensive survey of the recent optimization methods (i.e., SI and EC) employed to enhance DNN performance on various tasks. It also analyzes the importance of optimization methods in generating the optimal hyper-parameters and structures of DNNs when considering massive-scale data. Finally, several potential directions that still need improvement and open problems in evolutionary DNNs are identified.
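The kind of evolutionary hyper-parameter search the abstract describes can be sketched as a minimal genetic algorithm. This is a hedged illustration, not the survey's method: the `validation_loss` function below is a hypothetical smooth surrogate standing in for an actual DNN training-and-validation run, and the gene ranges, mutation scales, and selection scheme are illustrative assumptions.

```python
import random

# Hypothetical surrogate for validation loss: in practice this would train a
# DNN with the given hyper-parameters and return its measured validation loss.
# A smooth stand-in is used here so the sketch runs without a DL framework;
# it is (arbitrarily) minimized near lr = 0.01 and hidden_units = 64.
def validation_loss(lr, hidden_units):
    return (lr - 0.01) ** 2 * 1e4 + ((hidden_units - 64) / 64.0) ** 2

def evolve(pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    # Each individual encodes two genes: (learning rate, hidden-layer width).
    pop = [(rng.uniform(1e-4, 1.0), rng.randint(8, 256)) for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by fitness (lower surrogate loss is better).
        pop.sort(key=lambda ind: validation_loss(*ind))
        survivors = pop[: pop_size // 2]  # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            lr, h = rng.choice(survivors)
            # Mutate both genes, clipping to the valid ranges.
            lr = min(max(lr * rng.lognormvariate(0, 0.3), 1e-4), 1.0)
            h = min(max(int(h + rng.gauss(0, 16)), 8), 256)
            children.append((lr, h))
        pop = survivors + children
    return min(pop, key=lambda ind: validation_loss(*ind))

best_lr, best_h = evolve()
print(best_lr, best_h)
```

In a real setting the surrogate would be replaced by actual training runs, which is exactly why the abstract highlights cost: each fitness evaluation is a full (possibly protracted) training procedure, motivating the sample-efficient SI/EC variants the survey reviews.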
