Article

Evolutionary approximation and neural architecture search

Journal

GENETIC PROGRAMMING AND EVOLVABLE MACHINES
Volume 23, Issue 3, Pages 351-374

Publisher

SPRINGER
DOI: 10.1007/s10710-022-09441-z

Keywords

Approximate computing; Convolutional neural network; Cartesian genetic programming; Neuroevolution; Energy efficiency

Funding

  1. Czech Science Foundation [21-13001S]
  2. Ministry of Education, Youth and Sports from the Large Infrastructures for Research, Experimental Development and Innovations project e-Infrastructure [CZ-LM2018140]

Abstract

Automated neural architecture search (NAS) methods are now employed to routinely deliver high-quality neural network architectures for various challenging data sets and reduce the designer's effort. The NAS methods utilizing multi-objective evolutionary algorithms are especially useful when the objective is not only to minimize the network error but also to reduce the number of parameters (weights) or power consumption of the inference phase. We propose a multi-objective NAS method based on Cartesian genetic programming for evolving convolutional neural networks (CNN). The method allows approximate operations to be used in CNNs to reduce the power consumption of a target hardware implementation. During the NAS process, a suitable CNN architecture is evolved together with selecting approximate multipliers to deliver the best trade-offs between accuracy, network size, and power consumption. The most suitable 8 × N-bit approximate multipliers are automatically selected from a library of approximate multipliers. Evolved CNNs are compared with CNNs developed by other NAS methods on the CIFAR-10 and SVHN benchmark problems.
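The core idea of the abstract — evolving candidates that pair an architecture encoding with a choice of approximate multiplier, and selecting Pareto-optimal trade-offs between error, size, and power — can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the multiplier library values, the fitness proxies, and all helper names below are invented placeholders, and real NAS would train and evaluate each CNN rather than use analytic stand-ins.

```python
import random

# Hypothetical library of 8xN-bit approximate multipliers: name -> (relative power, added error).
# Values are illustrative placeholders, not measurements from the paper.
MULTIPLIER_LIBRARY = {
    "mul8_exact": (1.00, 0.000),
    "mul8_ax_a":  (0.70, 0.010),
    "mul8_ax_b":  (0.45, 0.035),
}

def random_candidate(rng):
    """A candidate pairs a toy architecture gene vector with a multiplier choice."""
    return {
        "layers": [rng.randint(16, 128) for _ in range(rng.randint(2, 5))],  # channel counts
        "multiplier": rng.choice(list(MULTIPLIER_LIBRARY)),
    }

def evaluate(cand):
    """Return the three objectives (all minimized): error, size, power.
    Crude analytic proxies stand in for training/evaluating the CNN."""
    power, mul_err = MULTIPLIER_LIBRARY[cand["multiplier"]]
    size = sum(cand["layers"])
    error = 1.0 / (1.0 + size) + mul_err  # bigger nets -> lower proxy error
    return (error, float(size), power * size)

def dominates(a, b):
    """Pareto dominance: a is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(pop):
    """Keep the candidates whose fitness is not dominated by any other candidate."""
    scored = [(c, evaluate(c)) for c in pop]
    return [c for c, f in scored
            if not any(dominates(g, f) for _, g in scored if g != f)]

rng = random.Random(0)
population = [random_candidate(rng) for _ in range(20)]
front = pareto_front(population)
```

A full evolutionary loop would repeatedly mutate members of `front` (changing both layer genes and the multiplier gene) and re-select, so architecture search and multiplier selection proceed jointly, as the abstract describes.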
