Article

Knowledge distillation circumvents nonlinearity for optical convolutional neural networks

Journal

APPLIED OPTICS
Volume 61, Issue 9, Pages 2173-2183

Publisher

Optica Publishing Group
DOI: 10.1364/AO.435738


Funding

  1. National Science Foundation HDR Institute Accelerated AI Algorithms for Data-Driven Discovery [OAC-2117997]
  2. Washington Research Foundation Fund
  3. Departments of Applied Mathematics
  4. eScience Center at the University of Washington
  5. National Science Foundation [2127235]
  6. UW Reality Lab
  7. Division of Electrical, Communications and Cyber Systems, Directorate for Engineering, National Science Foundation [2127235]


This paper proposes a convolutional neural network architecture implemented optically and uses a knowledge distillation approach to compensate for the absence of nonlinear layers, enabling more efficient image processing.
In recent years, convolutional neural networks (CNNs) have enabled ubiquitous image processing applications. As such, CNNs require fast forward propagation runtime to process high-resolution visual streams in real time. This is still a challenging task even with state-of-the-art graphics and tensor processing units. The bottleneck in computational efficiency primarily occurs in the convolutional layers. Performing convolutions in the Fourier domain is a promising way to accelerate forward propagation since it transforms convolutions into elementwise multiplications, which are considerably faster to compute for large kernels. Furthermore, such computation could be implemented using an optical 4f system with orders of magnitude faster operation. However, a major challenge in using this spectral approach, as well as in an optical implementation of CNNs, is the inclusion of a nonlinearity between each convolutional layer, without which CNN performance drops dramatically. Here, we propose a spectral CNN linear counterpart (SCLC) network architecture and its optical implementation. We propose a hybrid platform with an optical front end to perform a large number of linear operations, followed by an electronic back end. The key contribution is to develop a knowledge distillation (KD) approach to circumvent the need for nonlinear layers between the convolutional layers and successfully train such networks. While the KD approach is known in machine learning as an effective process for network pruning, we adapt the approach to transfer the knowledge from a nonlinear network (teacher) to a linear counterpart (student), where we can exploit the inherent parallelism of light. We show that the KD approach can achieve performance that easily surpasses the standard linear version of a CNN and could approach the performance of the nonlinear network. 
Our simulations show that the possibility of increasing the resolution of the input image allows our proposed 4f optical linear network to perform more efficiently than a nonlinear network with the same accuracy on two fundamental image processing tasks: (i) object classification and (ii) semantic segmentation. © 2022 Optical Society of America
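The convolution theorem the abstract relies on — a convolution becomes an elementwise multiplication in the Fourier domain — can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code; the function names and the direct reference implementation are our own, and an optical 4f system would perform the two transforms and the elementwise product in free-space optics rather than with `np.fft`.

```python
import numpy as np

def fft_conv2d(image, kernel):
    """Circular 2-D convolution via the FFT.

    The kernel is zero-padded to the image size, so the Fourier-domain
    step is a single elementwise multiplication -- O(N^2 log N) overall,
    versus O(N^2 K^2) for direct sliding-window convolution.
    """
    H, W = image.shape
    kh, kw = kernel.shape
    padded = np.zeros((H, W))
    padded[:kh, :kw] = kernel
    # Shift the kernel's center tap to index (0, 0) so the output
    # is spatially aligned with the input.
    padded = np.roll(padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(padded)))

def direct_circular_conv2d(image, kernel):
    """Reference implementation: explicit circular convolution."""
    kh, kw = kernel.shape
    out = np.zeros_like(image, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * np.roll(
                image, (i - kh // 2, j - kw // 2), axis=(0, 1))
    return out
```

The speed advantage grows with kernel size, which is why the paper's spectral formulation favors large kernels and high-resolution inputs.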
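The knowledge distillation step — training the linear student on the soft outputs of the nonlinear teacher — follows the general Hinton-style recipe, which can be sketched as a loss function. This is a minimal sketch under stated assumptions: the temperature `T`, mixing weight `alpha`, and function names are illustrative choices of ours, not values taken from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a soft KL term (student mimics the teacher's
    softened output distribution) and a hard cross-entropy term
    against the true labels. The T**2 factor keeps the soft-term
    gradient magnitude comparable across temperatures."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1).mean()
    hard = -np.log(
        softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * (T ** 2) * kl + (1 - alpha) * hard
```

In the paper's setting the teacher is the conventional nonlinear CNN and the student is the all-linear SCLC network, so the soft targets carry the inter-class structure the student cannot learn from hard labels alone.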

