4.7 Article

Adversarial color projection: A projector-based physical-world attack to DNNs

Journal

IMAGE AND VISION COMPUTING
Volume 140, Article 104861

Publisher

ELSEVIER
DOI: 10.1016/j.imavis.2023.104861

Keywords

DNNs; Black-box projector-based physical attack; Adversarial color projection; Effectiveness; Stealthiness; Robustness


This study introduces AdvCP, a black-box projector-based physical attack that manipulates the physical parameters of color projection to execute adversarial attacks against DNNs. The method attains a 97.60% attack success rate on a subset of ImageNet in the digital setting, 100% in indoor physical tests, and 82.14% in outdoor physical tests.
While deep neural networks (DNNs) have made remarkable advances across many fields, recent research shows that they remain susceptible to small perturbations. Conventional physical attacks that use stickers as physical perturbations to deceive classifiers struggle to remain stealthy and suffer from issues such as printing quality loss. More recent physical attacks harness light beams to carry out the attack, producing artificial optical patterns rather than natural ones. In this study, we introduce a black-box projector-based physical attack called Adversarial Color Projection (AdvCP), which manipulates the physical parameters of color projection to execute adversarial attacks. AdvCP is designed around three pivotal criteria: effectiveness, stealthiness, and robustness. In a digital environment, our approach attains an attack success rate of 97.60% on a subset of ImageNet. In the physical world, we achieve a 100% attack success rate in indoor testing and 82.14% in outdoor testing. To demonstrate the stealthiness of our approach, we compare the adversarial samples generated by AdvCP with baseline samples. When applied against advanced and robust DNNs, our method achieves an attack success rate exceeding 85% on most of the models, establishing the robustness of AdvCP. Finally, we discuss the potential threats posed by AdvCP to future vision-based systems and applications, and offer some ideas for future light-based physical attacks. Our code can be accessed from the following link: https://github.com/ChengYinHu/AdvCP.git
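
The abstract describes AdvCP as a black-box search over the physical parameters of a color projection until the classifier is fooled. Below is a minimal, hypothetical sketch of how such an attack could be simulated digitally, assuming the projection is approximated by alpha-blending a uniform color layer over the image and the search is a plain random search over (r, g, b, alpha). The surrogate model (a torchvision ResNet-50), the query budget, the parameter ranges, and the helper names project_color and random_search_attack are illustrative assumptions, not the authors' implementation; see the linked repository for the actual method.

# Hypothetical sketch of a digitally simulated color-projection attack.
# Assumptions (not from the paper): the projected light is modelled by
# alpha-blending a uniform color layer, and the black-box search is a
# plain random search over the color (r, g, b) and blend strength alpha.
import random

import torch
from PIL import Image
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2).to(device).eval()

to_tensor = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),          # RGB image in [0, 1], shape (3, 224, 224)
])
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])

def project_color(img01, rgb, alpha):
    """Blend a uniform color layer over an image in [0, 1] RGB space."""
    color = torch.tensor(rgb, device=img01.device).view(3, 1, 1)
    return (1.0 - alpha) * img01 + alpha * color

@torch.no_grad()
def true_class_confidence(img01, label):
    """Query the classifier; only output probabilities are used (no gradients)."""
    x = normalize(img01).unsqueeze(0).to(device)
    return torch.softmax(model(x), dim=1)[0, label].item()

@torch.no_grad()
def random_search_attack(image_path, label, queries=200):
    """Randomly sample projection parameters and keep the most damaging ones."""
    img01 = to_tensor(Image.open(image_path).convert("RGB"))
    best = (None, true_class_confidence(img01, label))
    for _ in range(queries):
        rgb = [random.random() for _ in range(3)]   # projected color
        alpha = random.uniform(0.1, 0.5)            # projection strength
        conf = true_class_confidence(project_color(img01, rgb, alpha), label)
        if conf < best[1]:
            best = ((rgb, alpha), conf)
    return best  # ((rgb, alpha), lowest confidence in the true class)

In a physical deployment, the parameters found by such a digital search would then be reproduced with an actual projector and re-verified against the target model; that step is outside the scope of this sketch.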
