NICGSlowDown: Evaluating the Efficiency Robustness of Neural Image Caption Generation Models

Abstract

Neural image caption generation (NICG) models have received massive attention from the research community due to their excellent performance in visual understanding. Existing work focuses on improving the accuracy of NICG models, while their efficiency is less explored. However, many real-world applications require real-time feedback, which depends heavily on the efficiency of the NICG models. Recent research has observed that the efficiency of NICG models can vary across inputs. This observation exposes a new attack surface of NICG models: an adversary might slightly change inputs to cause the NICG models to consume more computational resources. To further understand such efficiency-oriented threats, in this paper we propose a new attack approach, NICGSlowDown, to evaluate the efficiency robustness of NICG models. Our experimental results show that NICGSlowDown can generate images with human-unnoticeable perturbations that increase NICG model latency by up to 483%. We hope this research will raise the community's concern about the efficiency robustness of NICG models.
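
The intuition behind such an efficiency-oriented attack can be illustrated with a small sketch: because an autoregressive captioner decodes one token at a time, a bounded perturbation that suppresses the end-of-sequence (EOS) token forces more decoding steps and therefore higher latency. The snippet below is a minimal, hedged illustration of this idea, not the authors' implementation; the model interface (`encode`, `init_state`, `decode_step`), the EOS-suppression loss, and all hyperparameters are assumptions made for this example.

```python
# Illustrative sketch only (not the NICGSlowDown code): craft a small, bounded
# perturbation that lowers the EOS probability at every decoding step, so the
# caption grows longer and decoding consumes more time.
import torch


def slowdown_attack(model, image, eos_id, max_len=50, steps=100,
                    eps=8 / 255, alpha=1 / 255):
    """Return a perturbed image that tends to lengthen the generated caption.

    `model` is assumed to expose `encode`, `init_state`, and `decode_step`;
    these names are hypothetical and stand in for any autoregressive captioner.
    """
    delta = torch.zeros_like(image, requires_grad=True)

    for _ in range(steps):
        features = model.encode(image + delta)            # visual features
        state, token = model.init_state(features)         # start decoding
        loss = image.new_zeros(())
        for _ in range(max_len):
            logits, state = model.decode_step(token, state, features)
            probs = torch.softmax(logits, dim=-1)
            loss = loss + probs[..., eos_id].sum()        # accumulate EOS probability
            token = logits.argmax(dim=-1)                 # greedy next token

        loss.backward()                                   # minimize EOS probability
        with torch.no_grad():
            delta -= alpha * delta.grad.sign()            # signed gradient step
            delta.clamp_(-eps, eps)                       # keep perturbation small
            delta.add_(image).clamp_(0, 1).sub_(image)    # stay in valid pixel range
        delta.grad.zero_()

    return (image + delta).detach()
```

The L-infinity bound `eps` keeps the perturbation visually negligible, while the longer caption directly translates into more decoder iterations and thus higher inference latency.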

Publication
In the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Date