Search-and-Attack: Temporally Sparse Adversarial Perturbations on Videos

Modern neural networks are known to be vulnerable to adversarial attacks in various domains. Although most attack methods densely perturb the input values, recent works have shown that deep neural networks (DNNs) are also vulnerable to sparse perturbations. Spatially sparse attacks on images or on individual frames of a video have proven effective, but temporally sparse perturbations on videos have been less explored. In this paper, we present a novel framework, called the Search-and-Attack scheme, to generate temporally sparse adversarial attacks on videos. The Search-and-Attack scheme first retrieves the most vulnerable frames and then attacks only those frames. Since identifying the most vulnerable set of frames is an expensive combinatorial optimization problem, we introduce surrogate objective functions: Magnitude of the Gradients (MoG) and Frame-wise Robustness Intensity (FRI). Combining these surrogates with iterative search schemes, the proposed method achieves performance comparable to state-of-the-art dense attack methods in extensive experiments on three public benchmark datasets (UCF, HMDB, and Kinetics).
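The abstract describes a two-stage procedure: score frames, select the most vulnerable ones, then perturb only those frames. Below is a minimal sketch of that idea, assuming a PyTorch video classifier that consumes clips of shape (B, T, C, H, W). The gradient-magnitude frame scoring and the masked PGD update are illustrative stand-ins for the paper's MoG/FRI objectives and iterative search schemes, and the function names (select_frames_by_gradient, attack_selected_frames) are hypothetical, not from the paper.

```python
# Sketch of a Search-and-Attack style temporally sparse attack.
# Assumes a classifier `model` taking clips of shape (B, T, C, H, W)
# in [0, 1] and integer class labels of shape (B,).
import torch
import torch.nn.functional as F


def select_frames_by_gradient(model, clip, label, k):
    """Score each frame by the L2 norm of the loss gradient w.r.t. that
    frame (an MoG-like surrogate) and return the top-k frame indices."""
    clip = clip.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(clip), label)
    grad = torch.autograd.grad(loss, clip)[0]        # (B, T, C, H, W)
    frame_scores = grad.norm(p=2, dim=(2, 3, 4))     # (B, T)
    return frame_scores.topk(k, dim=1).indices       # (B, k)


def attack_selected_frames(model, clip, label, frame_idx,
                           eps=8 / 255, alpha=2 / 255, steps=10):
    """PGD restricted to the selected frames: a temporal mask keeps the
    perturbation on all other frames exactly zero."""
    mask = torch.zeros_like(clip)
    for b in range(clip.size(0)):
        mask[b, frame_idx[b]] = 1.0

    adv = clip.clone().detach()
    for _ in range(steps):
        adv.requires_grad_(True)
        loss = F.cross_entropy(model(adv), label)
        grad = torch.autograd.grad(loss, adv)[0]
        with torch.no_grad():
            adv = adv + alpha * grad.sign() * mask      # update only chosen frames
            adv = clip + (adv - clip).clamp(-eps, eps)  # project into epsilon ball
            adv = adv.clamp(0, 1)                       # keep a valid clip
    return adv.detach()
```

Under these assumptions, the adversarial clip differs from the original only on the k selected frames, which is the temporal sparsity the abstract refers to; the paper's actual surrogate objectives and search procedures may differ from this sketch.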
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Issue Date
2021
Language
English
Article Type
Article
Citation

IEEE ACCESS, v.9, pp. 146938-146947

ISSN
2169-3536
DOI
10.1109/ACCESS.2021.3124050
URI
http://hdl.handle.net/10203/311018
Appears in Collection
MA-Journal Papers (Journal Papers)