GAN-mixup: Augmenting Across Data Manifolds for Improved Robustness

Abstract
Data augmentation (DA), the process of injecting artificial data points into the training set, has been employed to improve generalization and robustness. In this paper, we propose GAN-MIXUP, a novel DA technique that uses generative models to overcome the limitations of existing DA schemes, including mixup and its variants. GAN-MIXUP generates data points in-between the manifolds of different data classes by making use of conditional GANs. The intuition behind GAN-MIXUP is that augmenting the training set with points that are equidistant across class manifolds endows the classification boundary with a margin, enabling both generalization and robustness. Our experimental results show that models trained with GAN-MIXUP can successfully defend against black-box and white-box attacks and generalize better than existing DA/defense schemes on various synthetic and real datasets.
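The abstract's core idea — conditioning a generator on an interpolated class label so that samples land between class manifolds — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the toy `generator` (fixed class centers plus noise) stands in for a trained conditional GAN, and the function names and the mixing coefficient `lam` are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained conditional generator G(z, y):
# each class has a fixed center, and the "generator" perturbs it with
# the latent code z, so samples conditioned on y land near that class.
CLASS_CENTERS = np.array([[0.0, 0.0],   # class 0 manifold center
                          [4.0, 4.0]])  # class 1 manifold center

def generator(z, y_soft):
    # Condition on a (possibly soft) one-hot label: a mixed label
    # places the output between the two class manifolds.
    return y_soft @ CLASS_CENTERS + 0.1 * z

def gan_mixup_batch(n, class_a=0, class_b=1, lam=0.5, dim=2):
    """Generate n augmented points in-between the manifolds of
    class_a and class_b by interpolating the class condition."""
    z = rng.standard_normal((n, dim))
    y = np.zeros((n, 2))
    y[:, class_a] = lam          # weight lam on class_a,
    y[:, class_b] = 1.0 - lam    # weight (1 - lam) on class_b
    x_aug = generator(z, y)
    return x_aug, y              # augmented inputs with mixed labels

x_aug, y_aug = gan_mixup_batch(8, lam=0.5)
```

With `lam = 0.5` the generated points cluster near the midpoint of the two class manifolds, which is where the paper argues added samples create a margin for the decision boundary; a real conditional GAN would replace the toy `generator` above.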
Publisher
IEEE
Issue Date
2020-07-17
Language
English
Citation
ICML Workshop on Uncertainty & Robustness in Deep Learning, 2020
URI
http://hdl.handle.net/10203/278668
Appears in Collection
EE-Conference Papers(학술회의논문)