CSI: Novelty Detection via Contrastive Learning on Distributionally Shifted Instances

Novelty detection, i.e., identifying whether a given sample is drawn from outside the training distribution, is essential for reliable machine learning. To this end, there have been many attempts at learning a representation well-suited for novelty detection and designing a detection score based on such a representation. In this paper, we propose a simple yet effective method named contrasting shifted instances (CSI), inspired by the recent success of contrastive learning of visual representations. Specifically, in addition to contrasting a given sample with other instances as in conventional contrastive learning methods, our training scheme contrasts the sample with distributionally-shifted augmentations of itself. Based on this, we propose a new detection score that is specific to the proposed training scheme. Our experiments demonstrate the superiority of our method under various novelty detection scenarios, including unlabeled one-class, unlabeled multi-class, and labeled multi-class settings, on various image benchmark datasets. Code and pre-trained models are available at this https URL.
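
The abstract describes the training objective only at a high level. The sketch below is a minimal illustration of that description, under stated assumptions: PyTorch, rotations by 0/90/180/270 degrees as the distribution-shifting augmentations, and a SimCLR-style NT-Xent contrastive loss. The names encoder, augment, nt_xent, and csi_loss are illustrative placeholders, not the authors' released implementation.

    # Sketch of a CSI-style objective: shifted (rotated) copies of a sample are
    # treated as distinct instances, so they act as negatives of the original,
    # while ordinary instance discrimination is kept as in SimCLR.
    import torch
    import torch.nn.functional as F

    def nt_xent(z, temperature=0.5):
        # z: (2N, D) embeddings; rows i and i+N are two stochastic views of the
        # same instance (positive pair); every other row serves as a negative.
        z = F.normalize(z, dim=1)
        n = z.shape[0] // 2
        sim = z @ z.t() / temperature          # pairwise cosine similarities
        sim.fill_diagonal_(float("-inf"))      # never contrast a view with itself
        targets = (torch.arange(2 * n, device=z.device) + n) % (2 * n)
        return F.cross_entropy(sim, targets)

    def csi_loss(encoder, augment, x, temperature=0.5):
        # Enlarge the batch so every rotated copy of every image is its own
        # instance; this realizes "contrast the sample with distributionally-
        # shifted augmentations of itself" from the abstract.
        shifted = torch.cat([torch.rot90(x, k, dims=(2, 3)) for k in range(4)])
        z1 = encoder(augment(shifted))         # first stochastic view per instance
        z2 = encoder(augment(shifted))         # second stochastic view per instance
        return nt_xent(torch.cat([z1, z2]), temperature)

A training step would then compute loss = csi_loss(encoder, augment, images) per mini-batch, where encoder is any embedding network with a projection head and augment applies standard stochastic augmentations (cropping, color jitter, etc.). The detection score mentioned in the abstract is built on top of the resulting representation and is not reproduced here.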
Publisher
Neural Information Processing Systems
Issue Date
2020-12-07
Language
English
Citation
34th Conference on Neural Information Processing Systems (NeurIPS) 2020
URI
http://hdl.handle.net/10203/278229
Appears in Collection
RIMS Conference Papers