Co2L: Contrastive Continual Learning

Recent breakthroughs in self-supervised learning show that such algorithms learn visual representations that transfer better to unseen tasks than those produced by joint-training methods relying on task-specific supervision. In this paper, we find that a similar phenomenon holds in the continual learning context: contrastively learned representations are more robust against catastrophic forgetting than jointly trained representations. Based on this novel observation, we propose a rehearsal-based continual learning algorithm that focuses on continually learning and maintaining transferable representations. More specifically, the proposed scheme (1) learns representations using a contrastive learning objective, and (2) preserves the learned representations with a self-supervised distillation step. We conduct extensive experimental validation on popular benchmark image-classification datasets, where our method sets a new state of the art.
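
To make the two-part objective concrete, below is a minimal PyTorch-style sketch, not the thesis's actual implementation: it pairs a SimCLR-style NT-Xent contrastive loss (a stand-in for the contrastive objective the abstract names) with an instance-similarity distillation term that pulls the current encoder's pairwise-similarity distribution toward that of a frozen snapshot of the past model. The function names, the two-view batch layout, and the weight lam are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def nt_xent_loss(features: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
        """SimCLR-style contrastive loss over two augmented views.

        features: (2N, D) L2-normalized embeddings, where rows i and i+N
        are the two views of sample i (an illustrative layout).
        """
        n = features.shape[0] // 2
        logits = features @ features.T / temperature        # (2N, 2N) similarities
        logits.fill_diagonal_(float('-inf'))                # exclude self-pairs
        # the positive for row i is its other augmented view
        targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
        return F.cross_entropy(logits, targets)

    def relation_distillation_loss(cur: torch.Tensor, frozen: torch.Tensor,
                                   temperature: float = 0.5) -> torch.Tensor:
        """Self-supervised distillation sketch: make the current model's
        instance-similarity distribution match the frozen past model's."""
        def sim_rows(f):
            n = f.shape[0]
            s = f @ f.T / temperature
            off_diag = ~torch.eye(n, dtype=torch.bool, device=f.device)
            return s[off_diag].view(n, n - 1)               # drop self-similarity
        p_past = F.softmax(sim_rows(frozen), dim=1)         # fixed target distribution
        log_p_cur = F.log_softmax(sim_rows(cur), dim=1)
        return -(p_past * log_p_cur).sum(dim=1).mean()      # cross-entropy form

    # Hypothetical training step on a batch mixing new-task data with
    # rehearsal-buffer samples; encoder_old is a frozen earlier snapshot.
    # z_cur = F.normalize(encoder(views), dim=1)
    # with torch.no_grad():
    #     z_old = F.normalize(encoder_old(views), dim=1)
    # loss = nt_xent_loss(z_cur) + lam * relation_distillation_loss(z_cur, z_old)

Under these assumptions, the first term keeps learning transferable representations from the current batch, while the second term rehearses the representational structure of earlier tasks without needing their labels.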
Advisors
Shin, Jinwoo (신진우)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2021
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST) : Graduate School of AI, 2021.8, [iv, 26 p.]

Keywords

continual learning; contrastive learning; self-supervised learning; representation learning; transfer learning

URI
http://hdl.handle.net/10203/292499
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=963749&flag=dissertation
Appears in Collection
AI-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
