Dataset distillation via loss approximation for continual learning

Current neural networks achieve human-comparable performance on many offline tasks. However, because the training environment in the real world is not static, the i.i.d. assumption of Empirical Risk Minimization is violated. Continual learning is a research area that aims to enable a neural network to train in a dynamically changing environment. Continual learning assumes that previously seen data is not reused, in order to verify that the network retains previous knowledge as the training environment changes. The basic approach in continual learning is to additionally train a new dataset on an already trained model, but this leads to so-called catastrophic forgetting, in which performance on previously learned data drops drastically. Many previous works point to the absence of the previous loss function and model a surrogate loss to approximate it. Although these works alleviate catastrophic forgetting, they have several limitations, such as violating the continual-learning assumption, unintentionally changing the surrogate loss, and requiring large additional cost. In this thesis, we propose a new approach to overcoming catastrophic forgetting, Dataset Distillation for Continual Learning (D2CL), which avoids the above shortcomings. D2CL trains a small synthetic dataset that closely approximates the loss function of the original dataset. This thesis also introduces a new proposition that is useful for loss approximation. Finally, several experimental results validate that the proposed method stores more informative data while using only a small amount of additional memory.
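The core idea in the abstract — learning a small synthetic dataset whose loss, viewed as a function of the model weights, approximates the loss of the original dataset — can be illustrated with a toy sketch. This is not the thesis's actual algorithm: the linear-regression setup, the probe-weight distribution, and all hyperparameters below are illustrative assumptions. The synthetic points are optimized by gradient descent to minimize the expected squared gap between the synthetic and real losses over randomly sampled weight vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "original" dataset: a linear-regression task (stand-in for the real data).
n, m, d = 200, 10, 3                          # real size, synthetic size, feature dim
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def mse_loss(w, Xd, yd):
    """Loss L(w; D) of a linear model w on dataset D (mean squared error)."""
    r = Xd @ w - yd
    return (r @ r) / len(yd)

def loss_gap(ws, Xs, ys):
    """Mean squared gap between synthetic and real loss over probe weights."""
    return float(np.mean([(mse_loss(w, Xs, ys) - mse_loss(w, X, y)) ** 2
                          for w in ws]))

# Learnable synthetic dataset: 10 points standing in for 200.
Xs = rng.normal(size=(m, d))
ys = rng.normal(size=m)

# Held-out probe weights to measure how well the loss is approximated.
probe_rng = np.random.default_rng(1)
probe = [probe_rng.normal(size=d) for _ in range(100)]
initial_gap = loss_gap(probe, Xs, ys)

# Gradient descent on E_w[(L_syn(w) - L_real(w))^2] w.r.t. the synthetic data.
lr, steps, batch = 0.01, 4000, 16
for _ in range(steps):
    gX, gy = np.zeros_like(Xs), np.zeros_like(ys)
    for _ in range(batch):
        w = rng.normal(size=d)                    # random probe weights
        r = Xs @ w - ys                           # synthetic residuals
        diff = (r @ r) / m - mse_loss(w, X, y)    # L_syn(w) - L_real(w)
        gX += diff * (2.0 / m) * np.outer(r, w)   # (1/2) dJ/dXs for this probe
        gy += diff * (-2.0 / m) * r               # (1/2) dJ/dys for this probe
    Xs -= lr * (2.0 / batch) * gX
    ys -= lr * (2.0 / batch) * gy

final_gap = loss_gap(probe, Xs, ys)
print(f"loss-approximation gap: {initial_gap:.4f} -> {final_gap:.4f}")
```

The distilled set can then serve as a compact rehearsal buffer: replaying it during later tasks approximately reproduces the gradient signal of the full original dataset at a fraction of the memory cost, which is the role the abstract assigns to D2CL's synthetic data.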
Advisors
Moon, Il-Chul (문일철)
Description
Korea Advanced Institute of Science and Technology (KAIST): Department of Industrial and Systems Engineering
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2022
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: Department of Industrial and Systems Engineering, 2022.2, [iii, 28 p.]

URI
http://hdl.handle.net/10203/308784
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=997791&flag=dissertation
Appears in Collection
IE-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
