XtarNet: Learning to Extract Task-Adaptive Representation for Incremental Few-Shot Learning

DC Field / Value / Language
dc.contributor.author: Yoon, Sung Whan (ko)
dc.contributor.author: Moon, Jaekyun (ko)
dc.contributor.author: Kim, Do Yeon (ko)
dc.contributor.author: Seo, Jun (ko)
dc.identifier.citation: International Conference on Machine Learning (ICML) 2020
dc.description.abstract: Learning novel concepts while preserving prior knowledge is a long-standing challenge in machine learning. The challenge gets greater when a novel task is given with only a few labeled examples, a problem known as incremental few-shot learning. We propose XtarNet, which learns to extract task-adaptive representation (TAR) for facilitating incremental few-shot learning. The method utilizes a backbone network pretrained on a set of base categories while also employing additional modules that are meta-trained across episodes. Given a new task, the novel feature extracted from the meta-trained modules is mixed with the base feature obtained from the pretrained model. The process of combining two different features provides TAR and is also controlled by meta-trained modules. The TAR contains effective information for classifying both novel and base categories. The base and novel classifiers quickly adapt to a given task by utilizing the TAR. Experiments on standard image datasets indicate that XtarNet achieves state-of-the-art incremental few-shot learning performance. The concept of TAR can also be used in conjunction with existing incremental few-shot learning methods; extensive simulation results in fact show that applying TAR enhances the known methods significantly.
dc.title: XtarNet: Learning to Extract Task-Adaptive Representation for Incremental Few-Shot Learning
dc.citation.publicationname: International Conference on Machine Learning (ICML) 2020
dc.contributor.localauthor: Moon, Jaekyun
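The abstract describes combining a base feature from the pretrained backbone with a novel feature from the meta-trained modules, under the control of further meta-trained parameters, to form the task-adaptive representation (TAR). Below is a minimal numpy sketch of that mixing step only, under the simplifying assumption that the combination is an element-wise learned gate; the function name, the gate form, and the toy inputs are illustrative assumptions, not the paper's actual module architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def task_adaptive_representation(base_feat, novel_feat, gate_params):
    """Mix a pretrained base feature with a meta-trained novel feature.

    base_feat, novel_feat : (d,) feature vectors for one example
    gate_params           : (d,) parameters of a hypothetical per-dimension
                            mixing gate standing in for the paper's
                            meta-trained combination modules
    """
    g = sigmoid(gate_params)  # per-dimension mixture weights in (0, 1)
    # The TAR is a convex combination of the two features per dimension;
    # downstream base and novel classifiers would operate on this vector.
    return g * base_feat + (1.0 - g) * novel_feat

# Usage: a 4-dimensional toy example with a neutral gate (all weights 0.5)
base = np.array([1.0, 0.0, 1.0, 0.0])
novel = np.array([0.0, 1.0, 0.0, 1.0])
tar = task_adaptive_representation(base, novel, np.zeros(4))
```

With zero gate parameters the sigmoid yields 0.5 everywhere, so the TAR is the plain average of the two features; meta-training would instead learn per-dimension weights that favor whichever feature is more informative for the current task.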
Appears in Collection
EE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.

