Improving deep imbalanced classification via large-margin mix-up

Many labeled datasets encountered in practice are severely class-imbalanced, and it is well known that modern deep neural networks generalize poorly from such datasets because they overfit the majority classes in the training data. To address this issue, we explored the recent state-of-the-art regularization method Mixup and found that it is also highly effective for class-imbalanced training. Motivated by this, we propose a simple yet novel alternative, coined Boundary-Mixup. Like the original Mixup, it generates synthetic training samples by mixing pairs of inputs and labels, but its key idea is to balance the uncertainty level between classes by generating mixed samples near the decision boundary of the classifier. We demonstrate the effectiveness of Boundary-Mixup on image classification, natural language processing, and continual learning tasks, where it improves over prior baseline methods including the original Mixup.
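
The following is a minimal PyTorch sketch of the idea described in the abstract. The standard Mixup part (drawing lam from Beta(alpha, alpha) and mixing a batch with a permutation of itself) follows Zhang et al.'s Mixup; the boundary variant is only our reading of the abstract: the hypothetical `boundary_shrink` knob pulls lam toward 0.5 so mixed points land near the midpoint between the two samples, i.e. near the decision boundary. The function names and the shrinking rule are illustrative assumptions, not the thesis's exact formulation.

import numpy as np
import torch
import torch.nn.functional as F

def mixup_batch(x, y, alpha=1.0, boundary_shrink=None):
    """Mix a batch with a random permutation of itself.

    Standard Mixup draws lam ~ Beta(alpha, alpha). If boundary_shrink is
    given (a hypothetical knob, not from the thesis), lam is pulled toward
    0.5 so the mixed sample sits near the midpoint of the pair, i.e. near
    the classifier's decision boundary between the two classes.
    """
    lam = float(np.random.beta(alpha, alpha))
    if boundary_shrink is not None:
        lam = 0.5 + (lam - 0.5) * boundary_shrink  # shrink lam toward 0.5
    idx = torch.randperm(x.size(0))
    x_mix = lam * x + (1.0 - lam) * x[idx]
    return x_mix, y, y[idx], lam

def mixup_loss(logits, y_a, y_b, lam):
    # Convex combination of the two cross-entropy terms, as in Mixup.
    return lam * F.cross_entropy(logits, y_a) + \
        (1.0 - lam) * F.cross_entropy(logits, y_b)

# Toy usage: one training step of a linear classifier on random data.
model = torch.nn.Linear(32, 10)
x, y = torch.randn(64, 32), torch.randint(0, 10, (64,))
x_mix, y_a, y_b, lam = mixup_batch(x, y, alpha=1.0, boundary_shrink=0.2)
loss = mixup_loss(model(x_mix), y_a, y_b, lam)
loss.backward()

Setting boundary_shrink close to 0 concentrates the mixes near lam = 0.5 (maximally ambiguous samples on the boundary), while boundary_shrink = 1 recovers standard Mixup.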
Advisors
Shin, Jinwoo
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2019
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: School of Electrical Engineering, 2019.2, [iii, 18 p.]

Keywords

data imbalance; regularization; deep learning; machine learning; mix-up

URI
http://hdl.handle.net/10203/266828
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=867969&flag=dissertation
Appears in Collection
EE-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
