Fine-tuning differentiable architecture search for image classification

Neural Architecture Search (NAS) has gained attention due to its superior classification performance. Differentiable Architecture Search (DARTS) is a popular NAS method because of its computational efficiency. To limit computational cost, DARTS makes numerous approximations, and these approximations result in inferior performance. We propose to fine-tune DARTS using fixed operations, as they are independent of these approximations. Our method offers a good trade-off between the number of parameters and classification accuracy. Furthermore, based on the influence of the input on the performance of a neural network, we propose a dual-stem approach instead of a single-stem approach. Our approach improves top-1 accuracy on the Fashion-MNIST, CompCars, and MIO-TCD datasets by 0.56%, 0.50%, and 0.39%, respectively, compared to state-of-the-art approaches. It also outperforms DARTS, improving accuracy by 0.28%, 1.64%, 0.34%, 4.5%, and 3.27% on the CIFAR-10, CIFAR-100, Fashion-MNIST, CompCars, and MIO-TCD datasets, respectively.
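The dual-stem idea in the abstract can be sketched minimally: the same input is processed by two parallel stem transforms, and their outputs are concatenated along the feature axis before entering the searched cells. The sketch below is an assumption-laden illustration, not the thesis implementation; it uses toy linear stems in NumPy (real stems would be convolutional), and the names `single_stem`/`dual_stem` are hypothetical.

```python
import numpy as np

def single_stem(x, w):
    # single-stem baseline: one linear projection of the input
    return x @ w

def dual_stem(x, w1, w2):
    # dual-stem variant: two parallel stems process the same input;
    # their feature maps are concatenated channel-wise
    return np.concatenate([x @ w1, x @ w2], axis=-1)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))      # batch of 4 inputs, 8 features each
w1 = rng.normal(size=(8, 16))    # weights of stem 1
w2 = rng.normal(size=(8, 16))    # weights of stem 2

out = dual_stem(x, w1, w2)
print(out.shape)                 # (4, 32): feature width doubles vs. single stem
```

The concatenation doubles the feature width presented to the downstream network, which is one plausible way the second stem can enrich the input representation at modest parameter cost.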
Advisors
Je, Min-Kyu (제민규)
Description
Korea Advanced Institute of Science and Technology: School of Electrical Engineering
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2020
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology: School of Electrical Engineering, 2020.8, [iv, 38 p.]

Keywords

Neural Architecture Search; Attention Module; Image Classification; Fine-grained Image Classification

URI
http://hdl.handle.net/10203/285083
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=925247&flag=dissertation
Appears in Collection
EE-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
