Ada-Boundary: accelerating DNN training via adaptive boundary batch selection

Cited 8 times in Web of Science; cited 4 times in Scopus
Neural networks converge faster with help from a smart batch selection strategy. In this regard, we propose Ada-Boundary, a novel and simple adaptive batch selection algorithm that constructs an effective mini-batch according to the learning progress of the model. Our key idea is to exploit confusing samples, for which the model cannot predict labels with high confidence; samples near the current decision boundary are considered the most effective for expediting convergence. Taking advantage of this design, Ada-Boundary maintains its dominance across various degrees of training difficulty. We demonstrate the advantage of Ada-Boundary through extensive experiments using CNNs on five benchmark data sets. Ada-Boundary produced a relative improvement in test error of up to 31.80% over the baseline for a fixed wall-clock training time, thereby achieving faster convergence.
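To make the key idea concrete, below is a minimal sketch of boundary-based batch selection. It is an illustration only: the abstract does not specify the paper's exact distance measure or sampling scheme, so this sketch assumes a simple margin score (gap between the softmax probability of the true class and the highest other class) and a soft weighting toward low-margin, near-boundary samples. The names select_boundary_batch and temperature are illustrative assumptions, not the paper's API.

```python
import numpy as np

def boundary_scores(probs, labels):
    """Margin-style proxy for distance to the decision boundary:
    a small margin means the sample is 'confusing' (near the boundary).
    probs:  (N, C) softmax outputs cached from a previous forward pass
    labels: (N,)   integer class labels
    """
    idx = np.arange(len(labels))
    true_p = probs[idx, labels]
    masked = probs.copy()
    masked[idx, labels] = -np.inf          # exclude the true class
    runner_up = masked.max(axis=1)
    return np.abs(true_p - runner_up)      # 0 == right on the boundary

def select_boundary_batch(probs, labels, batch_size, rng, temperature=0.1):
    """Draw a mini-batch with probability decreasing in boundary distance,
    so near-boundary samples are selected more often (illustrative scheme)."""
    scores = boundary_scores(probs, labels)
    weights = np.exp(-scores / temperature)  # closer to boundary -> heavier
    weights /= weights.sum()
    return rng.choice(len(labels), size=batch_size, replace=False, p=weights)

# Toy usage: 1000 samples, 10 classes, random softmax outputs.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=1000)
labels = rng.integers(0, 10, size=1000)
batch_idx = select_boundary_batch(probs, labels, batch_size=64, rng=rng)
print(batch_idx[:8])
```

In a real training loop, the cached softmax outputs would be refreshed as the model learns, so the selected "boundary" samples adapt to the model's current state, which is the adaptivity the abstract describes.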
Publisher
SPRINGER
Issue Date
2020-09
Language
English
Article Type
Article
Citation

MACHINE LEARNING, v.109, no.9, pp.1837 - 1853

ISSN
0885-6125
DOI
10.1007/s10994-020-05903-6
URI
http://hdl.handle.net/10203/277305
Appears in Collection
IE-Journal Papers (Journal Papers); CS-Journal Papers (Journal Papers)
Files in This Item
000566060200004.pdf (2.54 MB)