DropMax: Adaptive Variational Softmax

Abstract
We propose DropMax, a stochastic version of the softmax classifier that at each iteration drops non-target classes according to dropout probabilities adaptively decided for each instance. Specifically, we overlay binary masking variables over the class output probabilities, which are input-adaptively learned via variational inference. This stochastic regularization has the effect of building an ensemble classifier out of exponentially many classifiers with different decision boundaries. Moreover, learning dropout rates for the non-target classes of each instance allows the classifier to focus more on classification against the most confusing classes. We validate our model on multiple public classification datasets, on which it obtains significantly improved accuracy over the regular softmax classifier and other baselines. Further analysis of the learned dropout probabilities shows that our model indeed selects confusing classes more often when it performs classification.
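The masked-softmax forward pass described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the retain probabilities `rho` (one minus the dropout rates) stand in for the input-adaptive probabilities the paper learns via variational inference, the target class is assumed to always be retained, and the variational training objective is omitted.

```python
import numpy as np

def dropmax_forward(logits, rho, target, rng):
    """Sketch of a DropMax-style masked softmax (illustrative only).

    logits : (K,) class scores o_k for one instance
    rho    : (K,) retain probabilities; in the paper these are
             input-adaptive and learned, here they are fixed toy values
    target : index of the true class, assumed never to be dropped
    """
    # Sample binary masks z_k ~ Bernoulli(rho_k); keep the target class.
    z = (rng.random(logits.shape) < rho).astype(float)
    z[target] = 1.0
    # Masked softmax: dropped classes are excluded from the normalization.
    exp_o = np.exp(logits - logits.max())  # numerically stabilized
    masked = z * exp_o
    return masked / masked.sum()

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.5, -0.5, 0.3])  # toy scores for K = 4 classes
rho = np.array([0.9, 0.4, 0.8, 0.7])      # toy retain probabilities
print(dropmax_forward(logits, rho, target=0, rng=rng))
```

Each sampled mask yields a softmax over a different subset of classes, which is the sense in which repeated stochastic passes form an ensemble of exponentially many classifiers with distinct decision boundaries.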
Publisher
NEURAL INFORMATION PROCESSING SYSTEMS (NIPS)
Issue Date
2018-12
Language
English
Citation
32nd Conference on Neural Information Processing Systems (NIPS)
ISSN
1049-5258
URI
http://hdl.handle.net/10203/274728
Appears in Collection
RIMS Conference Papers
Files in This Item
There are no files associated with this item.