Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models

DC Field | Value | Language
dc.contributor.author | Lee, Cheolhyoung | ko
dc.contributor.author | Cho, Kyunghyun | ko
dc.contributor.author | Kang, Wanmo | ko
dc.date.accessioned | 2021-06-25T05:50:25Z | -
dc.date.available | 2021-06-25T05:50:25Z | -
dc.date.created | 2021-06-22 | -
dc.date.issued | 2020-04-30 | -
dc.identifier.citation | International Conference on Learning Representations (ICLR) | -
dc.identifier.uri | http://hdl.handle.net/10203/286240 | -
dc.description.abstract | In natural language processing, it has been observed recently that generalization could be greatly improved by finetuning a large-scale language model pretrained on a large unlabeled corpus. Despite its recent success and wide adoption, finetuning a large pretrained language model on a downstream task is prone to degenerate performance when there are only a small number of training instances available. In this paper, we introduce a new regularization technique, to which we refer as “mixout”, motivated by dropout. Mixout stochastically mixes the parameters of two models. We show that our mixout technique regularizes learning to minimize the deviation from one of the two models and that the strength of regularization adapts along the optimization trajectory. We empirically evaluate the proposed mixout and its variants on finetuning a pretrained language model on downstream tasks. More specifically, we demonstrate that the stability of finetuning and the average accuracy greatly increase when we use the proposed approach to regularize finetuning of BERT on downstream tasks in GLUE. | -
dc.language | English | -
dc.publisher | International Conference on Learning Representations | -
dc.title | Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | International Conference on Learning Representations (ICLR) | -
dc.identifier.conferencecountry | ET | -
dc.identifier.conferencelocation | Virtual | -
dc.contributor.localauthor | Kang, Wanmo | -
dc.contributor.nonIdAuthor | Lee, Cheolhyoung | -
dc.contributor.nonIdAuthor | Cho, Kyunghyun | -
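The abstract above describes mixout as stochastically mixing the parameters of two models, here the model being finetuned and the pretrained model it started from. Below is a minimal Python/PyTorch sketch of that idea, not the authors' released implementation: the function name `mixout`, the choice `p=0.7`, and the dropout-style rescaling that keeps the output's expectation equal to the current parameters are illustrative assumptions.

```python
import torch

def mixout(current, pretrained, p=0.7):
    """Stochastically mix finetuned parameters with their pretrained values.

    Each element of `current` is replaced by the corresponding element of
    `pretrained` with probability p; the result is then shifted and rescaled
    (as in inverted dropout) so that its expectation equals `current`.
    This is a sketch of the idea described in the abstract, not the
    authors' reference code.
    """
    if not 0.0 <= p < 1.0:
        raise ValueError("mix probability p must be in [0, 1)")
    # 1 -> take the pretrained value, 0 -> keep the current value.
    mask = torch.bernoulli(torch.full_like(current, p))
    mixed = (1.0 - mask) * current + mask * pretrained
    # Shift and rescale so E[output] == current.
    return (mixed - p * pretrained) / (1.0 - p)

# Toy usage: mix a finetuned weight matrix back toward its pretrained counterpart.
torch.manual_seed(0)
pretrained_w = torch.randn(4, 4)
finetuned_w = pretrained_w + 0.1 * torch.randn(4, 4)
print(mixout(finetuned_w, pretrained_w, p=0.7))
```

In expectation the output equals the current parameters, while individual samples are pulled toward the pretrained ones, which is how mixout regularizes the deviation from the pretrained model during finetuning.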
Appears in Collection
MA-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
