Efficient Contrastive Learning via Novel Data Augmentation and Curriculum Learning

Abstract
We introduce EfficientCL, a memory-efficient continual pretraining method that applies contrastive learning with novel data augmentation and curriculum learning. For data augmentation, we stack two types of operations sequentially: cutoff and PCA jittering. As pretraining proceeds, we apply curriculum learning by incrementing the augmentation degree at each difficulty step. Once augmentation is complete, contrastive learning is applied to the projected embeddings of the original and augmented examples. When fine-tuned on the GLUE benchmark, our model outperforms baseline models, especially on sentence-level tasks. Moreover, this improvement is achieved with only 70% of the computational memory required by the baseline model.
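The abstract outlines a complete pipeline: stacked cutoff and PCA-jittering augmentations, a curriculum that raises the augmentation degree over training, and a contrastive objective on projected embeddings of the original and augmented views. The sketch below is a minimal, hypothetical PyTorch rendering of those pieces; the function names, the NT-Xent loss choice, and all hyperparameters (span ratio, sigma, temperature, projection size) are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of the pipeline described in the abstract: cutoff and PCA
# jittering stacked as augmentations, a curriculum over the augmentation
# degree, and a contrastive (NT-Xent) loss on projected embeddings.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn.functional as F


def cutoff(emb: torch.Tensor, ratio: float) -> torch.Tensor:
    """Zero out a random contiguous span of token embeddings.

    emb: (batch, seq_len, hidden); ratio: fraction of the sequence removed.
    """
    batch, seq_len, _ = emb.shape
    out = emb.clone()
    span = max(1, int(seq_len * ratio))
    for i in range(batch):
        start = torch.randint(0, seq_len - span + 1, (1,)).item()
        out[i, start:start + span] = 0.0
    return out


def pca_jitter(emb: torch.Tensor, sigma: float) -> torch.Tensor:
    """Perturb embeddings along their principal components."""
    batch, seq_len, hidden = emb.shape
    flat = emb.reshape(-1, hidden)
    # Low-rank PCA of the token embeddings (centered internally).
    _, s, v = torch.pca_lowrank(flat, q=min(hidden, 8))
    # Random coefficients scaled by each component's singular value.
    alpha = torch.randn(flat.size(0), s.size(0), device=emb.device) * sigma
    noise = (alpha * s) @ v.T
    return (flat + noise).reshape(batch, seq_len, hidden)


def augmentation_degree(step: int, total_steps: int, max_degree: float = 0.3) -> float:
    """Curriculum: the augmentation degree grows as pretraining proceeds."""
    return max_degree * min(1.0, step / total_steps)


def nt_xent(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """NT-Xent contrastive loss between projected original/augmented views."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.T / temperature                      # pairwise cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # positives on the diagonal
    return F.cross_entropy(logits, labels)


# Usage: stack the two augmentations, then contrast pooled projections.
emb = torch.randn(8, 32, 768)                    # stand-in for encoder embeddings
degree = augmentation_degree(step=1000, total_steps=10000)
aug = pca_jitter(cutoff(emb, degree), sigma=degree)
proj = torch.nn.Linear(768, 128)                 # stand-in projection head
loss = nt_xent(proj(emb.mean(dim=1)), proj(aug.mean(dim=1)))
```

Tying the same curriculum value to both the cutoff ratio and the jitter scale mirrors the abstract's single "augmentation degree" that is incremented per difficulty step; in practice the two operations could just as well be scheduled independently.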
Publisher
Empirical Methods in Natural Language Processing (EMNLP 2021)
Issue Date
2021-11
Language
English
Citation
Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1832–1838
URI
http://hdl.handle.net/10203/289001
Appears in Collection
CS-Conference Papers (Conference Papers)