Browse "Kim Jaechul Graduate School of AI(김재철AI대학원)" by Author Jang, Joel

Showing results 1 to 7 of 7

1
Exploring the Benefits of Training Expert Language Models over Instruction Tuning

Jang, Joel; Kim, Seungone; Ye, Seonghyeon; Kim, Doyoung; Logeswaran, Lajanugen; Lee, Moontae; Lee, Kyungjae; et al., ICML 2023, pp.14702 - 14729, International Machine Learning Society (IMLS), 2023-07

2
Fixed Input Parameterization for Efficient Prompting

Choi, Eunbi; Jo, Yongrae; Jang, Joel; Jang, Joonwon; Seo, Minjoon, ACL 2023, pp.8428 - 8441, Association for Computational Linguistics (ACL), 2023-07

3
Gradient Ascent Post-training Enhances Language Model Generalization

Yoon, Dongkeun; Jang, Joel; Kim, Sungdong; Seo, Minjoon, ACL 2023, pp.851 - 864, Association for Computational Linguistics (ACL), 2023-07

4
Knowledge Unlearning for Mitigating Privacy Risks in Language Models

Jang, Joel; Yoon, Dongkeun; Yang, Sohee; Cha, Sungmin; Lee, Moontae; Logeswaran, Lajanugen; Seo, Minjoon, ACL 2023, pp.14389 - 14408, Association for Computational Linguistics (ACL), 2023-07

5
TemporalWiki: A Lifelong Benchmark for Training and Evaluating Ever-Evolving Language Models

Jang, Joel; Ye, Seonghyeon; Lee, Changho; Yang, Sohee; Shin, Joongbo; Han, Janghoon; Kim, Gyeonghun; et al., EMNLP 2022, pp.6237 - 6250, Association for Computational Linguistics (ACL), 2022-12

6
Towards Continual Knowledge Learning of Language Models

Jang, Joel; Ye, Seonghyeon; Yang, Sohee; Shin, Joongbo; Han, Janghoon; Kim, Gyeonghun; Choi, Jungkyu; et al., ICLR 2022, International Conference on Learning Representations (ICLR), 2022-04-27

7
Towards continual knowledge learning of language models = 언어 모델의 지속적인 지식 학습

Jang, Joel; et al., Korea Advanced Institute of Science and Technology (KAIST), 2023
