DistiLLM: Towards Streamlined Distillation for Large Language Models

Publisher
ICML
Issue Date
2024-07-23
Citation

The Forty-first International Conference on Machine Learning

URI
http://hdl.handle.net/10203/320261
Appears in Collection
AI-Conference Papers (학술대회논문)
Files in This Item
There are no files associated with this item.