Enhancing Abstractiveness of Summarization Models through Calibrated Distillation

DC Field | Value | Language
dc.contributor.author | Song, Hwanjun | ko
dc.contributor.author | Shalyminov, Igor | ko
dc.contributor.author | Su, Hang | ko
dc.contributor.author | Singh, Siffi | ko
dc.contributor.author | Yao, Kaisheng | ko
dc.contributor.author | Mansour, Saab | ko
dc.date.accessioned | 2023-12-29T02:00:15Z | -
dc.date.available | 2023-12-29T02:00:15Z | -
dc.date.created | 2023-12-22 | -
dc.date.issued | 2023-12-10 | -
dc.identifier.citation | The 2023 Conference on Empirical Methods in Natural Language Processing | -
dc.identifier.uri | http://hdl.handle.net/10203/317082 | -
dc.language | English | -
dc.publisher | Empirical Methods in Natural Language Processing (EMNLP) | -
dc.title | Enhancing Abstractiveness of Summarization Models through Calibrated Distillation | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | The 2023 Conference on Empirical Methods in Natural Language Processing | -
dc.identifier.conferencecountry | SI | -
dc.identifier.conferencelocation | Resorts World Convention Centre, Singapore | -
dc.contributor.localauthor | Song, Hwanjun | -
dc.contributor.nonIdAuthor | Shalyminov, Igor | -
dc.contributor.nonIdAuthor | Su, Hang | -
dc.contributor.nonIdAuthor | Singh, Siffi | -
dc.contributor.nonIdAuthor | Yao, Kaisheng | -
dc.contributor.nonIdAuthor | Mansour, Saab | -
Appears in Collection
IE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
