GRAM: Graph-based Attention Model for Healthcare Representation Learning

Cited 361 times in Web of Science · Cited 304 times in Scopus
DC Field | Value | Language
dc.contributor.author | Choi, Edward | ko
dc.contributor.author | Bahadori, Mohammad Taha | ko
dc.contributor.author | Song, Le | ko
dc.contributor.author | Stewart, Walter F. | ko
dc.contributor.author | Sun, Jimeng | ko
dc.date.accessioned | 2020-04-22T01:20:28Z | -
dc.date.available | 2020-04-22T01:20:28Z | -
dc.date.created | 2020-04-06 | -
dc.date.issued | 2017-08-13 | -
dc.identifier.citation | 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), pp.787 - 795 | -
dc.identifier.uri | http://hdl.handle.net/10203/273961 | -
dc.description.abstract | Deep learning methods exhibit promising performance for predictive modeling in healthcare, but two important challenges remain. Data insufficiency: often in healthcare predictive modeling, the sample size is insufficient for deep learning methods to achieve satisfactory results. Interpretation: the representations learned by deep learning methods should align with medical knowledge. To address these challenges, we propose the GRaph-based Attention Model (GRAM), which supplements electronic health records (EHR) with the hierarchical information inherent to medical ontologies. Based on the data volume and the ontology structure, GRAM represents a medical concept as a combination of its ancestors in the ontology via an attention mechanism. We compared the predictive performance (i.e., accuracy, data needs, interpretability) of GRAM to various methods, including the recurrent neural network (RNN), on two sequential diagnosis prediction tasks and one heart failure prediction task. Compared to the basic RNN, GRAM achieved 10% higher accuracy for predicting diseases rarely observed in the training data and a 3% improvement in area under the ROC curve for predicting heart failure using an order of magnitude less training data. Additionally, unlike other methods, the medical concept representations learned by GRAM are well aligned with the medical ontology. Finally, GRAM exhibits intuitive attention behavior, adaptively generalizing to higher-level concepts when facing data insufficiency at the lower-level concepts. | -
dc.publisher | ASSOC COMPUTING MACHINERY | -
dc.title | GRAM: Graph-based Attention Model for Healthcare Representation Learning | -
dc.type | Conference | -
dc.identifier.wosid | 000455787300092 | -
dc.type.rims | CONF | -
dc.citation.beginningpage | 787 | -
dc.citation.endingpage | 795 | -
dc.citation.publicationname | 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD) | -
dc.identifier.conferencecountry | US | -
dc.identifier.conferencelocation | Halifax, CANADA | -
dc.identifier.doi | 10.1145/3097983.3098126 | -
dc.contributor.nonIdAuthor | Bahadori, Mohammad Taha | -
dc.contributor.nonIdAuthor | Song, Le | -
dc.contributor.nonIdAuthor | Stewart, Walter F. | -
dc.contributor.nonIdAuthor | Sun, Jimeng | -
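
The abstract above describes GRAM's core mechanism: each medical concept is embedded as an attention-weighted combination of its own basic embedding and the embeddings of its ontology ancestors, with weights produced by a small scoring network followed by a softmax. The snippet below is a minimal NumPy sketch of that idea, not the authors' implementation; the toy ontology, concept names, dimensions, and parameter shapes are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the attention-over-ancestors
# idea described in the abstract: a leaf concept's final embedding is an
# attention-weighted sum over its own basic embedding and those of its
# ontology ancestors. All names and sizes below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def gram_embedding(leaf, ancestors_of, basic_emb, W, b, u):
    """Embed `leaf` as an attention-weighted sum over {leaf} ∪ ancestors(leaf).

    basic_emb : dict concept -> (d,) basic embedding vector
    W, b, u   : parameters of a small scoring network, in the spirit of the
                paper's attention MLP: score = u · tanh(W [e_leaf; e_anc] + b)
    """
    concepts = [leaf] + list(ancestors_of.get(leaf, []))
    e_leaf = basic_emb[leaf]
    # Compatibility score between the leaf and each candidate concept.
    scores = np.array([
        u @ np.tanh(W @ np.concatenate([e_leaf, basic_emb[c]]) + b)
        for c in concepts
    ])
    alpha = softmax(scores)  # attention weights over leaf + ancestors
    return sum(a * basic_emb[c] for a, c in zip(alpha, concepts))

# Hypothetical toy ontology: a heart-failure diagnosis code with two ancestors.
ontology = {"dx_428": ["heart_disease", "circulatory_disease"]}
d, h = 8, 16
emb = {c: rng.normal(size=d)
       for c in ["dx_428", "heart_disease", "circulatory_disease"]}
W, b, u = rng.normal(size=(h, 2 * d)), np.zeros(h), rng.normal(size=h)

g = gram_embedding("dx_428", ontology, emb, W, b, u)
print(g.shape)  # (8,) -- in the full model this feeds a downstream RNN predictor
```

When a rare code has too little training data to learn a reliable embedding of its own, the attention weights can shift toward its higher-level ancestors, which is the "adaptive generalization" behavior the abstract refers to.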
Appears in Collection
RIMS Conference Papers
Files in This Item
There are no files associated with this item.