Deep learning-based household electric energy consumption forecasting

DC Field | Value | Language
dc.contributor.author | Hyeon, Jonghwan | ko
dc.contributor.author | Lee, HyeYoung | ko
dc.contributor.author | Ko, Bowon | ko
dc.contributor.author | Choi, Ho-Jin | ko
dc.date.accessioned | 2020-10-08T01:55:19Z | -
dc.date.available | 2020-10-08T01:55:19Z | -
dc.date.created | 2020-09-21 | -
dc.date.issued | 2020-07 | -
dc.identifier.citation | JOURNAL OF ENGINEERING-JOE, v.2020, no.13, pp.639 - 642 | -
dc.identifier.issn | 2051-3305 | -
dc.identifier.uri | http://hdl.handle.net/10203/276487 | -
dc.description.abstract | With the advent of various electronic products, household electric energy consumption is continuously increasing, so predicting it accurately has become very important. Energy prediction models have been developed for decades using advanced machine learning technologies. Meanwhile, deep learning models are still under active study, and many newer models show state-of-the-art performance, so it is meaningful to repeat the experiment with these new models. Here, the authors predict household electric energy consumption using deep learning models known to be suitable for time-series data. Specifically, a vanilla long short-term memory (LSTM) network, a sequence-to-sequence model, and a sequence-to-sequence model with an attention mechanism are used to predict the electric energy consumption of a household. The vanilla LSTM shows the best performance on the root-mean-square error metric. However, from a graphical point of view, the sequence-to-sequence model appears to predict the energy consumption patterns best, while the vanilla LSTM does not follow the pattern well. Also, to achieve their best performance, the vanilla LSTM, sequence-to-sequence, and sequence-to-sequence-with-attention models should observe the past 72, 72, and 24 h, respectively. | -
dc.language | English | -
dc.publisher | INST ENGINEERING TECHNOLOGY-IET | -
dc.title | Deep learning-based household electric energy consumption forecasting | -
dc.type | Article | -
dc.type.rims | ART | -
dc.citation.volume | 2020 | -
dc.citation.issue | 13 | -
dc.citation.beginningpage | 639 | -
dc.citation.endingpage | 642 | -
dc.citation.publicationname | JOURNAL OF ENGINEERING-JOE | -
dc.identifier.doi | 10.1049/joe.2019.1219 | -
dc.contributor.localauthor | Choi, Ho-Jin | -
dc.contributor.nonIdAuthor | Lee, HyeYoung | -
dc.contributor.nonIdAuthor | Ko, Bowon | -
dc.description.isOpenAccess | Y | -
dc.type.journalArticle | Article; Proceedings Paper | -
dc.subject.keywordAuthor | power consumption | -
dc.subject.keywordAuthor | time series | -
dc.subject.keywordAuthor | learning (artificial intelligence) | -
dc.subject.keywordAuthor | mean square error methods | -
dc.subject.keywordAuthor | load forecasting | -
dc.subject.keywordAuthor | power engineering computing | -
dc.subject.keywordAuthor | recurrent neural nets | -
dc.subject.keywordAuthor | vanilla LSTM | -
dc.subject.keywordAuthor | deep learning model | -
dc.subject.keywordAuthor | energy prediction models | -
dc.subject.keywordAuthor | advanced machine learning technologies | -
dc.subject.keywordAuthor | sequence-to-sequence model | -
dc.subject.keywordAuthor | energy consumption patterns | -
dc.subject.keywordAuthor | household electric energy consumption forecasting | -
dc.subject.keywordAuthor | time-series data | -
dc.subject.keywordAuthor | vanilla long short-term memory | -
dc.subject.keywordAuthor | root-mean-square error metric | -
dc.subject.keywordAuthor | sequence to sequence with attention mechanism | -
dc.subject.keywordAuthor | time 72.0 hour | -
dc.subject.keywordAuthor | time 24.0 hour | -
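The abstract reports that each model observes a fixed window of past hours (72, 72, and 24 h, respectively) to forecast consumption. A minimal sketch of how such fixed-lookback training windows could be cut from an hourly series is shown below; the function name, defaults, and synthetic data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def make_windows(series, lookback, horizon=1):
    """Slice an hourly consumption series into (input, target) pairs.

    Each input covers `lookback` past hours; the target is the next
    `horizon` hours. The 72-hour lookback used below mirrors the value
    reported in the abstract; the rest is a hypothetical setup.
    """
    X, y = [], []
    for start in range(len(series) - lookback - horizon + 1):
        X.append(series[start:start + lookback])
        y.append(series[start + lookback:start + lookback + horizon])
    return np.array(X), np.array(y)

# Toy example: 100 hours of synthetic "consumption" data.
hourly = np.sin(np.linspace(0, 10, 100))
X, y = make_windows(hourly, lookback=72)
print(X.shape, y.shape)  # (28, 72) (28, 1)
```

Each row of `X` would then feed the recurrent model (LSTM or sequence-to-sequence encoder) as one training sequence, with the matching row of `y` as the value to predict.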