Evolving mixture of experts for nonlinear time series modelling and prediction

Cited 1 time in Web of Science; cited 0 times in Scopus
DC Field: Value [Language]
dc.contributor.author: Hong, SG [ko]
dc.contributor.author: Oh, SK [ko]
dc.contributor.author: Kim, MS [ko]
dc.contributor.author: Lee, Ju-Jang [ko]
dc.date.accessioned: 2009-01-12T02:08:05Z
dc.date.available: 2009-01-12T02:08:05Z
dc.date.created: 2012-02-06
dc.date.issued: 2002-01
dc.identifier.citation: ELECTRONICS LETTERS, v.38, no.1, pp.34 - 35
dc.identifier.issn: 0013-5194
dc.identifier.uri: http://hdl.handle.net/10203/8281
dc.description.abstract: The evolutionary structure optimisation (ESO) method for Gaussian radial basis function (RBF) networks has already been presented by the authors. Here, they improve the ESO method in its mutation operator and apply it to a mixture of experts (ME) for modelling and predicting nonlinear time series. The ME implementation provides much better generalisation performance with fewer network parameters, compared to the Gaussian RBF networks.
dc.language: English
dc.language.iso: en_US [en]
dc.publisher: IEE-INST ELEC ENG
dc.title: Evolving mixture of experts for nonlinear time series modelling and prediction
dc.type: Article
dc.identifier.wosid: 000173508400023
dc.identifier.scopusid: 2-s2.0-0037012114
dc.type.rims: ART
dc.citation.volume: 38
dc.citation.issue: 1
dc.citation.beginningpage: 34
dc.citation.endingpage: 35
dc.citation.publicationname: ELECTRONICS LETTERS
dc.embargo.liftdate: 9999-12-31
dc.embargo.terms: 9999-12-31
dc.contributor.localauthor: Lee, Ju-Jang
dc.contributor.nonIdAuthor: Hong, SG
dc.contributor.nonIdAuthor: Oh, SK
dc.contributor.nonIdAuthor: Kim, MS
dc.type.journalArticle: Article
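
The abstract above describes a mixture of experts (ME) built from Gaussian RBF networks whose structure is optimised evolutionarily (ESO with an improved mutation operator). As a rough illustration only, the sketch below combines two fixed-structure Gaussian RBF experts with a distance-based softmax gate for one-step-ahead prediction of a synthetic nonlinear series. The logistic-map data, the two-expert split, the gate form and every hyper-parameter are assumptions made for this example, not taken from the letter, and no evolutionary structure optimisation is performed.

```python
# Illustrative sketch only (not the authors' ESO algorithm): a fixed-structure
# mixture of two Gaussian RBF experts with a distance-based softmax gate,
# applied to one-step-ahead prediction of a synthetic nonlinear series.
# The logistic-map data, expert count, gate and hyper-parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic nonlinear time series: the chaotic logistic map (assumed data set).
T = 600
y = np.empty(T)
y[0] = 0.3
for t in range(1, T):
    y[t] = 4.0 * y[t - 1] * (1.0 - y[t - 1])

# Embed the series: predict y[t] from the two previous samples.
X = np.column_stack([y[1:-1], y[:-2]])   # inputs  (y_{t-1}, y_{t-2})
d = y[2:]                                # targets  y_t
X_tr, d_tr = X[:400], d[:400]
X_te, d_te = X[400:], d[400:]

def rbf_design(X, centres, width):
    """Gaussian RBF design matrix with an appended bias column."""
    dist2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.column_stack([np.exp(-dist2 / (2.0 * width ** 2)),
                            np.ones(len(X))])

def fit_expert(Xc, dc, n_centres=8, width=0.2):
    """One RBF expert: centres drawn from its data, weights by least squares."""
    centres = Xc[rng.choice(len(Xc), size=n_centres, replace=False)]
    w, *_ = np.linalg.lstsq(rbf_design(Xc, centres, width), dc, rcond=None)
    return centres, width, w

def expert_predict(X, expert):
    centres, width, w = expert
    return rbf_design(X, centres, width) @ w

# Split the input space in two and fit one expert per region (a crude stand-in
# for a learned gate; a full ME would train the gate jointly, e.g. by EM).
assign = (X_tr[:, 0] >= 0.5).astype(int)
proto = np.array([X_tr[assign == k].mean(axis=0) for k in range(2)])
experts = [fit_expert(X_tr[assign == k], d_tr[assign == k]) for k in range(2)]

def gate(X, beta=8.0):
    """Softmax gating weights from negative squared distance to each prototype."""
    logits = -beta * ((X[:, None, :] - proto[None, :, :]) ** 2).sum(axis=-1)
    logits -= logits.max(axis=1, keepdims=True)
    g = np.exp(logits)
    return g / g.sum(axis=1, keepdims=True)

def mixture_predict(X):
    G = gate(X)                                              # (N, 2) gate weights
    Y = np.column_stack([expert_predict(X, e) for e in experts])
    return (G * Y).sum(axis=1)                               # gated combination

print("one-step-ahead test MSE:",
      float(np.mean((mixture_predict(X_te) - d_te) ** 2)))
```

In the letter itself, the number and placement of basis functions in each expert are adapted by the improved mutation operator rather than fixed in advance, which is what yields the reported better generalisation with fewer network parameters; the sketch above fixes that structure purely for brevity.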
Appears in Collection:
EE-Journal Papers (저널논문)