Evolutionary optimization for generalized mixture neural network architecture

Theoretical studies have shown that neural networks are capable of approximating any continuous function on a compact domain to any degree of accuracy. However, constructing an optimal neural network requires finding a good tradeoff between fitting the training data and keeping the network architecture simple. A simple neural network is not only more comprehensible and more manageable, but also less likely to overfit the training data; that is, it has a higher capacity to generalize to unseen data. Up to now, the architecture design of neural networks has for the most part been a human expert's job. It depends heavily on expert experience and often requires a tedious trial-and-error process. Although considerable research on growing and pruning methods has pursued the automatic design of architectures, most approaches are susceptible to becoming trapped at structural local minima, and those investigations are likely to be restricted to topological subsets rather than the complete class of network architectures. Furthermore, few principled approaches exist in the literature because of the lack of a well-defined optimality criterion.

In this thesis, a promising solution, a new evolutionary optimization method, is proposed to provide a general, systematic, and unified way to design an optimal neural network from the generalized mixture neural network class. The generalized mixture neural network class is newly defined here to include various kinds of neural networks with a linear-in-the-parameters structure. In the evolutionary process, the use of statistical information criteria in the fitness evaluation provides a principled way of finding an optimal tradeoff between the two conflicting modelling objectives. Furthermore, a practically available Pareto optimal selection is presented to provide the "best practically available networks" for developers of neural networks, instead of recommending only the one network regarded as the best. Therefore, the developers ca...
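The abstract's core idea — evolving the structure of a linear-in-the-parameters network and scoring each candidate with a statistical information criterion — can be illustrated with a deliberately simplified sketch. This is not the thesis algorithm: the Gaussian-RBF basis, the fixed basis width, BIC as the information criterion, and all GA settings below are assumptions made purely for illustration.

```python
# Hedged sketch: a genetic algorithm selects which candidate basis functions
# (here, Gaussian RBF centers) a linear-in-the-parameters model should use,
# with BIC as the fitness so model fit and model size are traded off.
# All modelling choices here are illustrative assumptions, not the thesis method.
import math
import random

random.seed(0)

# Toy regression data: noisy samples of sin(2*pi*x) on [0, 1).
X = [i / 40 for i in range(40)]
Y = [math.sin(2 * math.pi * x) + random.gauss(0, 0.05) for x in X]

CENTERS = [i / 10 for i in range(11)]  # candidate RBF centers (assumed)
WIDTH = 0.15                           # assumed basis width

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        if abs(M[col][col]) < 1e-12:
            return None  # numerically singular: treat structure as invalid
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_bic(mask):
    """Least-squares fit of the linear weights, scored by BIC (lower = better)."""
    cols = [c for c, m in zip(CENTERS, mask) if m]
    k, n = len(cols), len(X)
    if k == 0:
        return float("inf")
    A = [[math.exp(-((x - c) ** 2) / (2 * WIDTH ** 2)) for c in cols] for x in X]
    # Normal equations (A^T A) w = A^T y — fine at this toy scale.
    G = [[sum(A[i][p] * A[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    g = [sum(A[i][p] * Y[i] for i in range(n)) for p in range(k)]
    w = solve(G, g)
    if w is None:
        return float("inf")
    rss = sum((sum(A[i][p] * w[p] for p in range(k)) - Y[i]) ** 2 for i in range(n))
    return n * math.log(max(rss, 1e-12) / n) + k * math.log(n)

def evolve(generations=30, pop_size=20, p_mut=0.1):
    # Seed with the full structure plus random masks; elitism guarantees the
    # final BIC never exceeds the full model's BIC.
    pop = [[1] * len(CENTERS)] + [
        [random.randint(0, 1) for _ in CENTERS] for _ in range(pop_size - 1)
    ]
    for _ in range(generations):
        scored = sorted(pop, key=fit_bic)
        elite, parents = scored[0], scored[: pop_size // 2]
        children = [elite]
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]
            children.append([1 - bit if random.random() < p_mut else bit
                             for bit in child])
        pop = children
    return min(pop, key=fit_bic)

best = evolve()
print("selected centers:", [c for c, m in zip(CENTERS, best) if m])
print("BIC of selected structure:", round(fit_bic(best), 2))
```

The single-objective BIC fitness stands in for the thesis's fuller treatment; a Pareto-style variant would instead keep all structures that are non-dominated in (fit error, parameter count) and present that set to the developer.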
Advisors
Lee, Ju-Jang
Description
Korea Advanced Institute of Science and Technology (KAIST) : Department of Electrical Engineering and Computer Science,
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2002
Identifier
174622/325007 / 000955421
Language
eng
Description

Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST) : Department of Electrical Engineering and Computer Science, 2002.2, [ xii, 110 p. ]

Keywords

evolutionary optimization; generalized mixture neural network; neural network structure optimization

URI
http://hdl.handle.net/10203/35989
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=174622&flag=dissertation
Appears in Collection
EE-Theses_Ph.D. (Doctoral Theses)
Files in This Item
There are no files associated with this item.
