Evolution-based Hessian approximation for hybrid numerical optimization methods = 하이브리드 수치최적화 기법을 위한 진화기반 헤시안 추정

DC Field : Value
dc.contributor.advisor : Tahk, Min-Jea
dc.contributor.advisor : 탁민제
dc.contributor.author : Park, Moon-Su
dc.contributor.author : 박문수
dc.date.accessioned : 2011-12-12T07:01:55Z
dc.date.available : 2011-12-12T07:01:55Z
dc.date.issued : 2009
dc.identifier.uri : http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=327802&flag=dissertation
dc.identifier.uri : http://hdl.handle.net/10203/26386
dc.description : Doctoral thesis - Korea Advanced Institute of Science and Technology (KAIST) : Department of Aerospace Engineering, 2009. 8., [vii, 136 p.]
dc.description.abstract : In this dissertation, Hessian approximation algorithms are proposed to estimate the search direction of quasi-Newton methods for solving optimization problems over continuous parameters. The proposed algorithms differ from well-known quasi-Newton methods such as Symmetric Rank-One (SR1), Davidon-Fletcher-Powell (DFP), and Broyden-Fletcher-Goldfarb-Shanno (BFGS) in that the Hessian matrix is calculated directly from function values rather than from gradient information. The algorithms are designed for a class of hybrid methods that combine evolutionary search with gradient-based methods of quasi-Newton type: the function values already computed for the evolutionary search are reused to estimate the Hessian matrix (or its inverse) as well as the gradient vector. The Hessian matrix is corrected recursively so that its curvature matches the curvature information extracted from the fitness values. For this purpose, two update methods are proposed, one called batch update and the other sequential update. Convergence properties of repeated application of the proposed update schemes are investigated both analytically and numerically. Since the Hessian matrix is estimated independently of the gradient vector, more reliable Hessian estimation is possible with a sparse population than with previous methods built on the classical quasi-Newton updates. Numerical experiments show that the proposed algorithms are highly competitive with state-of-the-art evolutionary algorithms on continuous optimization problems.
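The abstract's core idea, estimating the Hessian and gradient directly from population function values rather than from gradient differences, can be illustrated with a least-squares quadratic fit. This is only a sketch of the general idea, not the dissertation's batch or sequential update formulas, which are not reproduced in this record; `fit_quadratic` and all variable names here are hypothetical.

```python
import numpy as np

def fit_quadratic(x0, X, f):
    """Least-squares fit of f(x) ~ c + g.d + 0.5 d^T H d, with d = x - x0.

    X : (m, n) array of sample points (the "population"),
    f : (m,) array of their function (fitness) values.
    Returns (c, g, H) with H symmetric. Needs m >= 1 + n + n(n+1)/2 samples.
    Illustrative sketch only; not the dissertation's update scheme.
    """
    n = x0.size
    D = X - x0
    cols = [np.ones(len(X))]                     # constant term c
    cols += [D[:, i] for i in range(n)]          # linear terms g_i
    idx = [(i, j) for i in range(n) for j in range(i, n)]
    for i, j in idx:                             # quadratic terms H_ij
        cols.append(0.5 * D[:, i] ** 2 if i == j else D[:, i] * D[:, j])
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, f, rcond=None)
    c, g = coef[0], coef[1:1 + n]
    H = np.zeros((n, n))
    for k, (i, j) in enumerate(idx):
        H[i, j] = H[j, i] = coef[1 + n + k]
    return c, g, H

# Recover the curvature of a known quadratic from fitness values alone.
rng = np.random.default_rng(0)
H_true = np.array([[6.0, 1.0], [1.0, 4.0]])
g_true = np.array([1.0, -1.0])
func = lambda x: 0.5 * x @ H_true @ x + g_true @ x + 5.0
X = rng.normal(size=(12, 2))                     # a small "population"
c, g, H = fit_quadratic(np.zeros(2), X, np.array([func(x) for x in X]))
```

For an exactly quadratic function the fit recovers `H_true` and `g_true` exactly; a quasi-Newton-style search direction then follows from solving `H d = -g`, without any finite-difference gradient evaluations.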
dc.language : eng
dc.publisher : Korea Advanced Institute of Science and Technology (KAIST)
dc.subject : Numerical Optimization Method
dc.subject : Hybrid Algorithm
dc.subject : Evolutionary Computation
dc.subject : Hessian Approximation
dc.subject : Quasi-Newton Method
dc.subject : 수치최적화 기법
dc.subject : 하이브리드 알고리즘
dc.subject : 진화 연산
dc.subject : 헤시안 근사
dc.subject : 의사-뉴튼 방법
dc.title : Evolution-based Hessian approximation for hybrid numerical optimization methods = 하이브리드 수치최적화 기법을 위한 진화기반 헤시안 추정
dc.type : Thesis (Ph.D.)
dc.identifier.CNRN : 327802/325007
dc.description.department : Korea Advanced Institute of Science and Technology (KAIST) : Department of Aerospace Engineering
dc.identifier.uid : 020045092
dc.contributor.localauthor : Tahk, Min-Jea
dc.contributor.localauthor : 탁민제
Appears in Collection : AE-Theses_Ph.D. (Doctoral Theses)