A new global optimization method for univariate constrained twice-differentiable NLP problems

Cited 2 times in Web of Science; cited 0 times in Scopus
In this paper, a new global optimization method is proposed for optimization problems with twice-differentiable objective and constraint functions of a single variable. The method employs a difference-of-convex underestimator and a convex cut function, where the former is a continuous piecewise concave quadratic function and the latter is a convex quadratic function. The main objectives of this research are to construct a concave quadratic underestimator whose lower bound on the objective function can be computed without an iterative local optimizer, and to construct a convex cut function that effectively detects infeasible regions arising from nonconvex constraints. The proposed method is proven to attain finite epsilon-convergence to the global optimum point. Numerical experiments indicate that the proposed method is competitive with another covering method, the index branch-and-bound algorithm, which relies on the Lipschitz constant.
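The key property exploited here is that a concave quadratic underestimator attains its minimum over an interval at one of the endpoints, so the lower bound comes from two function evaluations rather than an iterative local search. The following is a minimal sketch of that idea, not the paper's actual algorithm; the function names (`quad_underestimator_bound`) and the use of a single global curvature bound `m` are illustrative assumptions.

```python
import math

def quad_underestimator_bound(f, df, m, a, b, c=None):
    """Lower-bound f on [a, b] via the quadratic
        q(x) = f(c) + f'(c)*(x - c) + (m/2)*(x - c)**2,
    where m <= min f'' on [a, b], so q(x) <= f(x) by Taylor's theorem.
    If m < 0, q is concave and its minimum over [a, b] lies at an
    endpoint -- no iterative local optimizer is needed (the property
    the paper's underestimator is designed around)."""
    if c is None:
        c = 0.5 * (a + b)  # expand around the interval midpoint
    q = lambda x: f(c) + df(c) * (x - c) + 0.5 * m * (x - c) ** 2
    if m <= 0:
        # concave (or linear) quadratic: endpoint minimum
        return min(q(a), q(b))
    # convex case: stationary point of q, clipped to [a, b]
    x_star = min(max(c - df(c) / m, a), b)
    return q(x_star)

# Example: f(x) = sin(x) on [0, pi]; f''(x) = -sin(x) >= -1, so m = -1
lb = quad_underestimator_bound(math.sin, math.cos, -1.0, 0.0, math.pi)
# lb is a valid lower bound: lb <= min sin(x) on [0, pi] = 0
```

In a branch-and-bound loop, such bounds would be computed on each subinterval and compared against the best feasible objective value found so far to prune regions that cannot contain the global optimum.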
Publisher
SPRINGER
Issue Date
2007-09
Language
English
Article Type
Article
Keywords

ALPHA-BB; CONVEX UNDERESTIMATORS; MULTIEXTREMAL CONSTRAINTS; INTERVAL-ANALYSIS; PROCESS DESIGN; ALGORITHM; SYSTEMS; MINLPS

Citation

JOURNAL OF GLOBAL OPTIMIZATION, v.39, no.1, pp.79 - 100

ISSN
0925-5001
DOI
10.1007/s10898-006-9121-1
URI
http://hdl.handle.net/10203/90007
Appears in Collection
CBE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
