Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions

Cited 2 times in Web of Science; cited 0 times in Scopus
This paper optimizes the step coefficients of first-order methods for smooth convex minimization with respect to the worst-case convergence bound (i.e., efficiency) of the decrease in the gradient norm. This work is based on the performance estimation problem approach. The worst-case gradient bound of the resulting method is optimal up to a constant for large-dimensional smooth convex minimization problems, under an initial bound on the cost function value. This paper then illustrates that the proposed method has a computationally efficient form, similar to that of the optimized gradient method.
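To make the problem setting concrete, the sketch below runs plain fixed-step gradient descent on a smooth convex quadratic and tracks the gradient norm over iterations. This is only an illustrative baseline first-order scheme whose worst-case gradient decrease the paper's optimized step coefficients improve upon; it is not the method proposed in the paper, and the quadratic objective is an assumption made for the example.

```python
import numpy as np

# Illustrative baseline only (NOT the paper's optimized method):
# plain gradient descent with fixed step 1/L on a smooth convex quadratic
#   f(x) = 0.5 * x^T A x,  so  grad f(x) = A x,
# tracking how the gradient norm decreases with the iteration count.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 10))
A = M.T @ M                        # positive semidefinite Hessian
L = np.linalg.eigvalsh(A)[-1]      # smoothness constant = largest eigenvalue

x = rng.standard_normal(10)
grad_norms = [np.linalg.norm(A @ x)]
for _ in range(200):
    x = x - (1.0 / L) * (A @ x)    # gradient step with step size 1/L
    grad_norms.append(np.linalg.norm(A @ x))

print(grad_norms[0], grad_norms[-1])  # the gradient norm shrinks
```

For this baseline the gradient norm decays monotonically; the paper's contribution is choosing non-constant step coefficients so that the worst-case bound on the final gradient norm is provably optimal up to a constant.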
Publisher
SPRINGER/PLENUM PUBLISHERS
Issue Date
2021-01
Language
English
Article Type
Article
Citation

JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, v.188, no.1, pp. 192-219

ISSN
0022-3239
DOI
10.1007/s10957-020-01770-2
URI
http://hdl.handle.net/10203/281132
Appears in Collection
MA-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.