Optimized first-order methods for smooth convex minimization

Cited 91 times in Web of Science; cited 0 times in Scopus.
  • Hits: 437
  • Downloads: 0
DC Field | Value | Language
dc.contributor.author | Kim, Donghwan | ko
dc.contributor.author | Fessler, Jeffrey A. | ko
dc.date.accessioned | 2018-09-18T05:53:45Z | -
dc.date.available | 2018-09-18T05:53:45Z | -
dc.date.created | 2018-08-21 | -
dc.date.issued | 2016-09 | -
dc.identifier.citation | MATHEMATICAL PROGRAMMING, v.159, no.1-2, pp.81 - 107 | -
dc.identifier.issn | 0025-5610 | -
dc.identifier.uri | http://hdl.handle.net/10203/245453 | -
dc.description.abstract | We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1-2):451-482, 2014. doi:10.1007/s10107-013-0653-0) recently described a numerical method for computing the N-iteration optimal step coefficients in a class of first-order algorithms that includes gradient methods, heavy-ball methods (Polyak in USSR Comput Math Math Phys 4(5):1-17, 1964. doi:10.1016/0041-5553(64)90137-5), and Nesterov's fast gradient methods (Nesterov in Sov Math Dokl 27(2):372-376, 1983; Math Program 103(1):127-152, 2005. doi:10.1007/s10107-004-0552-5). However, the numerical method in Drori and Teboulle (2014) is computationally expensive for large N, and the corresponding numerically optimized first-order algorithm in Drori and Teboulle (2014) requires impractical memory and computation for large-scale optimization problems. In this paper, we propose optimized first-order algorithms that achieve a convergence bound that is two times smaller than for Nesterov's fast gradient methods; our bound is found analytically and refines the numerical bound in Drori and Teboulle (2014). Furthermore, the proposed optimized first-order methods have efficient forms that are remarkably similar to Nesterov's fast gradient methods. | -
dc.language | English | -
dc.publisher | SPRINGER HEIDELBERG | -
dc.title | Optimized first-order methods for smooth convex minimization | -
dc.type | Article | -
dc.identifier.wosid | 000382053900003 | -
dc.identifier.scopusid | 2-s2.0-84945208658 | -
dc.type.rims | ART | -
dc.citation.volume | 159 | -
dc.citation.issue | 1-2 | -
dc.citation.beginningpage | 81 | -
dc.citation.endingpage | 107 | -
dc.citation.publicationname | MATHEMATICAL PROGRAMMING | -
dc.identifier.doi | 10.1007/s10107-015-0949-3 | -
dc.contributor.localauthor | Kim, Donghwan | -
dc.contributor.nonIdAuthor | Fessler, Jeffrey A. | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | First-order algorithms | -
dc.subject.keywordAuthor | Convergence bound | -
dc.subject.keywordAuthor | Smooth convex minimization | -
dc.subject.keywordAuthor | Fast gradient methods | -
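The abstract notes that the proposed optimized first-order methods have efficient forms "remarkably similar to Nesterov's fast gradient methods." As a hedged illustration only, the sketch below shows an OGM-style recursion of that shape: a gradient step, a momentum sequence θ_i, and a two-term extrapolation. The exact update, the function name `ogm`, and the toy quadratic problem are our assumptions for illustration, not taken from this record.

```python
import numpy as np

def ogm(grad, L, x0, N):
    """Sketch of an optimized-gradient-method-style recursion (assumed form).

    Like Nesterov's fast gradient method, but with an extra momentum
    term (y_new - x) and a larger final step factor.
    """
    theta = 1.0
    x = x0.copy()
    y = x0.copy()
    for i in range(N):
        y_new = x - grad(x) / L  # plain gradient step with step size 1/L
        if i < N - 1:
            theta_new = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:  # assumed larger factor on the final iteration
            theta_new = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # two-term extrapolation (RHS uses the previous x and y)
        x = (y_new
             + (theta - 1) / theta_new * (y_new - y)
             + theta / theta_new * (y_new - x))
        y, theta = y_new, theta_new
    return x

# Toy smooth convex problem: f(x) = 0.5 x^T A x, minimizer x* = 0.
A = np.diag([1.0, 10.0, 100.0])
L = 100.0                         # Lipschitz constant of the gradient
x0 = np.ones(3)
xN = ogm(lambda x: A @ x, L, x0, N=50)
print(0.5 * xN @ A @ xN)          # final objective value, near 0
```

On this toy quadratic the iterates decrease the objective far below its starting value of 55.5, consistent with the accelerated O(1/N^2)-type behavior the abstract attributes to this class of methods.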
Appears in Collection
MA-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.