Efficient and convergent gradient methods for structured nonconvex-nonconcave minimax problems

DC Field: Value
dc.contributor.advisor: Kim, Donghwan
dc.contributor.advisor: 김동환 (Kim, Donghwan)
dc.contributor.author: Lee, Sucheol
dc.date.accessioned: 2023-06-22T19:33:46Z
dc.date.available: 2023-06-22T19:33:46Z
dc.date.issued: 2022
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=996371&flag=dissertation (en_US)
dc.identifier.uri: http://hdl.handle.net/10203/308552
dc.description: Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST), Department of Mathematical Sciences, 2022.2, [v, 55 p.]
dc.description.abstract: Modern minimax problems, such as generative adversarial networks, adversarial training, and fair training, are usually posed in a nonconvex-nonconcave setting. There are two main types of gradient methods for solving minimax problems: single-step methods and multi-step methods. However, existing methods either converge slowly or are not guaranteed to converge. Specifically, the best known rate for a single-step method is only O(1/k) under the structured nonconvex-nonconcave setting considered here, and existing multi-step methods are not guaranteed to converge under the same setting. This dissertation therefore provides two new methods, one single-step and one multi-step, that respectively achieve a faster rate and guarantee convergence under the structured nonconvex-nonconcave setting. First, we propose an efficient single-step method, named the fast extragradient (FEG) method, which, for the first time, achieves the optimal O(1/k^2) rate on the squared gradient norm under the negative comonotonicity condition on the saddle gradient operator. Next, we propose a multi-step method, named the semi-anchored multi-step gradient descent ascent (SA-MGDA) method. SA-MGDA attains an O(1/k) rate on the squared gradient norm under the weak Minty variational inequality condition on the saddle gradient operator, which is weaker than negative comonotonicity.
dc.language: eng
dc.publisher: Korea Advanced Institute of Science and Technology (KAIST)
dc.title: Efficient and convergent gradient methods for structured nonconvex-nonconcave minimax problems
dc.title.alternative: 구조화된 비볼록-비오목 최소 최대화 문제를 위한 효율적이고 수렴하는 경사 방법들
dc.type: Thesis (Ph.D.)
dc.identifier.CNRN: 325007
dc.description.department: Korea Advanced Institute of Science and Technology (KAIST), Department of Mathematical Sciences
dc.contributor.alternativeauthor: 이수철 (Lee, Sucheol)
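The abstract above describes the fast extragradient (FEG) method, which combines an extragradient step with an anchoring term pulling the iterates toward the initial point. The sketch below illustrates this anchored-extragradient structure on a toy bilinear saddle problem; the step size `eta`, the anchor coefficient `beta_k = 1/(k+2)`, and the problem itself are illustrative assumptions, not the exact FEG parameters or setting from the dissertation.

```python
import numpy as np

# Toy saddle problem: f(x, y) = x * y (used only for illustration).
# Saddle gradient operator F(z) = (df/dx, -df/dy) = (y, -x); solution at (0, 0).
def F(z):
    x, y = z
    return np.array([y, -x])

def anchored_extragradient(z0, steps=200, eta=0.1):
    """Extragradient iteration with an anchoring term pulling toward z0.

    beta_k = 1/(k+2) and the constant step size eta are illustrative
    choices, not the exact FEG coefficients from the dissertation.
    """
    z = z0.copy()
    for k in range(steps):
        beta = 1.0 / (k + 2)
        # Extrapolation step, anchored toward the initial point z0.
        z_half = z + beta * (z0 - z) - eta * (1.0 - beta) * F(z)
        # Update step reusing the anchor but the gradient at the
        # extrapolated point z_half.
        z = z + beta * (z0 - z) - eta * (1.0 - beta) * F(z_half)
    return z

z_final = anchored_extragradient(np.array([1.0, 1.0]))
print(np.linalg.norm(F(z_final)))  # gradient norm shrinks toward 0
```

The anchor weight `beta` decays over iterations, so early steps are pulled toward `z0` (which stabilizes the iteration) while later steps behave like a plain extragradient update; this anchoring is what underlies the accelerated gradient-norm rate claimed in the abstract.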
Appears in Collection
MA-Theses_Ph.D.(박사논문)
Files in This Item
There are no files associated with this item.
