Efficient and convergent gradient methods for structured nonconvex-nonconcave minimax problems

Modern minimax problems, such as generative adversarial networks, adversarial training, and fair training, typically fall under a nonconvex-nonconcave setting. There are mainly two types of gradient methods for solving minimax problems: single-step methods and multi-step methods. However, existing methods either converge slowly or are not guaranteed to converge. Specifically, the best known rate for a single-step method is only O(1/k) under the structured nonconvex-nonconcave setting considered here, and existing multi-step methods are not guaranteed to converge under the same setting. This dissertation therefore provides two new methods, one single-step and one multi-step, that achieve a faster rate and guarantee convergence, respectively, under the structured nonconvex-nonconcave setting. First, we propose an efficient single-step method, named the fast extragradient (FEG) method, which, for the first time, achieves the optimal O(1/k^2) rate on the squared gradient norm under the negative comonotonicity condition on the saddle gradient operator. Next, we propose a multi-step method, named the semi-anchored multi-step gradient descent ascent (SA-MGDA) method. SA-MGDA achieves an O(1/k) rate on the squared gradient norm under the weak Minty variational inequality condition on the saddle gradient operator, which is weaker than negative comonotonicity.
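
To make the single-step setting concrete, the following is a minimal Python sketch of a plain extragradient iteration with anchoring toward the initial point, in the spirit of anchored extragradient-type methods. The operator F, the step size alpha, and the anchoring weights beta_k are illustrative assumptions; this sketch does not reproduce the exact coefficients or the convergence guarantees of the dissertation's FEG method.

    import numpy as np

    def anchored_extragradient(F, z0, alpha=0.1, num_iters=100):
        # F(z) returns the saddle gradient (grad_x f(x, y), -grad_y f(x, y))
        # stacked as one vector; alpha and beta_k below are assumed values.
        z = z0.copy()
        for k in range(num_iters):
            beta_k = 1.0 / (k + 2)                          # anchoring weight toward z0
            z_half = z + beta_k * (z0 - z) - alpha * F(z)   # extrapolation step
            z = z + beta_k * (z0 - z) - alpha * F(z_half)   # correction step
        return z

    # Example on the bilinear saddle problem f(x, y) = x * y, whose saddle gradient is (y, -x).
    F = lambda z: np.array([z[1], -z[0]])
    print(anchored_extragradient(F, np.array([1.0, 1.0])))
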
Advisors
Kim, Donghwan (김동환)
Description
Korea Advanced Institute of Science and Technology (KAIST) : Department of Mathematical Sciences
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2022
Identifier
325007
Language
eng
Description

Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology : Department of Mathematical Sciences, 2022.2, [v, 55 p.]

URI
http://hdl.handle.net/10203/308552
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=996371&flag=dissertation
Appears in Collection
MA-Theses_Ph.D. (Doctoral Theses)
Files in This Item
There are no files associated with this item.
