Optimal convergence rate of without-replacement SGD

We study convergence lower bounds of without-replacement stochastic gradient descent (SGD) for solving smooth (strongly) convex finite-sum minimization problems. Unlike most existing results, which focus on lower bounds for SGD with Random Reshuffling (where an independent random permutation is drawn at each epoch to determine the order in which component functions are sampled), we seek bounds applicable to arbitrary permutation-based SGD; this class includes all variants that carefully select the best permutation, going beyond Random Reshuffling. Notably, our bounds significantly improve upon existing lower bounds and tightly match the upper bounds shown for the recently proposed algorithm GraB, across all factors: the number of component functions $n$, the number of epochs $K$, and the condition number $\kappa$. We also extend our bounds to arbitrary weighted averages of the iterates, yielding a generalized result that covers the final iterate as a special case.
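To make the setting concrete, the following is a minimal sketch of permutation-based without-replacement SGD, instantiated with Random Reshuffling and with a fixed ordering. It is not code from the thesis; the toy objective, step size, and ordering rules are illustrative assumptions.

```python
import numpy as np

def permutation_sgd(grads, x0, lr, n_epochs, permute):
    """Without-replacement SGD: every epoch visits each of the n
    component gradients exactly once, in the order given by `permute`."""
    n, x = len(grads), x0.copy()
    for epoch in range(n_epochs):
        for i in permute(epoch, n):   # permutation used for this epoch
            x -= lr * grads[i](x)     # one step on component i
    return x

# Toy least-squares components: f_i(x) = 0.5 * (a_i @ x - b_i)^2
rng = np.random.default_rng(0)
n, d = 8, 3
A, b = rng.normal(size=(n, d)), rng.normal(size=n)
grads = [lambda x, a=A[i], bi=b[i]: a * (a @ x - bi) for i in range(n)]

# Random Reshuffling: a fresh uniform permutation at every epoch.
random_reshuffling = lambda epoch, n: rng.permutation(n)
# A fixed order reused every epoch (incremental-gradient ordering);
# arbitrary permutation-based SGD may use any such ordering rule.
fixed_order = lambda epoch, n: np.arange(n)

print(permutation_sgd(grads, np.zeros(d), 0.05, 50, random_reshuffling))
print(permutation_sgd(grads, np.zeros(d), 0.05, 50, fixed_order))
```

Algorithms such as GraB fit the same template: they differ only in the rule used to choose the per-epoch permutation.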
Advisors
윤철희
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2024
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI, 2024.2, [iv, 45 p.]

Keywords

Stochastic gradient descent; Convergence lower bound; Without-replacement sampling

URI
http://hdl.handle.net/10203/321374
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1096079&flag=dissertation
Appears in Collection
AI-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
