Directional Analysis of Stochastic Gradient Descent via von Mises-Fisher Distributions in Deep Learning

Although stochastic gradient descent (SGD) is a driving force behind the recent success of deep learning, our understanding of its dynamics in high-dimensional parameter spaces is limited. In recent years, some researchers have used the stochasticity of minibatch gradients, or the signal-to-noise ratio, to better characterize the learning dynamics of SGD. Inspired by this work, we analyze SGD from a geometrical perspective by inspecting the stochasticity of the norms and directions of minibatch gradients. We model the directional concentration of minibatch gradients with the von Mises-Fisher (vMF) distribution and show that the directional uniformity of minibatch gradients increases over the course of SGD. We verify this result empirically with deep convolutional networks and observe that gradient stochasticity correlates more strongly with the proposed directional uniformity than with the stochasticity of gradient norms, suggesting that the directional statistics of minibatch gradients are a major factor behind SGD.
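The following Python snippet is a minimal sketch of the vMF concentration idea described in the abstract, not the authors' implementation: it normalizes a batch of gradient vectors to unit directions and estimates the vMF concentration parameter kappa with the standard approximation of Banerjee et al. (2005); a small kappa indicates near-uniform directions. The gradient arrays here are synthetic stand-ins for real minibatch gradients.

```python
import numpy as np

def vmf_concentration(gradients):
    """Estimate the vMF concentration kappa from gradient row vectors in R^d.

    gradients: array of shape (n, d), one minibatch gradient per row.
    A large kappa means the directions cluster tightly around a mean
    direction; kappa near 0 means the directions are nearly uniform.
    """
    # Project each gradient onto the unit sphere: only direction matters.
    directions = gradients / np.linalg.norm(gradients, axis=1, keepdims=True)
    d = directions.shape[1]
    # Mean resultant length: 1 for identical directions, ~0 for uniform ones.
    r_bar = np.linalg.norm(directions.mean(axis=0))
    # Banerjee et al. (2005) approximation to the maximum-likelihood kappa.
    return r_bar * (d - r_bar**2) / (1.0 - r_bar**2)

rng = np.random.default_rng(0)
# Gradients sharing a common direction vs. isotropic random gradients.
concentrated = rng.normal(loc=1.0, scale=0.1, size=(256, 1000))
uniform = rng.normal(loc=0.0, scale=1.0, size=(256, 1000))
print(vmf_concentration(concentrated))  # large kappa: directions agree
print(vmf_concentration(uniform))       # kappa near 0: nearly uniform
```

Under the paper's observation, kappa estimated this way from minibatch gradients would tend to decrease over the course of SGD as the directions become more uniform.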
Publisher
Neural Information Processing Systems (NIPS) Foundation
Issue Date
2018-12-08
Language
English
Citation
Thirty-second Conference on Neural Information Processing Systems
URI
http://hdl.handle.net/10203/249201
Appears in Collection
MA-Conference Papers (Conference Papers)