Bridging the gap between deterministic and probabilistic recurrent neural networks in a predictive coding framework = 예측코딩 기반 회귀신경망의 결정론적 모형과 확률론적 모형의 간극 해소

This thesis considers two approaches to dealing with fluctuations in sequential patterns: a deterministic approach and a probabilistic one. A deterministic Recurrent Neural Network (RNN) is first proposed within the predictive-coding framework, and the proposed model is tested in both simulation and robotic experiments. It is shown that when fluctuating sequential patterns are provided to the model during the learning process, transient states form in the internal dynamics of the model alongside a limit-cycle attractor. The deterministic model can deal with fluctuations in unknown test patterns by exploiting these transient states during error regression, a method based on the predictive-coding framework. However, the model's capability becomes limited as the number of training patterns increases. This limitation is addressed by proposing a stochastic RNN based on the predictive-coding framework and recent advances in variational Bayes. It is shown that the stochastic model can deal with fluctuating patterns in both simulation and robotic experiments by balancing prediction error against the divergence between its prior and posterior models. We also consider linking the deterministic and stochastic RNNs by shifting a network parameter, referred to as the meta prior, during the learning process of the stochastic model. It is observed that a larger value of the meta prior makes the network more deterministic, resulting in exact learning, whereas a smaller value makes it more random, resulting in a loss of detail; generalization is maximized at an intermediate meta prior. We compare the performance of the deterministic and stochastic RNNs on fluctuating patterns in a robotic experiment in which the number of training patterns is increased relative to the first robotic experiment with the deterministic model.
It is shown that the stochastic model significantly outperforms the deterministic one by making use of time-varying variance. The limitation of the deterministic model can thus be overcome by introducing uncertainty into its hidden layers.
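The error regression mentioned above adapts a model's internal states at test time, with the learned weights frozen, so that predictions over a recent window match the observed pattern. As a minimal sketch only (not the thesis's implementation): the names `rollout`, `lr`, and `steps` are hypothetical, and a real predictive-coding setup would backpropagate through the recurrent network rather than use the finite-difference gradient shown here.

```python
import numpy as np

def error_regression(rollout, target_window, z0, lr=0.1, steps=50):
    """Adapt the latent initial state z0 at test time by gradient descent
    on the prediction error over a past window; network weights stay fixed.

    rollout: callable mapping a latent state to a predicted window
             (stands in for a forward pass of a trained RNN).
    """
    z = z0.copy()
    eps = 1e-4  # step for the finite-difference gradient estimate
    for _ in range(steps):
        base = np.mean((rollout(z) - target_window) ** 2)
        grad = np.zeros_like(z)
        for i in range(z.size):
            zp = z.copy()
            zp[i] += eps
            grad[i] = (np.mean((rollout(zp) - target_window) ** 2) - base) / eps
        z -= lr * grad  # move the latent state to reduce prediction error
    return z
```

With an identity `rollout`, the regressed state converges toward the target window, illustrating the mechanism in its simplest form.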
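The stochastic model's balance between prediction error and prior-posterior divergence, weighted by the meta prior, can be sketched as a free-energy-style loss. This is a schematic under assumed diagonal-Gaussian prior and posterior; the function names and the exact placement of the `meta_prior` weight are assumptions, not the thesis's definitions.

```python
import numpy as np

def gaussian_kl(mu_q, sigma_q, mu_p, sigma_p):
    """Closed-form KL(q || p) between two diagonal Gaussians."""
    return np.sum(
        np.log(sigma_p / sigma_q)
        + (sigma_q**2 + (mu_q - mu_p) ** 2) / (2.0 * sigma_p**2)
        - 0.5
    )

def free_energy(pred, target, mu_q, sigma_q, mu_p, sigma_p, meta_prior):
    """Per-sequence loss: prediction error plus a meta-prior-weighted KL term.

    A larger meta_prior penalizes posterior-prior divergence more strongly,
    pushing the network toward deterministic behavior; a smaller one lets
    the latent states vary, making the network more random.
    """
    prediction_error = np.mean((pred - target) ** 2)
    kl = gaussian_kl(mu_q, sigma_q, mu_p, sigma_p)
    return prediction_error + meta_prior * kl
```

When the posterior matches the prior the KL term vanishes and the loss reduces to the prediction error alone, which is the deterministic limit described above.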
Advisors
Yoo, Hoi-Jun (유회준)
Description
KAIST, School of Electrical Engineering
Publisher
KAIST (Korea Advanced Institute of Science and Technology)
Issue Date
2019
Identifier
325007
Language
eng
Description

Thesis (Ph.D.) - KAIST : School of Electrical Engineering, 2019.2, [v, 85 p.]

Keywords

predictive coding; recurrent neural networks; variational Bayes; generative model; error regression

URI
http://hdl.handle.net/10203/265239
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=842228&flag=dissertation
Appears in Collection
EE-Theses_Ph.D.(박사논문)
Files in This Item
There are no files associated with this item.
