This thesis considers two approaches to dealing with fluctuations in sequential patterns: a deterministic approach and a probabilistic approach. A deterministic Recurrent Neural Network (RNN) is first proposed within the predictive-coding framework, and the proposed model is tested in both simulation and robotic experiments. It is shown that when fluctuating sequential patterns are provided to the model during the learning process, transient states form in the model's internal dynamics alongside the limit-cycle attractor. The deterministic model can deal with fluctuations in unknown test patterns by exploiting these transient states during error regression, a method based on the predictive-coding framework. However, the model's capability is limited when the number of training patterns increases.
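The idea of error regression can be illustrated with a minimal sketch: after learning, the network weights are frozen, and only the internal state is updated by gradient descent so that the network's predictions over a recent window match the observed pattern. The toy network below (its size, weight values, and sine-wave target window are assumptions for illustration, not the thesis model) uses a numerical gradient for clarity.

```python
import numpy as np

# Toy illustration of error regression: weights are frozen; only the
# internal state h is adapted to reduce the windowed prediction error.
rng = np.random.default_rng(0)
W = 0.5 * rng.standard_normal((4, 4))   # frozen recurrent weights (toy values)
Wo = 0.5 * rng.standard_normal((1, 4))  # frozen output weights (toy values)

def predict(h0, steps):
    """Roll the toy RNN forward from internal state h0 and collect outputs."""
    h, ys = h0, []
    for _ in range(steps):
        h = np.tanh(W @ h)
        ys.append((Wo @ h)[0])
    return np.array(ys)

def window_error(h0, target):
    """Squared prediction error over the observation window."""
    return float(np.sum((predict(h0, len(target)) - target) ** 2))

target = np.sin(np.linspace(0.0, np.pi, 5))  # observed fluctuating window

h = np.zeros(4)           # internal state to be regressed
eps, lr = 1e-5, 0.1
errs = [window_error(h, target)]
for _ in range(200):
    base = window_error(h, target)
    grad = np.zeros_like(h)
    for i in range(h.size):           # numerical gradient w.r.t. the state
        hp = h.copy()
        hp[i] += eps
        grad[i] = (window_error(hp, target) - base) / eps
    h -= lr * grad                    # only the state changes, never the weights
    errs.append(window_error(h, target))
```

After the regression loop, the prediction error over the window has decreased from its initial value; in the thesis setting the same state update lets the model track fluctuations in unseen test patterns.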
This limitation is addressed by proposing a stochastic RNN based on the predictive-coding framework and recent advancements in variational Bayes. It is shown that the stochastic model can deal with fluctuating patterns in both simulation and robotic experiments by balancing prediction errors against the divergence between its prior and posterior models. We also examine the link between deterministic and stochastic RNNs by varying a network parameter, referred to as the meta prior, during the learning process of the stochastic model. With a larger value of the meta prior, the network becomes more deterministic, resulting in exact learning, whereas with a smaller meta prior it becomes more random and loses details. Generalization is maximized at an intermediate meta prior. We compare the performance of the deterministic and stochastic RNNs on fluctuating patterns in a robotic experiment in which the number of training patterns is increased compared with the first robotic experiment of the deterministic model. It is shown that the stochastic model significantly outperforms the deterministic one by using time-varying variance. Thus, the limitation of the deterministic model can be overcome by introducing uncertainty into its hidden layers.
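The balance described above can be sketched as a weighted objective: a prediction-error term plus the KL divergence between the posterior and the prior, scaled by the meta prior w. The one-dimensional Gaussian toy below (the variable names, Gaussian setup, and closed-form minimizer are illustrative assumptions, not the thesis implementation) shows the trade-off mechanically: a small w lets the posterior deviate from the prior to reduce prediction error, while a large w pins the posterior to the prior; in the full model these forces act through the network dynamics, producing the regimes described above.

```python
import numpy as np

# Toy variational objective: prediction error + w * KL(posterior || prior),
# where w plays the role of the meta prior (1-D Gaussian sketch).
def kl_gauss(mu_q, sig_q, mu_p, sig_p):
    """KL( N(mu_q, sig_q^2) || N(mu_p, sig_p^2) ) for 1-D Gaussians."""
    return (np.log(sig_p / sig_q)
            + (sig_q**2 + (mu_q - mu_p)**2) / (2.0 * sig_p**2) - 0.5)

def loss(mu_q, target, w, mu_p=0.0, sig_q=0.1, sig_p=1.0):
    pred_err = (mu_q - target) ** 2          # squared prediction error
    return pred_err + w * kl_gauss(mu_q, sig_q, mu_p, sig_p)

# With mu_p = 0 and sig_p = 1 the objective is quadratic in mu_q, so the
# optimal posterior mean has a closed form: mu* = 2 * target / (2 + w).
target = 1.0
mus = {w: 2.0 * target / (2.0 + w) for w in (0.01, 1.0, 100.0)}
# small w  -> mu* near the data (divergence from the prior is cheap)
# large w  -> mu* near the prior mean (divergence is heavily penalized)
```

For instance, `mus[0.01]` lies close to the target while `mus[100.0]` lies close to the prior mean, making explicit how the meta prior shifts the model between data-driven and prior-driven solutions.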