Regularization is essential when training deep neural networks, whose huge numbers of parameters give them enough flexibility to overfit large-scale data across many domains. Dropout is one of the most popular regularization techniques in this field. Bayesian neural networks offer theoretically grounded regularization by marginalizing over the distribution of parameters rather than averaging models, but intractable integrals and sampling costs are obstacles to the Bayesian approach.
Recent studies, however, have proposed efficient variational inference methods that make these probabilistic models tractable. In this paper, we propose a Bayesian dropout neural network that learns dropout rates adaptively from the data. We also propose an alternating algorithm based on a recursive empirical Bayes scheme, which learns the hyper-parameters from validation data. This algorithm avoids over-fitting to the training data and further improves performance. We report performance comparisons with other dropout methods.
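As a rough illustration of such an alternating scheme, the sketch below makes dropout rates learnable with a binary Concrete (relaxed-Bernoulli) mask, one common way to obtain differentiable dropout rates, and alternates weight updates on training batches with dropout-rate updates on validation batches. This is a minimal sketch under those assumptions, not necessarily the paper's exact algorithm; the names (`ConcreteDropout`, `logit_p`) and the synthetic data loaders are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

class ConcreteDropout(nn.Module):
    """Dropout whose drop rate is a learnable parameter, via a relaxed-Bernoulli mask.
    (Illustrative stand-in for an adaptive-dropout layer; not the paper's exact method.)"""
    def __init__(self, init_rate=0.5, temperature=0.1):
        super().__init__()
        # Parameterize the drop rate through a logit so it stays in (0, 1).
        self.logit_p = nn.Parameter(torch.logit(torch.tensor(init_rate)))
        self.temperature = temperature

    def forward(self, x):
        p = torch.sigmoid(self.logit_p)  # current drop probability
        if not self.training:
            return x  # inverted dropout: no rescaling needed at test time
        eps = 1e-7
        u = torch.rand_like(x)
        # Relaxed Bernoulli sample: differentiable w.r.t. logit_p.
        drop = torch.sigmoid((torch.log(p + eps) - torch.log(1 - p + eps)
                              + torch.log(u + eps) - torch.log(1 - u + eps))
                             / self.temperature)
        return x * (1.0 - drop) / (1.0 - p + eps)

# Synthetic stand-ins for real training and validation sets.
train_loader = DataLoader(TensorDataset(torch.randn(512, 784),
                                        torch.randint(0, 10, (512,))), batch_size=64)
val_loader = DataLoader(TensorDataset(torch.randn(128, 784),
                                      torch.randint(0, 10, (128,))), batch_size=64)

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                      ConcreteDropout(), nn.Linear(256, 10))
drop_params = [m.logit_p for m in model if isinstance(m, ConcreteDropout)]
weight_params = [p for p in model.parameters()
                 if all(p is not q for q in drop_params)]
opt_w = torch.optim.Adam(weight_params, lr=1e-3)  # network weights
opt_p = torch.optim.Adam(drop_params, lr=1e-2)    # dropout-rate hyper-parameters

for epoch in range(10):
    model.train()
    # Step 1: update network weights on training data, dropout rates held fixed.
    for x, y in train_loader:
        opt_w.zero_grad()
        F.cross_entropy(model(x), y).backward()
        opt_w.step()
    # Step 2: update dropout rates on validation data, weights held fixed.
    for x, y in val_loader:
        opt_p.zero_grad()
        F.cross_entropy(model(x), y).backward()
        opt_p.step()
```

The two optimizers separate the roles: the weights are fit to the training set, while the dropout-rate hyper-parameters are fit to held-out validation data, which is what discourages over-fitting in an empirical-Bayes-style alternation.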