Deep neural networks have achieved great increases in performance. However, mobile devices that are not equipped with a GPU (Graphics Processing Unit) or a powerful CPU (Central Processing Unit) still cannot process such large amounts of data in real time. In this paper, preliminary results on spike neural encoding methods that reduce the amount of input data and the computational load by mimicking neuronal firing are presented. To this end, two neuron models, the leaky integrate-and-fire (LIF) model and a simplified IF model, are exploited to transform input images into spike images. For evaluation, the MNIST dataset is encoded and tested in deep neural networks to check for loss of information. The proposed spike encoding modules based on neuron models can greatly help reduce the required computation on low-powered mobile devices by using spike input data.
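
To illustrate the encoding idea described above, the following is a minimal sketch (not the paper's implementation) of transforming a grayscale image into a spike train with one LIF neuron per pixel; the `decay` and `threshold` values are arbitrary assumptions chosen for illustration:

```python
import numpy as np

def lif_encode(image, timesteps=20, decay=0.9, threshold=1.0):
    """Encode a grayscale image (intensities in [0, 1]) into a binary
    spike train using one leaky integrate-and-fire neuron per pixel.
    The decay and threshold values are illustrative assumptions."""
    v = np.zeros_like(image, dtype=float)             # membrane potentials
    spikes = np.zeros((timesteps,) + image.shape, dtype=np.uint8)
    for t in range(timesteps):
        v = v * decay + image       # leak, then integrate pixel "current"
        fired = v >= threshold
        spikes[t][fired] = 1        # emit a spike where threshold crossed
        v[fired] = 0.0              # reset fired neurons
    return spikes

# Brighter pixels fire more often, so the mean spike rate tracks intensity.
img = np.array([[0.0, 0.2], [0.5, 1.0]])
train = lif_encode(img, timesteps=50)
rates = train.mean(axis=0)          # per-pixel firing rate in [0, 1]
```

Because each pixel is reduced to a sparse binary spike train, downstream layers can operate on accumulate-only (rather than multiply-accumulate) arithmetic, which is where the computational savings on mobile devices would come from.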