The error-backpropagation (EBP) algorithm for training multilayer perceptrons (MLPs) is known for its robustness and computational efficiency. However, the algorithm offers no principled way to select an optimal constant learning rate, which leads to suboptimal learning speed and inflexible behavior on working data. To remedy this, this paper introduces into the original EBP algorithm an elastic learning rate that guarantees convergence of learning, realized locally through online updates of the MLP parameters. Experimental results on a speaker verification system with a Korean speech database are presented and discussed, demonstrating that the proposed method improves on the original EBP algorithm in both learning speed and flexibility for working data.
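To make the general idea concrete, the following is a minimal sketch of online error-backpropagation for a one-hidden-layer MLP whose global learning rate adapts "elastically" from epoch to epoch. The grow/shrink rule (increase the rate after an error decrease, reduce it after an increase), the adaptation constants, the rate bounds, and the toy logical-OR task are all illustrative assumptions for this sketch, not the paper's actual derivation.

```python
import numpy as np

# Hedged sketch (not the paper's exact method): online backpropagation for a
# 2-4-1 sigmoid MLP with an adaptive learning rate that grows after an epoch
# whose summed error decreased and shrinks after one where it increased.
# The constants (0.5, 1.05, 0.7) and bounds are illustrative assumptions.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])  # logical OR as a toy stand-in target

W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=4);      b2 = 0.0

eta, grow, shrink = 0.5, 1.05, 0.7      # assumed adaptation constants
prev_err, errors = np.inf, []

for epoch in range(3000):
    err = 0.0
    for i in rng.permutation(len(X)):   # online (per-pattern) updates
        h = sigmoid(X[i] @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        e = out - y[i]
        err += 0.5 * e * e
        d_out = e * out * (1.0 - out)   # delta at the sigmoid output unit
        d_h = d_out * W2 * h * (1.0 - h)
        W2 -= eta * d_out * h
        b2 -= eta * d_out
        W1 -= eta * np.outer(X[i], d_h)
        b1 -= eta * d_h
    errors.append(err)
    # Elastic rate adjustment, bounded to keep the updates stable.
    eta = min(eta * grow, 1.0) if err < prev_err else max(eta * shrink, 1e-4)
    prev_err = err
```

Because the rate is adjusted from the observed error rather than fixed in advance, the loop needs no hand-tuned constant learning rate, which is the gap in the original EBP algorithm that the abstract identifies.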