For an artificial neural network to perform well, an appropriate internal architecture must be designed, which demands considerable time and expertise. Methods that automate neural network structure design have therefore been proposed, and they have discovered architectures that outperform human-designed ones. However, these methods consume vast computational resources because of the huge search space and the repeated training of candidate networks. In this thesis, we propose a method that searches for neural network architectures efficiently by constructing the search space systematically through network transformations and Bayesian optimization. Under a limited budget of network evaluations, our method finds better convolutional neural network architectures than competing methods.
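The overall idea of transformation-based search guided by Bayesian optimization can be sketched as follows. This is a toy illustration, not the thesis's actual algorithm: architectures are simplified to lists of layer widths, `evaluate` is a stand-in for real training, and the nearest-neighbour acquisition function is a crude substitute for a Gaussian-process-based one; all function names here are hypothetical.

```python
import math
import random

def transform(arch):
    """Apply a random network transformation (toy version):
    widen one layer, or deepen by inserting a new layer."""
    arch = list(arch)
    if random.random() < 0.5 and len(arch) < 8:
        arch.insert(random.randrange(len(arch) + 1), 16)  # deepen
    else:
        i = random.randrange(len(arch))
        arch[i] = min(arch[i] * 2, 256)                   # widen
    return arch

def evaluate(arch):
    """Stand-in for training a candidate network: rewards
    capacity but penalizes parameter count."""
    return sum(math.log(w) for w in arch) - 0.01 * sum(arch)

def acquisition(arch, history, beta=1.0):
    """Toy acquisition: score of the nearest evaluated architecture
    plus an exploration bonus growing with distance to it
    (a crude stand-in for a GP-based upper confidence bound)."""
    def dist(a, b):
        return abs(len(a) - len(b)) + abs(sum(a) - sum(b)) / 64.0
    d, y = min((dist(arch, a), s) for a, s in history)
    return y + beta * d

def search(n_evals=20, n_candidates=10):
    """Bayesian-optimization-style loop: propose candidates by
    transforming the best architecture found so far, but spend the
    expensive evaluation only on the acquisition function's pick."""
    best = [16, 16]
    history = [(best, evaluate(best))]
    for _ in range(n_evals - 1):
        cands = [transform(best) for _ in range(n_candidates)]
        pick = max(cands, key=lambda a: acquisition(a, history))
        score = evaluate(pick)
        if score > max(s for _, s in history):
            best = pick
        history.append((pick, score))
    return max(history, key=lambda t: t[1])
```

Because each candidate starts from a transformation of an already good network rather than from scratch, the loop explores a structured neighbourhood of the search space, and the acquisition step keeps the number of full evaluations small.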