Deep learning has become a major research area in recent years. As the name implies, computing a deep neural network requires an enormous amount of computation and memory, so managing the network structure is an essential issue in deep learning. One approach to this problem is to prune the trained network. Recent studies exploit the strong influence of a weight's absolute value on performance. In this work, a method complementary to the existing one is introduced: a new measure based on Hebbian learning is defined and combined, via a weighted sum, with the absolute value of the weight. Owing to the complementary nature of the Hebbian measure, the proposed method shows improved performance. Through various experiments, small insights into what makes a weight 'important' are provided.
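
The idea can be sketched as follows. This is a minimal illustration, not the paper's exact method: the Hebbian score (mean co-activation of the units a weight connects), the mixing coefficient `alpha`, and the function name `prune_mask` are all assumptions for the sake of the example; the paper's own definitions and weighting may differ.

```python
import numpy as np

def prune_mask(w, pre_act, post_act, alpha=0.5, sparsity=0.5):
    """Sketch: importance = weighted sum of |w| and a Hebbian-style score.

    w        : (n_in, n_out) weight matrix of one layer
    pre_act  : (batch, n_in) layer inputs recorded on sample data
    post_act : (batch, n_out) layer outputs on the same data
    alpha    : mixing weight between the two measures (assumed)
    sparsity : fraction of weights to prune
    """
    # Magnitude term: larger |w| means more important (existing criterion).
    mag = np.abs(w)
    # Hebbian-style term: weights connecting strongly co-active units
    # score higher ("units that fire together, wire together").
    hebb = np.abs(pre_act).mean(0)[:, None] * np.abs(post_act).mean(0)[None, :]
    # Normalize both terms so the weighted sum is scale-free.
    mag = mag / mag.max()
    hebb = hebb / hebb.max()
    score = alpha * mag + (1.0 - alpha) * hebb
    # Keep the top (1 - sparsity) fraction of weights.
    return score > np.quantile(score, sparsity)

# Usage: prune half the weights of a small random layer.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4))
x = rng.normal(size=(32, 8))
y = np.tanh(x @ w)
mask = prune_mask(w, x, y, alpha=0.5, sparsity=0.5)
```

The weighted sum lets the two measures compensate for each other: a small-magnitude weight between frequently co-active units can survive pruning, which is the complementary effect the abstract refers to.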