Conventional neural networks are built without regard to the underlying structure of the problems they solve. As a result, they often contain redundant weights and require excessive training time. A novel neural network structure is proposed for symmetric problems, which alleviates some of these drawbacks. The concept is then extended to the constrained neural network, which can be applied to general structured problems. Because such networks cannot be trained by conventional training algorithms, which destroy their weight structure, a suitable training algorithm is suggested. Illustrative examples demonstrate the applicability of the proposed idea.
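As a minimal sketch of the idea (not the paper's actual architecture or algorithm), consider a two-input network in which each hidden unit's two input weights are tied, so the output is invariant under swapping the inputs by construction; the gradient with respect to the shared weight sums both inputs' contributions, so the update preserves the constraint. All names, sizes, and the XOR target below are illustrative assumptions.

```python
import numpy as np

# Sketch: symmetric 2-input network with tied input weights (w_j1 == w_j2),
# so the output is invariant under swapping x1 and x2 by construction.
rng = np.random.default_rng(0)
H = 4                                  # hidden units (illustrative choice)
w = rng.normal(0, 1, H)                # shared input weight per hidden unit
b = np.zeros(H)                        # hidden biases
v = rng.normal(0, 1, H)                # hidden-to-output weights
c = 0.0                                # output bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x1, x2):
    # Tied weights mean each hidden unit depends only on x1 + x2,
    # which enforces the symmetry constraint structurally.
    h = sigmoid(w * (x1 + x2) + b)
    return sigmoid(v @ h + c), h

# Train on XOR, a symmetric target function, with plain gradient descent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([0, 1, 1, 0], float)
lr = 2.0
for _ in range(5000):
    for (x1, x2), t in zip(X, T):
        y, h = forward(x1, x2)
        d_out = (y - t) * y * (1 - y)          # squared-error output delta
        d_h = d_out * v * h * (1 - h)          # backprop to hidden layer
        v -= lr * d_out * h
        c -= lr * d_out
        # The gradient for the tied weight sums both inputs' contributions
        # (factor x1 + x2), so w_j1 == w_j2 is preserved by every update.
        w -= lr * d_h * (x1 + x2)
        b -= lr * d_h

print(forward(0.3, 0.8)[0], forward(0.8, 0.3)[0])  # identical by symmetry
```

The point of the sketch is that an unconstrained update to separate weights w_j1 and w_j2 would break the symmetry, which is why a structure-preserving training rule is needed.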