The main characteristics of human hand gestures can be summarized by their dynamic, multi-attribute nature. To use hand gestures as a means of interaction, it is necessary to analyze the motion patterns of each gesture attribute and then to extract the whole interpretation by integrating the relevant factors across time. Previous research has demonstrated the recognition of local aspects of hand gestures, but a global framework for deriving the whole interpretation from these local aspects has yet to be provided. In this article, we propose a colored Petri net model for the high-level description of hand gestures. This model intercommunicates with simultaneous low-level recognizers and thus finds a whole interpretation for the gesture.
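The integration idea can be sketched with a toy colored Petri net. The following is a minimal illustration under assumed names (`ColoredPetriNet`, the `shape`/`motion` places, and the `wave` guard are all hypothetical, not the paper's model): places receive "colored" tokens from low-level recognizers, and a transition fires when all its input places are marked and its guard accepts the combined colors, emitting an interpretation token.

```python
from dataclasses import dataclass

@dataclass
class Transition:
    inputs: list    # names of input places
    output: str     # name of the output place
    guard: callable # maps a tuple of token colors -> interpretation color, or None

class ColoredPetriNet:
    """Toy colored Petri net: places hold lists of colored tokens."""

    def __init__(self):
        self.places = {}  # place name -> list of tokens (colors)

    def put(self, place, color):
        self.places.setdefault(place, []).append(color)

    def fire(self, t: Transition):
        # A transition is enabled only if every input place holds a token.
        if not all(self.places.get(p) for p in t.inputs):
            return None
        colors = tuple(self.places[p][0] for p in t.inputs)
        result = t.guard(colors)
        if result is None:
            return None
        for p in t.inputs:          # consume one token per input place
            self.places[p].pop(0)
        self.put(t.output, result)  # produce the interpretation token
        return result

# Usage: combine hand-shape and motion attributes into one interpretation.
net = ColoredPetriNet()
net.put("shape", "open_palm")   # token from a hypothetical shape recognizer
net.put("motion", "leftward")   # token from a hypothetical motion recognizer
wave = Transition(
    inputs=["shape", "motion"],
    output="gesture",
    guard=lambda c: "wave" if c == ("open_palm", "leftward") else None,
)
result = net.fire(wave)
print(result)  # -> wave
```

The guard function plays the role of the high-level description: it decides which combinations of local recognizer outputs constitute a complete gesture, while the token flow handles the integration across attributes.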