Within the class of perceptual user interfaces (PUIs), which provide the computer with perceptive capabilities, computer vision is increasingly exploited as a new modality or as a replacement for standard interaction paradigms. One step toward realizing PUIs is the creation of view-based interface (VBI) projects, which combine several image processing algorithms to handle user tracking, face tracking/pose estimation, 3D articulated body tracking, and appearance-based gesture recognition. From this viewpoint, we have studied an on-line static and dynamic gesture recognition technique that uses several image processing and recognition modeling algorithms.
In previous work, the main target of static gesture recognition has been hand posture recognition based on sign language processing for deaf people. In this study, however, we develop arm/body posture recognition techniques, because most people use arm/body postures more than hand postures when talking with each other.
In this dissertation, we have carried out the following three phases for the recognition of static gestures. The first phase defines a spotting algorithm that detects the start and end positions within a series of natural arm/body motions. The second phase develops an effective feature extraction method for recognizing various gestures. The third phase builds a recognition model based on a knowledge-based system.
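As a rough illustration of the first phase, the sketch below spots gesture segments by thresholding a per-frame motion-energy signal; the threshold, minimum segment length, and the use of motion energy itself are illustrative assumptions, not the dissertation's actual spotting algorithm.

```python
def spot_gesture(motion_energy, threshold=0.2, min_len=5):
    """Return (start, end) frame-index pairs where the motion energy stays
    above `threshold` for at least `min_len` consecutive frames.

    A simple stand-in for detecting the start and end positions of a
    gesture within a series of natural arm/body motions; parameter
    values are illustrative.
    """
    segments, start = [], None
    for i, e in enumerate(motion_energy):
        if e >= threshold and start is None:
            start = i                       # gesture candidate begins
        elif e < threshold and start is not None:
            if i - start >= min_len:        # long enough to be meaningful
                segments.append((start, i - 1))
            start = None
    # close a segment that runs to the end of the signal
    if start is not None and len(motion_energy) - start >= min_len:
        segments.append((start, len(motion_energy) - 1))
    return segments
```

For example, a signal that is quiet, active for six frames, then quiet again yields a single spotted segment covering the active frames.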
In the preprocessing stage for dynamic gesture signals, our approach consists of three procedures: hand localization, hand tracking, and gesture spotting. The hand localization procedure detects hand candidate regions on the basis of skin color and hand motion. The hand tracking algorithm finds the centroids of the moving hand regions, connects them, and produces hand trajectory coordinates. Finally, the gesture spotting algorithm divides the trajectory into meaningful and meaningless segments.
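The first two preprocessing procedures can be sketched as follows. This is a minimal NumPy illustration, not the dissertation's implementation: the HSV skin-color thresholds are assumed example values, and motion cues are omitted for brevity.

```python
import numpy as np

def localize_hand(frame_hsv, lower=(0, 40, 60), upper=(25, 255, 255)):
    """Return a binary mask of skin-colored pixels (hand candidate regions).

    `frame_hsv` is an HxWx3 array in HSV color space; the threshold
    values are illustrative, not the ones used in the dissertation.
    """
    lo, hi = np.array(lower), np.array(upper)
    return np.all((frame_hsv >= lo) & (frame_hsv <= hi), axis=-1)

def track_centroids(masks):
    """Connect per-frame mask centroids into a hand trajectory.

    Each element of `masks` is a binary mask from localize_hand(); the
    result is a list of (x, y) centroid coordinates, one per frame in
    which a candidate region was found.
    """
    trajectory = []
    for mask in masks:
        ys, xs = np.nonzero(mask)
        if xs.size == 0:        # no candidate region in this frame
            continue
        trajectory.append((float(xs.mean()), float(ys.mean())))
    return trajectory
```

A real system would typically add connected-component filtering and motion differencing to reject skin-colored background regions before taking centroids.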
Many dynamic gesture recognition methods have been proposed: syntactical analysis, neural networks, ...