Real-time animation of human-like characters is an active research area in computer graphics. Conventional approaches, however, have rarely dealt with the rhythm and energy of motion, both of which are essential for generating rhythmic motions whose styles match those of background sound signals. In this thesis, we present a novel scheme for synthesizing a new rhythmic motion for an input sound signal from unlabelled example motions. Our scheme first captures the motion beats and the variations of music energy from the example motions and their accompanying sound signals. We extract the basic movements and their transitions, and then train their music-energy observation probabilities. From these data, our scheme constructs a movement transition graph that represents the example motions. Given an input sound signal, our scheme then synthesizes a novel motion in an online manner by traversing the movement transition graph; the resulting motion is synchronized with the input sound signal and satisfies kinematic constraints given both explicitly and implicitly. The resulting motion also exhibits a style that properly matches that of the sound signal. Through experiments, we demonstrate that our scheme can effectively produce a variety of rhythmic motions.
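The beat-synchronized traversal described above can be illustrated with a minimal sketch. Everything here is hypothetical: the toy graph, the movement names, the quantized energy levels, and the greedy selection rule are illustrative stand-ins for the thesis's trained movement transition graph and music-energy observation probabilities, not the actual method.

```python
# Hypothetical sketch: each node is a basic movement with a trained
# music-energy observation distribution; at each detected music beat,
# we transition to the successor movement that best matches the
# current music-energy level. All data below are illustrative.

# Toy movement transition graph: movement -> allowed successor movements.
GRAPH = {
    "step": ["step", "spin", "jump"],
    "spin": ["step", "jump"],
    "jump": ["step", "spin"],
}

# Stand-in observation probabilities P(energy_level | movement),
# with music energy quantized into "low", "mid", "high".
OBS_PROB = {
    "step": {"low": 0.6, "mid": 0.3, "high": 0.1},
    "spin": {"low": 0.2, "mid": 0.5, "high": 0.3},
    "jump": {"low": 0.1, "mid": 0.3, "high": 0.6},
}

def traverse(energy_per_beat, start="step"):
    """Online traversal: at each beat, greedily pick the successor
    movement with maximum observation probability for the current
    music-energy level (ties broken by graph order)."""
    motion = [start]
    current = start
    for energy in energy_per_beat:
        successors = GRAPH[current]
        current = max(successors, key=lambda m: OBS_PROB[m][energy])
        motion.append(current)
    return motion

if __name__ == "__main__":
    # One movement is emitted per detected beat of the input signal.
    print(traverse(["low", "mid", "high", "high", "low"]))
```

In the full scheme, such a traversal would additionally enforce the explicit and implicit kinematic constraints and blend consecutive movements; this sketch only shows the graph walk driven by per-beat music energy.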