In character animation, the automated synthesis of human motion is an important problem. While locomotion synthesis has made great progress, generating interaction motions with objects remains a challenge: the types and shapes of objects are highly diverse, and the motion space in a complex environment is high-dimensional and nonlinear. This doctoral dissertation studies two approaches to automatically generating multi-contact interaction motions with objects. First, given motion data, we introduce a retargeting method that generates a new motion by applying the geometric relationship between the given character and object to other objects of similar shape. Next, we present a method that synthesizes an interaction motion without any motion data, by planning the key intermediate poses that establish multiple contacts. In particular, we represent the relationship between a character and an object with geometric features, such as relative distance and orientation, and generate the motion by computing joint trajectories in a complex environment. As a result, we can synthesize interaction motions in real time, and we believe these methods will improve the production of animation content.
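To make the notion of geometric features concrete, the following is a minimal sketch of how a relative distance and orientation between a character joint and an object might be computed. The function name, inputs, and frame conventions are illustrative assumptions, not the dissertation's actual formulation.

```python
import numpy as np

def relative_features(joint_pos, obj_pos, obj_rot):
    """Hypothetical geometric features relating a character joint to an object.

    joint_pos : (3,) joint position in world coordinates
    obj_pos   : (3,) object origin in world coordinates
    obj_rot   : (3, 3) rotation matrix of the object's local frame (assumed)
    Returns the relative distance and the unit direction toward the joint,
    both expressed in the object's local frame.
    """
    # Express the joint position relative to the object, in the object's frame
    local = obj_rot.T @ (joint_pos - obj_pos)
    # Relative distance: how far the joint is from the object origin
    dist = np.linalg.norm(local)
    # Relative orientation: unit direction from object to joint
    direction = local / dist if dist > 0 else np.zeros(3)
    return dist, direction

# Example: a hand joint near an axis-aligned object at the origin
d, u = relative_features(np.array([0.3, 0.4, 0.0]),
                         np.array([0.0, 0.0, 0.0]),
                         np.eye(3))
```

Such features are invariant to the object's world-space pose, which is what allows a relationship learned or authored on one object to be transferred to another object of similar shape.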