This doctoral dissertation presents a novel approach for the expressive editing of 3D articulated character motion. The proposed approach solves for the motion given a set of projective constraints that relate the sketch inputs to the unknown 3D poses. The dissertation introduces the concept of sketch space, a contextual geometric representation of sketch targets (motion properties that are editable via sketch input) that highlights different aspects of the motion directly in the viewport. The combination of the proposed sketch targets and sketch space allows for seamless editing of a wide range of properties, from simple joint trajectories to local parent-child spatiotemporal relationships and more abstract properties such as coordinated motions. This is made possible by interpreting the user's input in a uniform way through a new sketch-based optimization engine. In addition, the proposed view-dependent sketch space disambiguates the user's inputs by visualizing their range of effect and by transparently defining the constraints that set the temporal boundaries for the optimization.
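As a minimal illustration of the kind of projective constraint described above (the camera setup, weights, and function names here are hypothetical assumptions for the sketch, not the dissertation's actual engine), the following example solves for a 3D point whose screen-space reprojection matches a sketched 2D target, with a soft regularizer keeping the solution close to the original pose:

```python
import numpy as np
from scipy.optimize import least_squares

def project(P, x):
    """Project a 3D point x (shape (3,)) to 2D with a 3x4 camera matrix P."""
    h = P @ np.append(x, 1.0)      # homogeneous projection
    return h[:2] / h[2]            # perspective divide

def residuals(x, P, sketch_2d, x_ref, w=1e-3):
    # Projective constraint: the unknown 3D point should reproject onto
    # the sketched 2D target. A small weight w pulls the solution toward
    # the reference pose so the underdetermined problem stays well posed.
    return np.concatenate([project(P, x) - sketch_2d,
                           w * (x - x_ref)])

# Hypothetical camera: identity rotation, translated back along z.
P = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])
sketch_2d = np.array([0.1, -0.2])  # a sketched screen-space target

x0 = np.zeros(3)                   # original (reference) 3D position
sol = least_squares(residuals, x0, args=(P, sketch_2d, x0))
# sol.x now reprojects onto the sketched target, up to the regularizer.
```

Stacking one such residual block per sketch target and per frame yields a single nonlinear least-squares problem over the unknown poses, which is one common way such sketch-driven optimizations are posed.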