Jianyuan Min

Synthesis and Editing of Personalized Stylistic
Human Motion

ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, 2010

[download paper] [download video]
This paper presents a generative human motion model for the synthesis, retargeting, and editing of personalized human motion styles. We first record a human motion database from multiple actors performing a wide variety of motion styles for particular actions. We then apply multilinear analysis techniques to construct a generative motion model of the form x = g(a, e) for particular human actions, where the parameters a and e control the "identity" and "style" variations of the motion x, respectively. This modular representation naturally supports motion generalization to new actors and/or styles. We demonstrate the power and flexibility of the multilinear motion models by synthesizing personalized stylistic human motion and by transferring stylistic motions from one actor to another. We also show the effectiveness of our model by editing stylistic motion in style and/or identity space.
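The factored form x = g(a, e) can be illustrated with a toy two-mode tensor decomposition. The sketch below is not the paper's implementation; it uses a higher-order SVD on a hypothetical motion tensor (actors × styles × motion features, with made-up shapes) to recover orthonormal identity and style factors, then synthesizes a motion vector by contracting an identity row a and a style row e with the core tensor:

```python
import numpy as np

# Hypothetical toy data: a motion tensor D of shape
# (num_actors, num_styles, motion_dim), standing in for the paper's
# motion database. All names and shapes here are illustrative.
rng = np.random.default_rng(0)
num_actors, num_styles, motion_dim = 4, 3, 50
D = rng.standard_normal((num_actors, num_styles, motion_dim))

def mode_unfold(T, mode):
    """Unfold tensor T along `mode` into a (T.shape[mode], -1) matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# Higher-order SVD: orthonormal factor matrices for the actor
# ("identity") mode and the style mode.
U_actor, _, _ = np.linalg.svd(mode_unfold(D, 0), full_matrices=False)
U_style, _, _ = np.linalg.svd(mode_unfold(D, 1), full_matrices=False)

# Core tensor C: contract the data with the factor matrices.
C = np.einsum('ase,ai,sj->ije', D, U_actor, U_style)

# Generative model x = g(a, e): choose identity parameters a and style
# parameters e, then contract them with the core to produce a motion.
a = U_actor[2]   # identity parameters of actor 2
e = U_style[1]   # style parameters of style 1
x = np.einsum('ije,i,j->e', C, a, e)

# With full-rank factors the model reproduces the database entry exactly;
# mixing rows (or interpolating between them) yields new combinations.
print(np.allclose(x, D[2, 1]))
```

Because a and e live in separate spaces, swapping in a different actor's row a while holding e fixed is a minimal analogue of the style-transfer and identity/style editing operations described above.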