@MISC{_on-linemotion,
  author = {Sang Il Park and Hyun Joon Shin and Tae Hoon Kim and Sung Yong Shin},
  title = {On-line motion blending for real-time locomotion generation},
  year = {2004}
}
Abstract
In this paper, we present an integrated framework of on-line motion blending for locomotion generation. We first provide a novel scheme for incremental timewarping that guarantees time always moves forward. Combining the idea of motion blending with that of posture rearrangement, we introduce a motion transition graph to address on-line motion blending and transition simultaneously. Guided by a stream of motion specifications, our motion synthesis scheme moves from node to node in an on-line manner, blending a motion at a node and generating a transition motion at an edge. For smooth on-line motion transition, we also attach a set of example transition motions to each edge. To represent similar postures consistently, we exploit the inter-frame coherency embedded in the input motion specification. Finally, we provide a comprehensive solution to on-line motion retargeting by integrating existing techniques. Copyright © 2004 John Wiley & Sons, Ltd.

KEY WORDS: computer animation; example-based motion synthesis; motion blending; locomotion generation
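The abstract's two central mechanisms, an incremental timewarp that never lets time run backward and a motion transition graph traversed on-line with blending at nodes and example transitions on edges, can be pictured with a small sketch. The Python below is only an illustrative assumption of those ideas, not the paper's implementation; every name (MotionTransitionGraph, Node, Edge, advance_time) and the simple clamping rule are hypothetical.

```python
# Illustrative sketch only; not the authors' implementation.
# All class/function names and the clamping rule are assumptions.
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple


def advance_time(t_prev: float, t_target: float, eps: float = 1e-3) -> float:
    """Incremental timewarping constraint: the warped time may never move
    backward, so each new sample is clamped to be at least `eps` ahead."""
    return max(t_target, t_prev + eps)


@dataclass
class Node:
    name: str                 # e.g. "walk" or "run" (hypothetical labels)
    example_clips: List[str]  # example motions blended at this node


@dataclass
class Edge:
    transition_clips: List[str]  # example transition motions attached to this edge


@dataclass
class MotionTransitionGraph:
    nodes: Dict[str, Node] = field(default_factory=dict)
    edges: Dict[Tuple[str, str], Edge] = field(default_factory=dict)

    def add_node(self, name: str, clips: List[str]) -> None:
        self.nodes[name] = Node(name, clips)

    def add_edge(self, src: str, dst: str, clips: List[str]) -> None:
        self.edges[(src, dst)] = Edge(clips)

    def step(self, current: str, target: str) -> Tuple[str, Optional[Edge]]:
        """Move one step toward `target`: either stay and keep blending at the
        current node, or cross an edge and return its example transitions."""
        if current == target:
            return current, None
        edge = self.edges.get((current, target))
        if edge is None:
            raise KeyError(f"no transition from {current!r} to {target!r}")
        return target, edge


if __name__ == "__main__":
    g = MotionTransitionGraph()
    g.add_node("walk", ["walk_slow", "walk_fast"])
    g.add_node("run", ["run_slow", "run_fast"])
    g.add_edge("walk", "run", ["walk_to_run"])

    state, edge = g.step("walk", "run")
    print(state, edge.transition_clips)  # run ['walk_to_run']
    print(advance_time(1.00, 0.95))      # 1.001: time never goes backward
```

In this reading, an on-line controller would consume the stream of motion specifications one frame at a time, call something like step() to decide whether to keep blending at the current node or to play an edge's example transition, and use the monotone timewarp when aligning the blended example clips.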