MiG '16: Proceedings of the 2016 ACM SIGGRAPH Conference on Motion in Games.
Abstract
We present novel techniques for interactively editing the motion of
an animated character by gesturing with a mobile device. Our approach
is based on the notion that humans are generally able to convey motion
using simple and abstract mappings from their own movement to that
of an animated character. We first explore the feasibility of extracting
robust sensor data with sufficiently rich features and low noise, such that
the signal is predictably representative of a user's illustrative manipulation
of the mobile device. In particular, we find that the linear velocity and
device orientation computed from the motion sensor data are well-suited to
the task of interactive character control. We show that these signals can be
used for two different methods of interactively editing the locomotion of an
animated human figure: discrete gestures for editing single motions, and
continuous gestures for editing ongoing motions. We illustrate these techniques
with several types of motion edits that affect jumps, strides, and turns.
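As a rough illustration of how a linear-velocity signal of this kind might be derived from mobile motion sensors, the Python sketch below rotates raw accelerometer samples into the world frame using a per-sample orientation estimate, removes gravity, and integrates the result. This is not the authors' implementation; the sample rate, the sign convention for the accelerometer readings, and the leaky-integration constant are all assumptions introduced for the example.

    import numpy as np

    GRAVITY = np.array([0.0, 0.0, 9.81])   # world-frame gravity, m/s^2 (assumed z-up)
    DT = 1.0 / 100.0                        # assumed 100 Hz sensor rate
    LEAK = 0.98                             # leaky integration to bound drift (assumed constant)

    def linear_velocity(accel_device, device_to_world):
        """Estimate world-frame linear velocity from accelerometer samples.

        accel_device:    (N, 3) accelerometer readings in the device frame
                         (assumed to read +gravity when the device is at rest)
        device_to_world: (N, 3, 3) rotation matrices giving device orientation per sample
        returns:         (N, 3) estimated linear velocity in the world frame
        """
        accel_device = np.asarray(accel_device, dtype=float)
        velocity = np.zeros(3)
        out = np.empty_like(accel_device)
        for i, (a_dev, R) in enumerate(zip(accel_device, device_to_world)):
            a_world = R @ a_dev - GRAVITY              # rotate to world frame, remove gravity
            velocity = LEAK * velocity + a_world * DT  # leaky integration limits drift
            out[i] = velocity
        return out

    if __name__ == "__main__":
        # Stationary device with identity orientation: the velocity estimate stays near zero.
        n = 200
        accel = np.tile(GRAVITY, (n, 1))
        orient = np.tile(np.eye(3), (n, 1, 1))
        print(np.abs(linear_velocity(accel, orient)).max())  # ~0

The leaky integration is only one simple way to keep accelerometer drift from accumulating in the velocity estimate; a practical system would more likely rely on the platform's sensor-fusion output for both the gravity-free acceleration and the orientation.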