Rig Animation with a Tangible and Modular Input Device

Authors: Oliver Glauser, Wan-Chun Ma, Daniele Panozzo, Alec Jacobson, Otmar Hilliges, Olga Sorkine-Hornung
Publication: To appear in Proceedings of ACM SIGGRAPH, Anaheim, CA, USA, July 2016

Teaser-Picture

Taking as input a rigged 3D character with many degrees of freedom, we propose a method to automatically compute assembly instructions for a modular tangible controller consisting of only a small set of joints. A novel hardware joint parametrization provides a user experience akin to inverse kinematics. After assembly, the device is bound to the rig and enables animators to traverse a large space of poses via fluid manipulations. Here we control 110 bones of the dragon character with only 8 physical joints and 2 splitters; detailed pose nuances are preserved by a real-time pose interpolation strategy.



Abstract

We propose a novel approach to digital character animation, combining the benefits of tangible input devices and sophisticated rig animation algorithms. A symbiotic software and hardware approach facilitates the animation process for novice and expert users alike. We overcome limitations inherent to all previous tangible devices by allowing users to directly control complex rigs using only a small set (5-10) of physical controls. This avoids oversimplification of the pose space and excessively bulky device configurations. Our algorithm derives a small device configuration from complex character rigs, often containing hundreds of degrees of freedom, and a set of sparse sample poses. Importantly, only the most influential degrees of freedom are controlled directly, yet detailed motion is preserved based on a pose interpolation technique. We designed a modular collection of joints and splitters, which can be assembled to represent a wide variety of skeletons. Each joint piece combines a universal joint and two twisting elements, allowing the device to accurately sense its configuration. The mechanical design provides a smooth inverse kinematics-like user experience and is not prone to gimbal lock. We integrate our method with the professional 3D software Autodesk Maya® and discuss a variety of results created with characters available online. Comparative user experiments show significant improvements over the closest state of the art in terms of accuracy and time in a keyframe posing task.
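The abstract describes preserving detailed motion by interpolating sparse sample poses while only a few degrees of freedom are driven directly. As a rough illustration of this idea (not the paper's actual algorithm), the following hypothetical sketch infers the full rig pose from the directly controlled DOFs by inverse-distance weighting of the sample poses in the controlled-DOF subspace; all function and variable names are invented for this example.

```python
import numpy as np

def interpolate_pose(controlled, sample_controlled, sample_full, eps=1e-8):
    """Estimate a full rig pose from a few directly controlled DOFs.

    Hypothetical stand-in for the paper's pose interpolation:
    weights each sample pose by inverse squared distance to the
    current device reading in the controlled-DOF subspace.
    """
    d = np.linalg.norm(sample_controlled - controlled, axis=1)
    if np.any(d < eps):            # device matches a sample pose exactly
        return sample_full[np.argmin(d)]
    w = 1.0 / d**2
    w /= w.sum()                   # normalize weights to sum to 1
    return w @ sample_full         # blend the full-DOF sample poses

# toy example: 2 controlled DOFs, 5 full-rig DOFs, 3 sample poses
sample_controlled = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
sample_full = np.array([[0, 0, 0.0, 0.0, 0],
                        [1, 0, 0.5, 0.0, 0],
                        [0, 1, 0.0, 0.5, 0]], dtype=float)
pose = interpolate_pose(np.array([0.5, 0.5]), sample_controlled, sample_full)
```

In this toy setup, uncontrolled rig DOFs (columns 3-5) follow the controlled ones through the sample data, which is the intuition behind recovering pose nuances from sparse physical input.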



Video



System overview

Sys-Overview-Picture

Illustration of our pipeline from input character to fluid tangible animation using an optimized device configuration. The horse has 29 bones, controlled by 8 joints.



Citation

@article{Glauser:2016:ATMID,
	author = {Glauser, Oliver and Ma, Wan-Chun and Panozzo, Daniele and Jacobson, Alec and Hilliges, Otmar and Sorkine-Hornung, Olga},
	title = {{Rig Animation with a Tangible and Modular Input Device}},
	journal = {ACM Transactions on Graphics (Proceedings of ACM SIGGRAPH)},
	year = {2016},
}

Gallery


Gallery-Picture

Depending on the available kit, our algorithm generates device assembly plans of varying complexity. Note that the models have many more degrees of freedom than the generated control structures. The inputs were (no. bones / no. sample poses): Horse (29/25, galloping, going up); Dragon (110/12, flying, some walking); Scorpion (62/20, walking, attacking); Dancer (22/6). Note that the device for the Dancer is asymmetric due to the asymmetry in the input poses: the character's left arm moves almost rigidly with the torso, so no joint is needed to control it.


Acknowledgments

We are grateful to Cédric Pradalier and Evgeni Sorkine for invaluable discussions and engineering support, to Sebastian Schoellhammer for his assistance on 3D modeling and rigging in Maya, to Olga Diamanti for composing the accompanying video, to Cécile Edwards-Rietmann for narrating it and to Jeannine Wymann for her help in assembling the prototypes. We also thank our user study participants. This work was supported in part by the SNF grant 200021_162958 and the ERC grant iModel (StG-2012-306877). Alec Jacobson is funded in part by NSF grants IIS-14-09286 and IIS-17257.