Dressing is one of the most common activities in human society. Perfecting the skill of dressing can take an average child three to four years of daily practice. The challenge is primarily due to the combined difficulty of coordinating different body parts and manipulating soft and deformable objects (clothes). We present a technique to synthesize human dressing by controlling a human character to put on an article of simulated clothing. We identify a set of primitive actions which account for the vast majority of motions observed in human dressing. These primitive actions can be assembled into a variety of motion sequences for dressing different garments with different styles. Exploiting both feed-forward and feedback control mechanisms, we develop a dressing controller to handle each of the primitive actions. The controller plans a path to achieve the action goal while making constant adjustments locally based on the current state of the simulated cloth when necessary. We demonstrate that our framework is versatile and able to animate dressing with different clothing types, including a jacket, a pair of shorts, a robe, and a vest. Our controller is also robust to different cloth mesh resolutions, which can cause the cloth simulator to generate significantly different cloth motions. In addition, we show that the same controller can be extended to assistive dressing.
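To make the controller architecture described above concrete, here is a minimal Python sketch of how primitive dressing actions might be sequenced, combining a feed-forward planned path with per-step feedback corrections driven by the simulated cloth state. This is an illustrative sketch only, not the paper's implementation: all names (PrimitiveAction, ClothState, run_dressing_sequence, step_cloth) are hypothetical, and the cloth simulator is assumed to be exposed as a single step function.

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence, Tuple

Vec3 = Tuple[float, float, float]  # stand-in for a proper 3D vector type


@dataclass
class ClothState:
    """Minimal stand-in for simulator output: tracked points on the garment."""
    feature_points: Sequence[Vec3]


@dataclass
class PrimitiveAction:
    """One primitive dressing action (e.g. grip, tuck, drag).

    plan_path -- feed-forward: waypoints toward the action goal
    feedback  -- local correction of a waypoint given the current cloth state
    done      -- termination test for the action
    """
    name: str
    plan_path: Callable[[ClothState], List[Vec3]]
    feedback: Callable[[Vec3, ClothState], Vec3]
    done: Callable[[ClothState], bool]


def run_dressing_sequence(
    actions: Sequence[PrimitiveAction],
    cloth: ClothState,
    step_cloth: Callable[[Vec3, ClothState], ClothState],
) -> ClothState:
    """Execute primitive actions in order, adjusting each planned waypoint
    from the current cloth state before advancing the simulation."""
    for action in actions:
        path = action.plan_path(cloth)          # feed-forward plan
        for waypoint in path:
            if action.done(cloth):              # goal reached early
                break
            target = action.feedback(waypoint, cloth)  # feedback adjustment
            cloth = step_cloth(target, cloth)          # advance simulation
    return cloth


if __name__ == "__main__":
    # Smoke test with stub functions in place of a real planner and simulator.
    grip = PrimitiveAction(
        name="grip",
        plan_path=lambda c: [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)],
        feedback=lambda w, c: w,      # no correction in this stub
        done=lambda c: False,
    )
    final = run_dressing_sequence([grip], ClothState(feature_points=[]),
                                  step_cloth=lambda target, c: c)
    print("sequence finished;", len(final.feature_points), "tracked points")
```

The key design point the sketch tries to capture is the separation of concerns: the feed-forward planner sets the overall trajectory for each primitive action, while the feedback term makes only local adjustments from the observed cloth state, so the same action can be reused across garments and mesh resolutions.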
Figure: A character puts on a jacket.
This material is based in part upon work supported by the National Science Foundation under grants XYZ. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.