The full text of this item is not available at this time because the student has placed this item under an embargo for a period of time. The Libraries are not authorized to provide a copy of this work during the embargo period, even for Texas A&M users with NetID.
Intuitive Generation of Realistic Motions for Articulated Human Characters
dc.contributor.advisor | Chai, Jinxiang
dc.creator | Min, Jianyuan
dc.date.accessioned | 2013-10-02T21:27:05Z
dc.date.available | 2015-05-01T05:57:08Z
dc.date.created | 2013-05
dc.date.issued | 2013-01-15
dc.date.submitted | May 2013
dc.identifier.uri | https://hdl.handle.net/1969.1/149245
dc.description.abstract | A long-standing goal in computer graphics is to create and control realistic motion for virtual human characters. Despite the progress made over the last decade, it remains challenging to design a system that allows a novice user to intuitively create and control lifelike human motions. This dissertation explores theory, algorithms, and applications that enable novice users to quickly and easily create and control natural-looking motions, including both full-body movement and hand articulations, for human characters. More specifically, the goals of this research are: (1) to investigate generative statistical models and physics-based dynamic models that precisely predict how humans move, and (2) to demonstrate the utility of our motion models in a wide range of applications, including motion analysis, synthesis, editing, and acquisition. We have developed two novel generative statistical models from prerecorded motion data and shown their promising applications in real-time motion editing, online motion control, offline animation design, and motion data processing. In addition, we have explored how to model subtle contact phenomena for dexterous hand grasping and manipulation using physics-based dynamic models. We show for the first time how to capture physically realistic hand manipulation data from ambiguous image data obtained by video cameras. | en
dc.format.mimetype | application/pdf
dc.subject | Graphics | en
dc.subject | Animation | en
dc.subject | Human motion synthesis | en
dc.subject | Motion generation | en
dc.subject | Motion analysis | en
dc.subject | Motion control | en
dc.title | Intuitive Generation of Realistic Motions for Articulated Human Characters | en
dc.type | Thesis | en
thesis.degree.department | Computer Science and Engineering | en
thesis.degree.discipline | Computer Science | en
thesis.degree.grantor | Texas A&M University | en
thesis.degree.name | Doctor of Philosophy | en
thesis.degree.level | Doctoral | en
dc.contributor.committeeMember | Keyser, John
dc.contributor.committeeMember | Schaefer, Scott
dc.contributor.committeeMember | Hurtado, John E.
dc.type.material | text | en
dc.date.updated | 2013-10-02T21:27:05Z
local.embargo.terms | 2015-05-01
This item appears in the following Collection(s):
- Electronic Theses, Dissertations, and Records of Study (2002– )
- Texas A&M University Theses, Dissertations, and Records of Study (2002– )