Show simple item record

dc.creator    Jones, Christopher V    en_US
dc.date.accessioned    2013-02-22T20:41:22Z
dc.date.available    2013-02-22T20:41:22Z
dc.date.created    1999    en_US
dc.date.issued    2013-02-22
dc.identifier.uri    http://hdl.handle.net/1969.1/ETD-TAMU-1999-Fellows-Thesis-J65    en_US
dc.description    Due to the character of the original source materials and the nature of batch digitization, quality control issues may be present in this document. Please report any quality issues you encounter to digital@library.tamu.edu, referencing the URI of the item.    en_US
dc.description    Includes bibliographical references (leaves 19-21).    en_US
dc.description.abstract    Automatic motion planning has applications in many areas such as robotics, virtual reality systems, and computer-aided design. Although many different motion planning methods have been proposed, most are not used in practice because they are computationally infeasible except in restricted cases, e.g., when the robot has very few degrees of freedom (dof). For this reason, attention has focused on randomized or probabilistic motion planning methods. When many motion planning queries will be performed in the same environment, it may be useful to pre-process the environment to decrease the difficulty of the subsequent queries. Examples are the roadmap motion planning methods, which build a graph encoding representative feasible paths (usually in the robot's configuration space, the parametric space representing all possible positions and orientations of the robot in the workspace). Indeed, several probabilistic roadmap methods (PRMs), including our group's obstacle-based PRM, have recently been used to solve many difficult planning problems involving high-dimensional C-spaces that could not be solved before. However, if the start and goal configurations are known a priori and only one (or a very few) queries will be performed in a single environment, then it is generally not worthwhile to perform an expensive preprocessing stage, particularly if there are time constraints as in animation or virtual reality applications. In this case, a more directed search of the free configuration space is needed (as opposed to roadmap methods, which are designed to cover the entire free space). Motion planning methods that operate in this fashion are often called 'single shot' methods. In our current work, we are developing an adaptive framework for single shot motion planning (i.e., planning without preprocessing). This framework can be used in any situation and, in particular, is suitable for crowded environments in which the robot's free C-space has narrow corridors. The main idea of our framework is to adaptively select a planner whose strengths match the current situation, and then switch to a different planner when circumstances change. This approach requires that we develop a set of planners and characterize the strengths and weaknesses of each planner in such a way that we can easily select the best planner for the current situation. Our experimental results show that adaptive selection of different planning methods enables the algorithms to be used in a cooperative manner to successfully solve queries that none of them would be able to solve on their own.    en_US
dc.format.medium    electronic    en_US
dc.format.mimetype    application/pdf    en_US
dc.language.iso    en_US    en_US
dc.publisher    Texas A&M University    en_US
dc.rights    This thesis was part of a retrospective digitization project authorized by the Texas A&M University Libraries in 2008. Copyright remains vested with the author(s). It is the user's responsibility to secure permission from the copyright holder(s) for re-use of the work beyond the provision of Fair Use.    en_US
dc.subject    engineering I.    en_US
dc.subject    Major engineering I.    en_US
dc.title    An adaptive framework for 'single shot' motion planning    en_US
thesis.degree.department    engineering I    en_US
thesis.degree.discipline    engineering I    en_US
thesis.degree.name    Fellows Thesis    en_US
thesis.degree.level    Undergraduate    en_US
dc.type.genre    thesis    en_US
dc.type.material    text    en_US
dc.format.digitalOrigin    reformatted digital    en_US
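
Aside for readers of the abstract above: the core idea of the framework is to adaptively choose among several planners and to switch when the current one stops making progress. The sketch below is only a minimal illustration of that idea, not the thesis's code. The 2-D point robot, the circular obstacles, the two toy planners, and the priority-ordered fall-back rule for switching between them are all assumptions made for this example.

import math
import random

def collides(p, obstacles):
    # Point-in-disc test against a list of (center, radius) circular obstacles.
    return any(math.dist(p, c) < r for c, r in obstacles)

def straight_line_planner(q, goal, obstacles, step=0.05):
    # Greedy planner: take one step directly toward the goal.
    # Strong in open space; stalls (returns None) when an obstacle blocks the way.
    d = math.dist(q, goal)
    if d <= step:
        return goal
    nxt = tuple(qi + step * (gi - qi) / d for qi, gi in zip(q, goal))
    return nxt if not collides(nxt, obstacles) else None

def random_walk_planner(q, goal, obstacles, step=0.05, tries=20):
    # Escape planner: take one random collision-free step in any direction.
    # Weak at making progress toward the goal, but good at leaving local traps.
    for _ in range(tries):
        a = random.uniform(0.0, 2.0 * math.pi)
        nxt = (q[0] + step * math.cos(a), q[1] + step * math.sin(a))
        if not collides(nxt, obstacles):
            return nxt
    return None

def adaptive_single_shot(start, goal, obstacles, max_steps=20000, tol=0.05):
    # At every step, use the first planner in the list that can make progress
    # from the current configuration, falling back to the next one otherwise.
    planners = [straight_line_planner, random_walk_planner]
    q, path = start, [start]
    for _ in range(max_steps):
        if math.dist(q, goal) <= tol:
            return path + [goal]
        nxt = None
        for plan in planners:
            nxt = plan(q, goal, obstacles)
            if nxt is not None:
                break
        if nxt is None:
            return None        # every planner is stuck at this configuration
        q = nxt
        path.append(q)
    return None                # step budget exhausted

if __name__ == "__main__":
    obstacles = [((0.5, 0.5), 0.2)]   # one circular obstacle on the direct route
    path = adaptive_single_shot((0.1, 0.1), (0.9, 0.9), obstacles)
    print("solved" if path else "failed", len(path or []), "configurations")

In the framework described in the abstract, the planner set and the way each planner's strengths and weaknesses are characterized and matched to the current situation are, of course, richer than this simple fall-back rule.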

