Show simple item record

dc.contributor.advisor	Pagilla, Prabhakar R
dc.creator	Khurana, Riya
dc.date.accessioned	2021-05-12T19:57:26Z
dc.date.available	2022-12-01T08:18:57Z
dc.date.created	2020-12
dc.date.issued	2020-12-07
dc.date.submitted	December 2020
dc.identifier.uri	https://hdl.handle.net/1969.1/193041
dc.description.abstract	Human-robot collaboration is often required in unstructured, uncertain, and dynamic environments where full automation is not ideal. Human operators are needed for their ability to perform complex tasks using their situational awareness and decision-making capabilities. Collaboration between the human and the robot is necessary to achieve a higher level of safety and performance in such cases. An effective shared control system provides the human an intuitive interface for robot control and intelligent assistance to improve overall task performance. In this work, we present a Human-Robot Collaborative Control framework for inspection and material handling tasks that employs computer vision and a general-purpose joystick for providing human input to the robot remotely. To facilitate the human operator, an intuitive joystick control interface is developed that allows the operator to command the robot in the end-effector frame. This is combined with vision-based motion planning algorithms that assist the operator in completing the task. The human operator controls the robot until the object is detected by the vision node, after which automatic control takes over. Hue-Saturation-Value (HSV) based OpenCV contour detection algorithms are used for object detection and pose estimation. The ROS-integrated open-source software MoveIt is utilized for motion planning. For the joystick interface, we present a hybrid control law that allows the operator to provide orientation/torque references in the world frame and translation/force references in the robot end-effector frame, while automatic control handles the underlying kinematics and joint-level control. A physical platform and a simulation environment consisting of a six degrees-of-freedom UR5 robot, a general-purpose joystick, a USB camera, a vacuum gripper, a force-torque sensor, and a fixed-speed conveyor belt are employed to develop and test the approach.	en
dc.format.mimetype	application/pdf
dc.language.iso	en
dc.subject	Collaborative Control, Computer Vision, Motion Planning, Joystick Control	en
dc.title	Human-Robot Collaborative Control for Inspection and Material Handling using Computer Vision and Joystick	en
dc.type	Thesis	en
thesis.degree.department	Mechanical Engineering	en
thesis.degree.discipline	Mechanical Engineering	en
thesis.degree.grantor	Texas A&M University	en
thesis.degree.name	Master of Science	en
thesis.degree.level	Masters	en
dc.contributor.committeeMember	Darbha, Swaroop
dc.contributor.committeeMember	Song, Dezhen
dc.type.material	text	en
dc.date.updated	2021-05-12T19:57:27Z
local.embargo.terms	2022-12-01
local.etdauthor.orcid	0000-0001-7751-4737
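The abstract above mentions HSV-based OpenCV contour detection for object detection and pose estimation. The sketch below is not the thesis code; it is a minimal illustration of that kind of pipeline, where the function name, the example HSV bounds, and the use of a minimum-area rotated rectangle to estimate the in-plane pose are assumptions made for demonstration only.

import cv2
import numpy as np

def detect_object_pose(bgr_image, hsv_lower, hsv_upper):
    """Detect the largest object within an HSV color range and estimate
    its image-plane pose (center and rotation angle)."""
    # Convert to HSV, where a color range is easier to threshold than in BGR.
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)

    # Binary mask of pixels inside the HSV range, cleaned up with an opening.
    mask = cv2.inRange(hsv, np.array(hsv_lower), np.array(hsv_upper))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # External contours of the masked regions.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None

    # Assume the largest contour is the target object.
    target = max(contours, key=cv2.contourArea)

    # Minimum-area rotated rectangle gives the center (pixels) and angle (degrees).
    (cx, cy), (w, h), angle = cv2.minAreaRect(target)
    return {"center_px": (cx, cy), "size_px": (w, h), "angle_deg": angle}

# Hypothetical usage with a camera frame and red-ish HSV bounds:
# frame = cv2.imread("frame.png")
# pose = detect_object_pose(frame, (0, 120, 70), (10, 255, 255))

In a full system such as the one the abstract describes, the pixel-space result would still need to be mapped into a robot-frame pose (e.g., via camera calibration and the known camera mounting) before it could be handed to a motion planner such as MoveIt; that step is omitted here.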

