Show simple item record

dc.contributor.advisor  Lee, Kiju
dc.creator  Jarecki, Annalisa Joy
dc.date.accessioned  2023-09-19T19:08:56Z
dc.date.created  2023-05
dc.date.issued  2023-05-03
dc.date.submitted  May 2023
dc.identifier.uri  https://hdl.handle.net/1969.1/199180
dc.description.abstract  The primary goal of this thesis project is to develop a robotic and software platform for agricultural observation within a multi-robot system using a mixed reality user interface. The presented hardware consists of a small unmanned ground vehicle (UGV) equipped with a robotic arm adapted for these purposes, with an attached depth camera. The hardware is controlled by a Raspberry Pi 4 (RPi4) single-board computer (SBC), which distributes signals to the various motor controllers on the platform and receives input from the system sensors. The user provides input to the system through a Microsoft HoloLens head-mounted display (HMD) and gesture control, which is relayed to the RPi4. The user interface (UI) was designed in Unity 3D and uses ROS# for WebSocket communication on the HoloLens side. The system control was developed in Python and ROS2, using roslibpy and the rosbridge_server for WebSocket communication on the RPi4 side. An inverse kinematic joint solver using the Newton-Raphson root-finding method and a trajectory generation solver using quadratic splines were designed in Python and implemented on the RPi4. The resulting solved joint angles are sent over USB serial communication to the servo motor controller for the robotic arm and sent back to the HoloLens over WebSocket communication. Within the HoloLens UI, a holographic robotic arm is displayed alongside the live camera feed, showing the current position of the robotic arm in real time even when the physical platform is not visible. Furthermore, this thesis has laid the groundwork for future investigation into the benefits of a mixed reality (MR) UI. A user study has been planned to compare the system's performance against that of a standard computer interface: users will navigate the system around an unfamiliar environment to take pictures of various targets using each interface. The measured times and quantitatively assessed image quality will be used to compare the performance of the two interfaces and draw conclusions regarding MR UI usage in robotic system control. While Institutional Review Board (IRB) approval was not yet available to test the system with participants, further work will be completed beyond this thesis for future publication.
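The abstract's inverse kinematic solver uses Newton-Raphson root finding: starting from an initial joint guess, the joint angles are repeatedly corrected by the Jacobian-weighted end-effector error until the forward kinematics reach the target. A minimal sketch of that iteration for a hypothetical planar two-link arm (the thesis's actual arm geometry and solver code are not given in this record; the link lengths, function names, and tolerances below are illustrative assumptions):

```python
import numpy as np

# Hypothetical link lengths; the real platform's arm dimensions are not stated here.
L1, L2 = 0.30, 0.25

def forward_kinematics(q):
    """End-effector (x, y) of a planar 2-link arm for joint angles q = [q1, q2]."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def solve_ik(target, q0, tol=1e-6, max_iter=100):
    """Newton-Raphson IK: iterate q <- q + J(q)^-1 * (target - f(q))."""
    q = np.array(q0, dtype=float)
    for _ in range(max_iter):
        err = target - forward_kinematics(q)
        if np.linalg.norm(err) < tol:
            break
        # Analytic Jacobian of the planar 2-link forward kinematics
        s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
        c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
        J = np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                      [ L1 * c1 + L2 * c12,  L2 * c12]])
        q = q + np.linalg.solve(J, err)  # Newton step
    return q
```

On the real platform the converged joint angles would then be streamed to the servo controller over USB serial and echoed back to the HoloLens over WebSocket, as the abstract describes.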
dc.format.mimetype  application/pdf
dc.language.iso  en
dc.subject  agricultural robotics
dc.subject  HoloLens
dc.subject  robot control
dc.subject  robotic observation
dc.title  A Novel Adaptable Mixed Reality User Interface for Robotic Control in Agricultural Applications
dc.type  Thesis
thesis.degree.department  Mechanical Engineering
thesis.degree.discipline  Mechanical Engineering
thesis.degree.grantor  Texas A&M University
thesis.degree.name  Master of Science
thesis.degree.level  Masters
dc.contributor.committeeMember  Darbha, Swaroop
dc.contributor.committeeMember  Quek, Francis
dc.type.material  text
dc.date.updated  2023-09-19T19:08:57Z
local.embargo.terms  2025-05-01
local.embargo.lift  2025-05-01
local.etdauthor.orcid  0009-0009-7882-0600

