Show simple item record

dc.contributor.advisor: Kim, Won-Jong
dc.creator: Kang, Kyeongmo
dc.date.accessioned: 2023-10-12T14:04:49Z
dc.date.created: 2023-08
dc.date.issued: 2023-05-15
dc.date.submitted: August 2023
dc.identifier.uri: https://hdl.handle.net/1969.1/199893
dc.description.abstract: Today, security is one of the most important concerns everywhere, from individuals to the military. Various security devices have been developed to improve the level of security; one of the most popular is the surveillance camera. However, most surveillance cameras are mounted on ceilings, which causes blind spots and limits the ability to recognize people. This thesis proposes an indoor autonomous mobile surveillance robot that compensates for the disadvantages of fixed cameras and improves the security level. The surveillance robot was developed based on the Robot Operating System (ROS) and a Kinect sensor. Its control system was designed with proportional-integral (PI) and proportional-integral-derivative (PID) controllers based on its dynamic model. As key functions of the robot, Simultaneous Localization and Mapping (SLAM), autonomous navigation, face recognition, and human tracking were implemented using Gmapping, Real-Time Appearance-Based Mapping (RTAB-Map), the ROS navigation stack, OpenNI2/NiTE2, and Dlib's face recognition. In this thesis, experiments on velocity control, SLAM, autonomous navigation, face recognition, and human tracking were conducted. The velocity control system regulated the linear and angular velocities with small oscillation and no steady-state error. In SLAM, the robot successfully created indoor maps using 2D and 3D SLAM, which were accurate enough to be used for autonomous navigation. In autonomous navigation, the robot maintained the designated safety distance of 0.32 m or more from obstacles and moved toward the destination along the planned paths; the final position and orientation errors averaged 0.21 m and 0.14 rad. The robot's face recognition recognized a human face at a distance of 1.2–5.5 m. The robot's human tracking reliably detected a target at a distance of 1.0–4.2 m and maintained the desired position without tracking loss.
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: Autonomous navigation
dc.subject: Digital control
dc.subject: Face recognition
dc.subject: Human tracking
dc.subject: RGB-D sensor
dc.subject: ROS
dc.subject: SLAM
dc.subject: Surveillance robot
dc.title: Indoor Autonomous Mobile Surveillance Robot Based on Robot Operating System and Kinect Sensor
dc.type: Thesis
thesis.degree.department: Mechanical Engineering
thesis.degree.discipline: Mechanical Engineering
thesis.degree.grantor: Texas A&M University
thesis.degree.name: Master of Science
thesis.degree.level: Masters
dc.contributor.committeeMember: Rathinam, Sivakumar
dc.contributor.committeeMember: Song, Dezhen
dc.type.material: text
dc.date.updated: 2023-10-12T14:04:49Z
local.embargo.terms: 2025-08-01
local.embargo.lift: 2025-08-01
local.etdauthor.orcid: 0009-0008-2233-8340

