An Intelligent Human-Tracking Robot Based-on Kinect Sensor
This thesis presents an indoor human-tracking robot that can also control other electrical devices for the user. The overall experimental setup consists of a skid-steered mobile robot, a Kinect sensor, a laptop, a wide-angle camera, and two lamps. The Kinect sensor is mounted on the mobile robot, collects position and skeleton data of the user in real time, and sends the data to the laptop. The laptop processes these data and then sends commands to the robot and the lamps. The wide-angle camera is mounted on the ceiling to verify the tracking performance of the Kinect sensor. A C++ program runs the camera, and a Java program processes the data from the C++ program and the Kinect sensor and then sends commands to the robot and the lamps. The human-tracking capability is realized by two decoupled feedback controllers, one for linear motion and one for rotational motion. Experimental results show small delays (0.5 s for linear motion and 1.5 s for rotational motion) and steady-state errors (0.1 m for linear motion and 1.5° for rotational motion); these are acceptable because they do not drive the tracking distance or angle outside the desirable range (±0.05 m and ±10° of the reference input), and the tracking algorithm is robust. Four gestures are designed for the user to control the robot: two switch-mode gestures, a lamp-create gesture, and a lamp-selection and color-change gesture. Gesture-recognition success rates exceed 90% within the detectable range of the Kinect sensor.
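The two decoupled feedback controllers described above can be sketched as a pair of independent proportional loops, one mapping the distance error to a linear-velocity command and one mapping the bearing error to an angular-velocity command. This is only a minimal sketch in Java (the language of the thesis's processing program); the class name, gains, and reference set-points below are illustrative assumptions, not the thesis's actual implementation.

```java
// Hedged sketch of two decoupled feedback loops for human tracking:
// a proportional controller holds a reference following distance, and a
// second, independent proportional controller keeps the user centered in
// the sensor's field of view. All gains, set-points, and names are
// illustrative assumptions, not values from the thesis.
public class DecoupledTracker {
    // Assumed reference inputs: follow at 1.5 m, keep the user dead ahead.
    static final double REF_DISTANCE_M = 1.5;
    static final double REF_ANGLE_DEG  = 0.0;
    // Assumed proportional gains, to be tuned for a specific robot.
    static final double KP_LINEAR  = 0.8;   // (m/s) per meter of error
    static final double KP_ANGULAR = 0.05;  // (rad/s) per degree of error

    /** Linear-velocity command from the measured distance to the user. */
    static double linearCmd(double measuredDistanceM) {
        return KP_LINEAR * (measuredDistanceM - REF_DISTANCE_M);
    }

    /** Angular-velocity command from the measured bearing to the user. */
    static double angularCmd(double measuredAngleDeg) {
        return KP_ANGULAR * (measuredAngleDeg - REF_ANGLE_DEG);
    }

    public static void main(String[] args) {
        // User stands 2.0 m away and 10 degrees off-axis:
        // the robot should drive forward and turn toward the user.
        System.out.printf("v=%.2f m/s, w=%.2f rad/s%n",
                linearCmd(2.0), angularCmd(10.0));
    }
}
```

Because the two loops are decoupled, each can be tuned separately, which matches the thesis's separate delay and steady-state-error figures for linear and rotational motion.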
Chen, Jinfa (2015). An Intelligent Human-Tracking Robot Based-on Kinect Sensor. Master's thesis, Texas A&M University. Available electronically from