An Efficient Fusion Scheme for Human Hand Trajectory Reconstruction Using Inertial Measurement Unit and Kinect Camera
Abstract
The turn of the 21st century has witnessed an evolving trend in wearable-device research and improvements in human-computer interfaces. In such systems, the position of the human hand in 3-D space has become extremely important, as many applications require knowledge of the user's hand position. A promising example is a wearable ring that can naturally and ubiquitously reconstruct handwriting from the motion of the human hand in an indoor environment. A common approach exploits the portability and affordability of commercially available inertial measurement units (IMUs). However, these IMUs suffer from drift errors accumulated through double integration of acceleration readings. This process accrues intrinsic errors arising from the sensor's sensitivity, factory bias, thermal noise, etc., which result in large deviations from the ground-truth position over time. Other approaches utilize optical sensors for better position estimation, but these sensors suffer from occlusion and ambient lighting conditions. In this thesis, we first present techniques to calibrate the IMU, minimizing the undesired effects of the intrinsic imperfections residing within cheap MEMS sensors. We then introduce a Kalman filter-based fusion scheme incorporating data collected from the IMU and a Kinect camera, which is shown to overcome each sensor's disadvantages and improve the overall quality of the reconstructed trajectory of the human hand.
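The fusion idea summarized above can be illustrated with a minimal sketch: a 1-D Kalman filter whose prediction step double-integrates IMU acceleration (so the bias-induced drift is visible) and whose update step corrects the state with intermittent Kinect position fixes. This is not the thesis implementation; the accelerometer bias, noise levels, sample rates, and process/measurement covariances below are all assumed values chosen for demonstration.

```python
import numpy as np

dt = 0.01                      # assumed 100 Hz IMU sample period
n = 500                        # 5 seconds of motion
rng = np.random.default_rng(0)

# Ground-truth hand motion: a sinusoid along one axis (metres).
t = np.arange(n) * dt
true_pos = 0.1 * np.sin(2 * np.pi * 0.5 * t)
true_acc = -0.1 * (2 * np.pi * 0.5) ** 2 * np.sin(2 * np.pi * 0.5 * t)

# Simulated sensors: biased, noisy accelerometer; noisy Kinect position.
acc_meas = true_acc + 0.05 + 0.02 * rng.standard_normal(n)  # bias + noise
kinect_std = 0.005                                          # 5 mm std dev

# State x = [position, velocity]; acceleration enters as a control input.
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
B = np.array([0.5 * dt ** 2, dt])       # maps acceleration into the state
H = np.array([[1.0, 0.0]])              # Kinect observes position only
Q = np.diag([1e-6, 1e-4])               # process noise (assumed)
R = np.array([[kinect_std ** 2]])       # Kinect measurement noise

x = np.zeros(2)
P = np.eye(2) * 1e-3
dead_reck = np.zeros(2)                 # IMU-only double integration
fused_pos, dr_pos = [], []

for k in range(n):
    # Predict: propagate the state with the IMU acceleration reading.
    x = F @ x + B * acc_meas[k]
    P = F @ P @ F.T + Q
    dead_reck = F @ dead_reck + B * acc_meas[k]

    # Update: a Kinect fix arrives every 3rd IMU sample (~33 Hz).
    if k % 3 == 0:
        z = true_pos[k] + kinect_std * rng.standard_normal()
        y = z - H @ x                     # innovation
        S = H @ P @ H.T + R               # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P

    fused_pos.append(x[0])
    dr_pos.append(dead_reck[0])

fused_err = np.sqrt(np.mean((np.array(fused_pos) - true_pos) ** 2))
dr_err = np.sqrt(np.mean((np.array(dr_pos) - true_pos) ** 2))
```

With the assumed 0.05 m/s² accelerometer bias, the IMU-only estimate drifts on the order of half a metre over five seconds, while the fused estimate stays within a few millimetres of the truth, mirroring the behaviour the abstract claims for the combined scheme.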
Subject
wearable
trajectory reconstruction
inertial measurement unit
MEMS
IMU
Kinect
Human Computer Interaction
HCI
sensor fusion
Kalman filter
Citation
Le, Trung (2017). An Efficient Fusion Scheme for Human Hand Trajectory Reconstruction Using Inertial Measurement Unit and Kinect Camera. Undergraduate Research Scholars Program. Available electronically from https://hdl.handle.net/1969.1/177589.