dc.contributor.advisor: Jafari, Roozbeh
dc.creator: Wu, Jian
dc.date.accessioned: 2019-01-23T17:34:36Z
dc.date.available: 2019-01-23T17:34:36Z
dc.date.created: 2018-12
dc.date.issued: 2018-11-20
dc.date.submitted: December 2018
dc.identifier.uri: https://hdl.handle.net/1969.1/174385
dc.description.abstract: Activity and gesture recognition using wearable motion sensors, also known as inertial measurement units (IMUs), provides important context for many ubiquitous sensing applications, including healthcare monitoring, human-computer interfaces, and context-aware smart homes and offices. Such systems are gaining popularity due to their minimal cost and their ability to provide sensing functionality at any time and place. However, several factors can affect system performance, such as sensor location and orientation displacement, activity and gesture inconsistency, movement speed variation, and the lack of tiny-motion information. This research focuses on developing signal processing solutions that keep the system robust with respect to these factors. First, for existing systems that have already been designed to work with a particular sensor orientation and location, this research proposes opportunistic calibration algorithms that leverage camera information from the environment to ensure the system performs correctly despite location or orientation displacement of the sensors. The calibration algorithms require no extra effort from the users; calibration is performed seamlessly whenever the users are in front of an environmental camera and perform arbitrary movements. Second, an orientation-independent and speed-independent approach is proposed and studied by exploring a novel orientation-independent feature set and by intelligently selecting only the relevant and consistent portions of various activities and gestures. Third, to address the challenge that an IMU is not able to capture the tiny motions that are important to some applications, a sensor fusion framework is proposed that fuses complementary sensor modalities to enhance system performance and robustness. For example, American Sign Language has a large vocabulary of signs, and a recognition system based solely on IMU sensors would not perform well. To demonstrate the feasibility of sensor fusion techniques, a robust real-time American Sign Language recognition approach is developed using wrist-worn IMU and surface electromyography (EMG) sensors.
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: robust signal processing
dc.subject: inertial measurement unit
dc.subject: activity/gesture recognition
dc.subject: orientation independent
dc.subject: sensor fusion
dc.subject: sensor orientation calibration
dc.subject: sensor location calibration
dc.title: Robust Signal Processing Techniques for Wearable Inertial Measurement Unit (IMU) Sensors
dc.type: Thesis
thesis.degree.department: Computer Science and Engineering
thesis.degree.discipline: Computer Engineering
thesis.degree.grantor: Texas A&M University
thesis.degree.name: Doctor of Philosophy
thesis.degree.level: Doctoral
dc.contributor.committeeMember: Hammond, Tracy
dc.contributor.committeeMember: Park, Sung Il
dc.contributor.committeeMember: Stoleru, Radu
dc.type.material: text
dc.date.updated: 2019-01-23T17:34:36Z
local.etdauthor.orcid: 0000-0001-6150-9693

