
dc.contributor.advisor: Hammond, Tracy
dc.contributor.advisor: Maldonado, Theresa
dc.creator: Cherian, Josh A
dc.date.accessioned: 2017-08-21T14:47:10Z
dc.date.available: 2017-08-21T14:47:10Z
dc.date.created: 2017-05
dc.date.issued: 2017-05-08
dc.date.submitted: May 2017
dc.identifier.uri: https://hdl.handle.net/1969.1/161664
dc.description.abstract: Over the past several years, the use of wearable devices has increased dramatically, primarily for fitness monitoring, largely due to their greater sensor reliability, increased functionality, smaller size, increased ease of use, and greater affordability. These devices have helped many people of all ages live healthier lives and achieve their personal fitness goals, as they are able to see quantifiable and graphical results of their efforts every step of the way (i.e., in real time). Yet, while these device systems work well within the fitness domain, they have yet to achieve a convincing level of functionality in the larger domain of healthcare. As an example, according to the Alzheimer's Association, there are currently approximately 5.5 million Americans with Alzheimer's Disease and approximately 5.3 million of them are over the age of 65, comprising 10% of this age group in the U.S. The economic toll of this disease is estimated to be around $259 billion. By 2050 the number of Americans with Alzheimer's disease is predicted to reach around 16 million with an economic toll of over $1 trillion. There are other prevalent and chronic health conditions that are critically important to monitor, such as diabetes, complications from obesity, congestive heart failure, and chronic obstructive pulmonary disease (COPD), among others. The goal of this research is to explore and develop accurate and quantifiable sensing and machine learning techniques for eventual real-time health monitoring by wearable device systems. To that end, a two-tier recognition system is presented that is designed to identify health activities in a naturalistic setting based on accelerometer data of common activities. In Tier I a traditional activity recognition approach is employed to classify short windows of data, while in Tier II these classified windows are grouped to identify instances of a specific activity.

Everyday activities explored in this research include brushing one's teeth, combing one's hair, scratching one's chin, washing one's hands, taking medication, and drinking. Results show that an F-measure of 0.83 is achievable when distinguishing these activities from each other, and an F-measure of 0.82 is achievable when identifying instances of brushing teeth over the course of a day.
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: Activity Recognition
dc.subject: Brushing Teeth
dc.subject: Machine Learning
dc.subject: Health Monitoring
dc.subject: Wearable Devices
dc.title: Recognition of Everyday Activities through Wearable Sensors and Machine Learning
dc.type: Thesis
thesis.degree.department: Electrical and Computer Engineering
thesis.degree.discipline: Electrical Engineering
thesis.degree.grantor: Texas A & M University
thesis.degree.name: Master of Science
thesis.degree.level: Masters
dc.contributor.committeeMember: Goldberg, Daniel
dc.contributor.committeeMember: Georghiades, Costas
dc.type.material: text
dc.date.updated: 2017-08-21T14:47:10Z
local.etdauthor.orcid: 0000-0002-7749-2109
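
The two-tier recognition system described in the abstract can be sketched in outline: Tier I assigns an activity label to each short window of accelerometer data, and Tier II merges runs of identically labeled windows into activity instances. The sketch below is illustrative only; the window size, classifier, minimum run length, and all function names are assumptions, not the thesis's actual parameters or implementation.

```python
# Hypothetical sketch of a two-tier activity recognition pipeline.
# Tier I labels each windowed segment; Tier II groups consecutive
# identically labeled windows into activity instances.

def tier1_classify(windows, classify):
    """Tier I: label each short window of accelerometer data.

    `classify` stands in for a trained window-level classifier.
    """
    return [classify(w) for w in windows]

def tier2_group(labels, min_len=3):
    """Tier II: merge runs of identical labels into activity instances.

    Runs shorter than `min_len` windows are discarded as noise.
    Returns (label, start_index, end_index) tuples, end exclusive.
    """
    instances = []
    start = 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            if i - start >= min_len:
                instances.append((labels[start], start, i))
            start = i
    return instances
```

For example, a label sequence such as `["idle", "brush", "brush", "brush", "idle", "idle", "idle"]` would yield one "brush" instance spanning windows 1-4 and one "idle" instance spanning windows 4-7, with the isolated leading "idle" window dropped as noise.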



