Simple item record

dc.contributor.advisor: Hammond, Tracy
dc.creator: Koh, Jung In
dc.date.accessioned: 2018-02-05T16:47:29Z
dc.date.available: 2019-08-01T06:53:50Z
dc.date.created: 2017-08
dc.date.issued: 2017-07-27
dc.date.submitted: August 2017
dc.identifier.uri: https://hdl.handle.net/1969.1/165675
dc.description.abstract: Recent trends in computer-mediated communication (CMC) have not only led to expanded instant messaging through the use of images and videos, but have also expanded traditional text messaging with richer content: so-called visual communication markers (VCMs) such as emoticons, emojis, and stickers. VCMs could prevent a potential loss in CMC of the subtle emotional side of conversation, which is ordinarily delivered by nonverbal cues that convey affective and emotional information. However, as the number of VCMs in the selection set grows, the problem of VCM entry needs to be addressed. Additionally, conventional ways of accessing VCMs continue to rely on input entry methods that are not directly and intimately tied to expressive nonverbal cues. One form of expressive nonverbal communication that does exist and is well studied is the hand gesture. In this work, I propose a user-defined hand gesture set that is highly representative of VCMs, along with a two-stage hand gesture recognition system (trajectory-based, then shape-based) that distinguishes the user-defined hand gestures. While the trajectory-based recognizer distinguishes gestures based on the movements of the hands, the shape-based recognizer classifies gestures based on the shapes of the hands. The goal of this research is to allow users to be more immersed, natural, and quick in generating VCMs through gestures. The idea is for users to maintain the lower-bandwidth online communication of text messaging, largely retaining its convenient and discreet properties, while also incorporating the advantages of the higher-bandwidth online communication of video messaging by naturally gesturing their emotions, which are then closely mapped to VCMs. Results show that user-dependent recognition accuracy is approximately 86% and user-independent accuracy is about 82%. [en]
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: Hand gesture recognition [en]
dc.subject: CMC [en]
dc.subject: emojis [en]
dc.title: Developing a Hand Gesture Recognition System for Mapping Symbolic Hand Gestures to Analogous Emoji in Computer-mediated Communication [en]
dc.type: Thesis [en]
thesis.degree.department: Computer Science and Engineering [en]
thesis.degree.discipline: Computer Science [en]
thesis.degree.grantor: Texas A & M University [en]
thesis.degree.name: Master of Science [en]
thesis.degree.level: Masters [en]
dc.contributor.committeeMember: Choe, Yoonsuck
dc.contributor.committeeMember: Wallis, Cara
dc.type.material: text [en]
dc.date.updated: 2018-02-05T16:47:30Z
local.embargo.terms: 2019-08-01
local.etdauthor.orcid: 0000-0002-3909-0192
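
The abstract above describes a two-stage recognition pipeline: a trajectory-based recognizer for gestures distinguished by hand movement, and a shape-based recognizer for gestures distinguished by hand shape. Below is a minimal Python sketch of how such a pipeline could be wired together. It is illustrative only, not the thesis's implementation: every class, function, threshold, and feature here is a hypothetical stand-in (a $1-style template matcher for trajectories and a nearest-neighbor classifier over precomputed hand-shape features).

import numpy as np

def resample(points, n=32):
    # Resample a 2-D hand-centroid path to n evenly spaced points,
    # as $1-style trajectory recognizers do before template matching.
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, cum[-1], n)
    return np.column_stack([np.interp(t, cum, pts[:, i]) for i in range(2)])

def normalize(path):
    # Translate to the centroid and scale to unit size so matching is
    # invariant to where and how large the gesture was performed.
    p = path - path.mean(axis=0)
    scale = np.abs(p).max()
    return p / scale if scale > 0 else p

class TrajectoryRecognizer:
    # Stage 1: match the hand's movement path against stored templates.
    def __init__(self, templates):  # templates: {label: [(x, y), ...]}
        self.templates = {k: normalize(resample(v)) for k, v in templates.items()}

    def classify(self, path, threshold=0.25):
        q = normalize(resample(path))
        scores = {k: float(np.mean(np.linalg.norm(q - t, axis=1)))
                  for k, t in self.templates.items()}
        label = min(scores, key=scores.get)
        # Reject weak matches so static (shape-only) gestures fall through.
        return label if scores[label] < threshold else None

class ShapeRecognizer:
    # Stage 2: nearest neighbor over hand-shape feature vectors
    # (the features themselves are assumed to come from elsewhere).
    def __init__(self, examples):  # examples: {label: [feature_vector, ...]}
        self.examples = [(k, np.asarray(v, dtype=float))
                         for k, vs in examples.items() for v in vs]

    def classify(self, features):
        f = np.asarray(features, dtype=float)
        return min(self.examples, key=lambda kv: np.linalg.norm(kv[1] - f))[0]

def recognize(path, shape_features, trajectory_stage, shape_stage):
    # Two-stage dispatch: try the movement-based match first, then
    # fall back to hand-shape classification; the returned label would
    # then be mapped to its corresponding VCM (e.g., an emoji).
    label = trajectory_stage.classify(path)
    return label if label is not None else shape_stage.classify(shape_features)

In an actual system the templates, features, and thresholds would be trained and tuned on user data, as in the study the abstract reports (approximately 86% user-dependent and 82% user-independent accuracy).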

