
dc.creator: Ananthanarayan, Aashish
dc.date.accessioned: 2020-07-22T19:30:04Z
dc.date.available: 2020-07-22T19:30:04Z
dc.date.created: 2021-05
dc.date.submitted: May 2021
dc.identifier.uri: https://hdl.handle.net/1969.1/188381
dc.description.abstract: Neural networks play an important role in real-time object detection, and several network architectures are being developed to perform such detections at a faster pace. One such network is YOLO, which is built for real-time detection and offers high speed on simple detection tasks. The goal of our research is to see how YOLO works with body language: would it be fast enough, and how accurate would it be? Compared to other forms of object detection, body-language detection is more ambiguous, with several additional factors to account for. For this reason we first address hand recognition and gesture recognition, and then move on to body language. This research aims to understand how YOLO performs when subjected to several tests: using its existing implementations, building datasets, and training and testing models to determine whether it can successfully detect hand gestures and body language.
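For context, a minimal sketch of the kind of detection pipeline the abstract describes is shown below: loading a Darknet-format YOLO model with OpenCV's DNN module and running it on a single image. This is an illustrative example, not code from the thesis; the file names (yolov3-hands.cfg, yolov3-hands.weights, hands.names, sample.jpg) and the 0.5/0.4 thresholds are placeholder assumptions.

import cv2
import numpy as np

# Hypothetical model and label files for a hand/gesture detector.
net = cv2.dnn.readNetFromDarknet("yolov3-hands.cfg", "yolov3-hands.weights")
classes = open("hands.names").read().strip().split("\n")

img = cv2.imread("sample.jpg")
h, w = img.shape[:2]

# YOLO expects a fixed-size, normalized RGB blob (416x416 is a common choice).
blob = cv2.dnn.blobFromImage(img, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())

boxes, confidences, class_ids = [], [], []
for output in outputs:
    for detection in output:
        scores = detection[5:]
        class_id = int(np.argmax(scores))
        confidence = float(scores[class_id])
        if confidence > 0.5:
            # Box coordinates are relative to the image size; scale them back.
            cx, cy, bw, bh = detection[:4] * np.array([w, h, w, h])
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            confidences.append(confidence)
            class_ids.append(class_id)

# Non-maximum suppression removes overlapping duplicate boxes.
keep = cv2.dnn.NMSBoxes(boxes, confidences, 0.5, 0.4)
for i in np.array(keep).flatten():
    x, y, bw, bh = boxes[i]
    print(classes[class_ids[i]], confidences[i], (x, y, bw, bh))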
dc.format.mimetype: application/pdf
dc.subject: Neural networks
dc.subject: object detection
dc.subject: deep learning
dc.subject: YOLO
dc.subject: body language recognition
dc.title: Hand Detection and Body Language Recognition Using YOLO
dc.type: Thesis
thesis.degree.discipline: Computer Science
thesis.degree.grantor: Undergraduate Research Scholars Program
thesis.degree.name: B.S.
thesis.degree.level: Undergraduate
dc.contributor.committeeMember: Jiang, Anxiao
dc.type.material: text
dc.date.updated: 2020-07-22T19:30:04Z

