Chest Pain Detection in YouTube Videos
Abstract
This research addresses the detection of chest pain actions in YouTube videos. Chest pain detection is important in smart home applications; however, it is very challenging in YouTube videos due to the dissimilarities between such videos and the training set.
In this research, we implemented five promising network architectures for chest pain detection
and compared their performance. We proposed both frame detectors, which operate on a single
frame, and clip detectors, which operate on a sequence of frames. Both human skeleton data and
RGB information were extracted as input features for the models. We adopted a wide range of
network architectures for detection, including Inception-ResNet, a simple feed-forward network,
an RNN, Faster R-CNN, and I3D. The proposed network architectures were trained on NTU RGB+D, a
clip-wise-labeled dataset containing a wide range of human actions, including chest pain. We
implemented APIs of our detectors that feed the input videos to our trained models and visualize
the inference results by drawing bounding boxes and confidence scores directly on the input videos.
The performance of the detectors was evaluated on both the labeled dataset and the challenging
YouTube videos, and promising results were obtained. Finally, we explored temporal action
localization architectures and discussed the viability of training them on the current dataset.
Citation
Lu, Yipeng (2021). Chest Pain Detection in YouTube Videos. Master's thesis, Texas A&M University. Available electronically from https://hdl.handle.net/1969.1/196099.