dc.contributor.advisor: Behzadan, Amir H
dc.creator: Sakib, Md Nazmus
dc.date.accessioned: 2020-09-11T15:43:12Z
dc.date.available: 2021-12-01T08:43:50Z
dc.date.created: 2019-12
dc.date.issued: 2019-11-19
dc.date.submitted: December 2019
dc.identifier.uri: https://hdl.handle.net/1969.1/189173
dc.description.abstract: Unmanned aerial vehicles (UAVs), or drones, are being increasingly used in many fields, including the construction domain, for a variety of applications such as surveying, inspection, progress monitoring, surveillance, safety management, and mapping. While drones can add value by lowering the cost and improving the accuracy of data collection, flying drones in congested and constantly evolving environments can also be a precursor to accidents, injuries to site personnel, and damage to physical property, making it necessary to properly train drone operators prior to deploying drones in the field. Virtual reality (VR) simulation has been used for many years as an alternative to real-world training, with past research primarily focused on collision detection and accident prevention techniques aimed at preventing equipment-equipment and equipment-worker collisions. There has also been limited work on understanding, quantifying, and comparing the physiological state of heavy equipment operators. The central hypothesis of this work is that, for similar task complexity, a drone operator’s physiological state during a VR flight experiment closely resembles his/her physiological state in a real-world flight scenario. To test this hypothesis, a methodology is developed for collecting, annotating, and assessing drone operators’ physiological data using wearable devices in both real and virtual environments to determine and compare physiological states that can potentially lead to operator errors. In this research, different levels of task complexity are simulated in VR and replicated in an outdoor (OD) environment to collect participants’ data and analyze variations in physiological readings. Statistical analysis between VR and OD sessions shows no significant difference in participants’ physiological features (e.g., mean SCL, mean skin temperature, mean HR, RMSSD) or self-reported scores (e.g., NASA TLX, CARMA video self-feedback). This indicates that participants had similar experiences (as described by physiological state) while performing under the same levels of task complexity in the OD and simulated VR environments. Machine learning models for task prediction, performance prediction, and stress prediction are also introduced, trained on all physiological features and self-reported data. The task prediction model has an accuracy of 75%, and the performance and stress prediction models have prediction errors (i.e., RMSE) of 1.1421 and 0.7578, respectively. [en]
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: Drone operation [en]
dc.subject: Physiological state [en]
dc.subject: Performance assessment [en]
dc.subject: Machine learning [en]
dc.subject: Wearable technology [en]
dc.title: Wearable Technology to Assess the Effectiveness of Virtual Reality Training for Drone Operators [en]
dc.type: Thesis [en]
thesis.degree.department: Construction Science [en]
thesis.degree.discipline: Construction Management [en]
thesis.degree.grantor: Texas A&M University [en]
thesis.degree.name: Master of Science [en]
thesis.degree.level: Masters [en]
dc.contributor.committeeMember: Ahn, Changbum R
dc.contributor.committeeMember: Chaspari, Theodora
dc.type.material: text [en]
dc.date.updated: 2020-09-11T15:43:12Z
local.embargo.terms: 2021-12-01
local.etdauthor.orcid: 0000-0003-0390-792X
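
The abstract above reports that a statistical comparison of physiological features (e.g., mean HR, RMSSD) between paired VR and outdoor (OD) sessions showed no significant difference. The following is a minimal illustrative sketch, not the thesis code: it computes RMSSD from hypothetical inter-beat-interval recordings for a few participants and applies a paired t-test across the two conditions. All data values and variable names are assumptions made for illustration only.

# Minimal illustrative sketch (not the thesis code): compare a heart-rate
# variability feature (RMSSD) between paired VR and outdoor (OD) sessions.
# All inter-beat-interval values below are hypothetical placeholders.
import numpy as np
from scipy import stats

def rmssd(ibi_ms):
    """Root mean square of successive differences of inter-beat intervals (ms)."""
    diffs = np.diff(np.asarray(ibi_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical per-participant inter-beat intervals (ms) for one task-complexity level.
vr_sessions = [
    [810, 805, 820, 815, 800, 812],
    [760, 770, 765, 758, 772, 766],
    [900, 895, 905, 910, 898, 902],
]
od_sessions = [
    [812, 800, 818, 810, 805, 808],
    [755, 768, 770, 760, 766, 762],
    [898, 902, 908, 905, 900, 897],
]

vr_rmssd = np.array([rmssd(s) for s in vr_sessions])
od_rmssd = np.array([rmssd(s) for s in od_sessions])

# Paired t-test across participants; a p-value above the chosen alpha
# (e.g., 0.05) would be consistent with the "no significant difference"
# finding reported in the abstract.
t_stat, p_value = stats.ttest_rel(vr_rmssd, od_rmssd)
print(f"RMSSD (VR): {vr_rmssd.round(2)}")
print(f"RMSSD (OD): {od_rmssd.round(2)}")
print(f"paired t-test: t = {t_stat:.3f}, p = {p_value:.3f}")

The same paired design extends to the other features named in the abstract (mean SCL, mean skin temperature, mean HR, NASA TLX scores) by swapping in the corresponding per-participant summary values.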

