Wearable Technology to Assess the Effectiveness of Virtual Reality Training for Drone Operators
Abstract
Unmanned aerial vehicles (UAVs), or drones, are increasingly used in many fields, including the construction domain, for applications such as surveying, inspection, progress monitoring, surveillance, safety management, and mapping. While drones can add value by lowering the cost and improving the accuracy of data collection, flying drones in congested and constantly evolving environments can also lead to accidents, injuries to site personnel, and damage to physical property, making it necessary to properly train drone operators before deploying drones in the field. Virtual reality (VR) simulation has been used for many years as an alternative to real-world training, with past research primarily focused on collision detection and accident prevention techniques aimed at preventing equipment-equipment and equipment-worker contact collisions. There has also been limited work on understanding, quantifying, and comparing the physiological state of heavy equipment operators. The central hypothesis of this work is that, for similar task complexity, a drone operator's physiological state during a VR flight experiment closely resembles their physiological state in a real-world flight scenario. To test this hypothesis, a methodology is developed for collecting, annotating, and assessing drone operators' physiological data using wearable devices in both real and virtual environments to determine and compare physiological states that can potentially lead to operator errors. In this research, different levels of task complexity are simulated in VR and replicated in an outdoor (OD) environment to collect participants' data and analyze variations in physiological readings. Statistical analysis between the VR and OD sessions shows no significant difference in participants' physiological features (e.g., mean skin conductance level (SCL), mean skin temperature, mean heart rate (HR), RMSSD) or self-reported scores (e.g., NASA TLX, CARMA video self-feedback).
This indicates that participants had similar experiences (as characterized by physiological state) while performing under the same levels of task complexity in the OD and simulated VR environments. Machine learning models for task prediction, performance prediction, and stress prediction are also introduced, trained on all physiological features and self-reported data. The task prediction model achieves an accuracy of 75%, and the performance and stress prediction models have prediction errors (RMSE) of 1.1421 and 0.7578, respectively.
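The thesis text does not include code; as a rough illustration of the two analyses summarized above, the sketch below computes a paired-samples t statistic (comparing a VR and an OD physiological feature per participant) and the RMSE metric used to score the regression models. All data values and function names here are hypothetical, not taken from the thesis.

```python
import math

def paired_t_statistic(a, b):
    """t statistic for a paired-samples t-test (e.g., VR vs. OD mean HR)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the paired differences (n - 1 in the denominator)
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

def rmse(predicted, actual):
    """Root-mean-square error, the metric reported for the regression models."""
    n = len(predicted)
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / n)

# Hypothetical per-participant mean heart rate (bpm) in each environment
vr_hr = [72.1, 80.4, 68.9, 75.2, 77.8]
od_hr = [73.0, 79.5, 70.1, 74.8, 78.4]
t_stat = paired_t_statistic(vr_hr, od_hr)

# Hypothetical performance scores vs. model predictions
pred_scores = [6.0, 7.5, 5.2]
true_scores = [7.0, 6.8, 6.1]
model_rmse = rmse(pred_scores, true_scores)
```

In practice the thesis's "no significant difference" conclusion would follow from comparing the t statistic against the critical value for the chosen significance level; the point here is only the shape of the computation.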
Subject
Drone operation
Physiological state
Performance assessment
Machine learning
Wearable technology
Citation
Sakib, Md Nazmus (2019). Wearable Technology to Assess the Effectiveness of Virtual Reality Training for Drone Operators. Master's thesis, Texas A&M University. Available electronically from https://hdl.handle.net/1969.1/189173.