Evaluating Importance Ratings as an Alternative to Mental Models in Predicting Driving Crashes and Moving Violations
The present study investigated the extent to which importance ratings (i.e., a measure of the perceived importance of driving-related concepts) are a viable alternative to traditional mental model assessment methods in predicting driving performance. Although mental models may predict driving-related outcomes (crash involvement and moving violations), common mental model assessment techniques carry administrative limitations and challenges that can undermine their validity as assessments of knowledge structure. Importance ratings, a measure of driving-related knowledge that may involve fewer administrative limitations, were hypothesized to provide equal predictive validity for driving-related performance outcomes in a sample of undergraduate students. To investigate the extent to which mental models and importance ratings contribute to the prediction of driving crashes and moving violations, students completed Pathfinder, a common computer-based mental model assessment method, and a paper-and-pencil importance rating measure. In addition, students completed a test of driving knowledge and reported driving behaviors and outcomes, including at-fault crashes and moving violations, that occurred over the past five years (i.e., from 2005 to 2009). A group of expert drivers also completed the mental model and importance rating assessments. Data across expert raters were combined and analyzed for their appropriateness to serve as referent scores for each assessment. Students' mental model and importance rating accuracy were defined by the extent to which their mental models and ratings agreed with those of the expert drivers. The results suggest that both importance rating accuracy and mental model accuracy predicted crash involvement and moving violations.
Whereas mental model accuracy was the stronger predictor of the number of moving violations, importance rating accuracy predicted the number of at-fault crashes slightly better than mental model accuracy did. Although inconclusive, these results suggest that importance ratings may be a viable alternative to traditional mental model assessment for predicting some driving outcomes. Further research on importance ratings and other alternatives to mental model assessment is warranted.
McDonald, Jennifer Nicole (2011). Evaluating Importance Ratings as an Alternative to Mental Models in Predicting Driving Crashes and Moving Violations. Master's thesis, Texas A&M University.