Gaze Assisted Prediction of Task Difficulty Level and User Activities in an Intelligent Tutoring System (ITS)
Abstract
Efforts toward modernizing education are emphasizing the adoption of Intelligent Tutoring Systems (ITS) to complement conventional teaching methodologies. Intelligent tutoring systems empower instructors to make teaching more engaging by providing a platform to tutor, deliver learning material, and assess students’ progress. Despite these advantages, existing intelligent tutoring systems do not automatically assess how students engage in problem solving, how they perceive various activities while solving a problem, or how much time they spend on each discrete activity leading to the solution.
In this research, we present an eye tracking framework that can assess how eye movements manifest students’ perceived activities and overall engagement in a sketch-based intelligent tutoring system, “Mechanix.” Mechanix guides students in solving truss problems by supporting user-initiated feedback. Through an evaluation involving 21 participants, we show the potential of leveraging eye movement data to recognize students’ perceived activities, “reading, gazing at an image, and problem solving,” with an accuracy of 97.12%. We are also able to leverage the user gaze data to classify problems being solved by students as easy, medium, or hard with an accuracy of more than 80%. In this process, we also identify the key features of eye movement data, and discuss how and why these features vary across different activities.
Subject
eye tracking
Citation
Kaul, Purnendu (2016). Gaze Assisted Prediction of Task Difficulty Level and User Activities in an Intelligent Tutoring System (ITS). Master's thesis, Texas A&M University. Available electronically from https://hdl.handle.net/1969.1/187399.