Keywords: learning analytics, tutoring feedback system, online education
Computer-based tutoring systems have been extensively studied and shown to be generally effective. In this study, I worked with an online beginner's course that teaches spreadsheet basics to business students. I designed four sets of practice problems that used a new tutoring feedback system developed for this course. To test the effectiveness of this feedback system, 839 Brigham Young University students enrolled in the online course were randomly assigned either to a treatment condition, in which they worked through these new practice problems with the accompanying feedback system, or to a control condition, in which they worked through the problems without the feedback system. Data were collected on their performance on these problems, as well as on later assignments and a class midterm. ANOVAs showed that students in the treatment condition performed significantly better than students in the control condition on 17 of the 48 examined tasks (p < .05), with no significant difference on the other 31 tasks. These results indicate that the feedback system had a short-term benefit in many instances. Further research will be needed to determine whether additional modifications to the feedback system can extend these benefits to long-term outcomes.
BYU ScholarsArchive Citation
Staples, M. E. (2019). Personalized Feedback: Testing a Tutoring System That Was Informed by Learning Analytics. Unpublished masters project manuscript, Department of Instructional Psychology and Technology, Brigham Young University, Provo, Utah. Retrieved from https://scholarsarchive.byu.edu/ipt_projects/19