ISCAP Proceedings: Abstract Presentation
Exploring the Relationship Between Digital Engagement and Student Achievement: A Case Study Using Cengage MindTap Analytics
Wei Wei
University of Houston-Downtown
Shohreh Hashemi
University of Houston-Downtown
Shuaifu Lin
University of Houston-Downtown
Tahereh Jafari
University of Houston-Downtown
Abstract
In the digital transformation of higher education, learning analytics has emerged as a powerful approach to understanding student behavior and enhancing instructional practices. This study investigates how engagement data from Cengage MindTap—a widely adopted digital learning platform—can be used to predict academic performance in an entry-level Management Information Systems (MIS) course for undergraduate business students at a large urban university.
The course, offered in various formats (in-person, hybrid, synchronous online, and asynchronous online), integrates MindTap as a core instructional tool. Students engage with interactive activities such as concept videos, chapter quizzes, Excel and Access training, and exams, all of which contribute to their final course grade. Despite course coordination and digital support, student performance has historically been suboptimal, prompting a need to better understand the relationship between digital engagement and learning outcomes.
Using a dataset of anonymized student records collected over multiple semesters and across different teaching modalities, this research employs a combination of exploratory data analysis and machine learning modeling. Key engagement metrics from MindTap—such as time-on-task, number of logins, first login date, activity completion rates, and MindTap’s built-in engagement score—are analyzed in relation to students’ major assessment scores and final grades. Statistical analyses are conducted to determine whether student outcomes differ significantly across modalities and semesters, to ensure the comparability of pooled data.
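The modality-comparison step described above could be sketched as follows. This is an illustrative example only, not the study's actual analysis: the grade values are synthetic, and the choice of a Kruskal-Wallis test (which avoids a normality assumption) is this sketch's assumption rather than the authors' stated method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical final-grade samples for three delivery modalities.
grades = {
    "in_person": rng.normal(78, 10, 60),
    "hybrid": rng.normal(76, 11, 55),
    "async_online": rng.normal(74, 12, 70),
}

# Kruskal-Wallis tests whether the modality distributions differ;
# a non-significant result supports pooling the data across modalities.
h_stat, p_value = stats.kruskal(*grades.values())
pooling_defensible = p_value > 0.05
```

The same pattern would extend to comparisons across semesters, or to a one-way ANOVA if grade distributions are approximately normal.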
Several machine learning models are trained and evaluated to predict student performance, including logistic regression, decision trees, random forests, and support vector machines. These models are assessed using accuracy, precision, recall, and AUC metrics to identify which engagement factors are most predictive of student success and which models perform best in classifying at-risk students early in the term.
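A minimal sketch of the training-and-evaluation loop described above, using two of the named models. The engagement features and labels here are synthetic stand-ins for the MindTap metrics, and the specific hyperparameters are assumptions for illustration, not the study's pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 400
# Hypothetical engagement features: logins, time-on-task, completion rate.
X = np.column_stack([
    rng.poisson(20, n),        # number of logins
    rng.gamma(3.0, 4.0, n),    # time-on-task (hours)
    rng.uniform(0, 1, n),      # activity completion rate
])
# Synthetic pass/fail label loosely tied to engagement, plus noise.
y = ((0.02 * X[:, 0] + 0.05 * X[:, 1] + 2.0 * X[:, 2]
      + rng.normal(0, 0.8, n)) > 2.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0)

results = {}
for name, model in {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    prob = model.predict_proba(X_te)[:, 1]
    results[name] = {
        "accuracy": accuracy_score(y_te, pred),
        "precision": precision_score(y_te, pred),
        "recall": recall_score(y_te, pred),
        "auc": roc_auc_score(y_te, prob),
    }
```

In practice, feature importances from the tree-based models (or logistic-regression coefficients) would indicate which engagement factors are most predictive, and evaluation would use cross-validation rather than a single split.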
Findings indicate that certain patterns of digital engagement—especially early login activity, consistent time-on-task, and high activity completion—are significantly associated with better course outcomes. The machine learning models demonstrate promising accuracy in forecasting performance, offering potential for developing early intervention systems that identify struggling students before major assessments.
The implications of this work are threefold: (1) providing instructors with actionable insights to support timely pedagogical interventions; (2) informing curriculum design by identifying content or delivery areas that may benefit from restructuring; and (3) contributing to the broader discourse on how commercial digital platforms like MindTap can evolve to offer more granular and useful analytics to educators.
This study contributes to the growing body of literature on educational data mining and learning analytics in business education. It also highlights key limitations, including the absence of demographic and prior academic performance variables, and the non-causal nature of machine learning predictions. Future work will explore incorporating additional data sources and other machine learning methods. In addition, the actionable findings will be incorporated into a performance-monitoring dashboard to augment teaching and learning.