Instructional Design Summit - Part 2: The Great Learning Analytics Debate: Unpacking Promises, Pitfalls, and Progress from a Multi-Stakeholder Perspective

Leadership Equity and Inclusion

Brief Abstract

As learning analytics has become more prominent in mainstream learning management systems and adaptive learning platforms, debate about the topic continues. This multi-stakeholder debate session brings together faculty, researchers, developers, administrators, and designers to weigh the promises against the pitfalls of learning analytics amidst ongoing efforts to advance learning analytics platforms/tools, policies, and processes in higher education environments.

Presenters

Kiran Budhrani is the Director for Personalized and Adaptive Learning at the Center for Teaching and Learning at the University of North Carolina at Charlotte.
J. Garvey Pyke, Ed.D., is the Executive Director of the Center for Teaching at UNC Charlotte. As part of the leadership team for the School of Professional Studies, his work involves fueling enrollment growth at the university through online course development, creating high-impact student success programs using personalized and adaptive learning, promoting faculty success and scholarly teaching through innovative faculty development programs, and overseeing the provision and support of enterprise academic technologies. Garvey is also an alumnus of OLC's IELOL program (2010) and has remained an active member of this professional community of practice, serving as co-director of IELOL 2018 and as a faculty member of IELOL from 2019 to 2022. He has served on various conference committees for OLC Accelerate and on the Steering Committee for OLC Innovate.

Extended Abstract

Learning analytics is “the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purpose of understanding and optimizing learning and the environment in which it occurs” (Siemens and Long, 2011, p. 32). In simple terms, learning analytics focuses on capturing and using data to improve student learning and learning environments. As learning analytics has become more prominent in mainstream learning management systems and adaptive learning platforms, debate continues over the promises versus the pitfalls of learning analytics amidst ongoing efforts to advance learning analytics platforms/tools, policies, and processes in higher education environments.

Promises: Interest in the field is growing as stakeholders seek to optimize design processes and provide effective tools to support learners. Capturing data about students, their behaviors, and their engagement yields patterns that can deepen understanding of how students learn and improve learning processes. These data also help guide the design of instructional materials. In recent years, personalization of learning has been supported by capturing learner profiles and characteristics.

Pitfalls: There are issues to unpack in learning analytics. Data are often complex and heterogeneous, making them difficult for students, faculty, and other stakeholders to interpret and act on. For example, at the course level, the practice of translating data into actionable, just-in-time interventions is still uncommon (van Leeuwen, 2019). Although instructional designers see value in using learning analytics to inform design practices, they face challenges in comprehending complex data and applying it to the redesign process. There remains a risk that learning analytics tools may not be sustainable and that assumptions about their effectiveness may not match the needs of students, faculty, and designers (Dawson, Gasevic, & Mirriahi, 2018). One of the most prominent issues identified is whether learning analytics is racially, politically, or otherwise neutral in the way it discriminates among and labels students based on their performance (Johanes & Lagerstrom, 2017). Despite the promises learning analytics affords, pitfalls include the ‘gaming’ of learning systems and the limiting of students’ domains of knowledge and mastery states to system-driven concept maps and trajectories.

Progress: Amidst the debates over promises and pitfalls, strides in learning analytics are visible in common learning management systems and adaptive/personalized learning platforms. Dynamic, real-time assessment and feedback are available for students and instructors to view through dashboards. Recommender systems help students find answers to questions quickly. Learning analytics is also perceived to help instructors identify at-risk students and provide interventions. Intelligent systems report individual learning progress as well as a holistic view of classroom performance.

Goals of the Debate: There is a critical need to understand the bigger picture, identify best practices, and interrogate the ideals of learning analytics for optimizing learning and student success. This multi-stakeholder session brings together faculty, researchers, developers, administrators, and designers to debate the promises and pitfalls of learning analytics amidst ongoing efforts to advance learning analytics platforms/tools, policies, and processes in higher education environments.

Participation: We invite participants into a guided, interactive, debate-style conversation with visual facilitation through a digital whiteboard canvas to capture and share ideas from the session.