Note: I will use this space over the next month to share excerpts from my dissertation The Evolution & Impact of the Massive Open Online Course. The research was a Delphi study bringing together 20 MOOC experts to discuss the MOOC in educational, political, and sociocultural terms (slides from the oral presentation can be seen here). Upon library clearance, the entire document will be available through a Creative Commons license. The following is from Chapter 4, the results of the research study. This excerpt tackles the only prompt of 12 to gain consensus in Round 1 of the study, an agreement with the following: The data we gather from students utilizing MOOCs will help us solve student struggles in learning through redesigning the learning system and content modules.
Round 1: Consensus. Only one of the 12 prompts in Round 1 reached the consensus threshold of 75%: Prompt #3 (data), with experts agreeing with the contention that back-end data gathered from MOOCs would help solve learning struggles.
The positive view of data from a consensus majority of the expert panel may stem from the panel’s make-up. Although panelists were chosen from five distinct disciplines, it was their shared proximity to MOOCs and educational technology that established their expertise in the phenomenon. Panelists were bullish on back-end data in part because they were bullish on the overall confluence of education and technology. Participant E8 stated, “Computer based learning generally, and the whole innovation mindset as brought to teaching and learning, will transform the possibilities for learning research and teaching practice.” Participant E12 added, “The analytics provided by MOOCs (and other online learning) can provide a window into actual student performance – missing in most F2F and online learning today.”
Much of the commentary from experts revolved around the role back-end data would play in the development of instructional design in MOOCs. Participant E2 stated, “With analytics on large numbers of learners, designers will recognize which activities and learning modules are working well and which need to be revised.” Added participant E15, “If the feedback loop is set up properly to gather the right data to answer questions about design, it is a good mechanism for improvement through redesign.”
Instructional design is the practice of building learning events to assist a student’s mastery of content, a discipline heavily influenced by cognitive science (Mayer, 1992). One criticism of instructional design targets its systematic/instrumentalist worldview (Gordon & Zemke, 2000), which centers the designer’s objectives rather than a student-centered process of development. Such concerns were echoed in the comments from dissenting voices, as well as by some experts in agreement with the prompt. Focused specifically on MOOCs, participant E6 stated, “…the typical college student does not participate in a MOOC (only 3% of college students have taken a MOOC), so the data collected in MOOCs cannot be easily generalized to the whole population of college students.” Participant E14’s criticism was more generalized:
…most of the data gathered is in response to questions or cues formulated not by learners but by designers and instructors. Designers and instructors do not inherently understand learning. They understand design and instruction. I have worked in online learning for over a decade now, and I have yet to see statistics or data generated by an online course that had [at] their center the learner’s interest. We want to know if we’re winning at instruction, and so we gather data that answers that question. But these sorts of assessments don’t measure learning, they measure instruction.
Perhaps this is why much of the positive response to the prompt was muted or reserved, as experts wrestled with overlapping theoretical approaches to learning. Participant E7 stated, “…information about where students struggle is useful. However, it will not obviate the need for guidance – I think the information sets are too complex and interrelated to be amenable to solutions by machine.”
Gordon, J., & Zemke, R. (2000). The attack on ISD. Training, 37(4), 44-53.
Mayer, R. (1992). Cognition and instruction: Their historic meeting within educational psychology. Journal of Educational Psychology, 84(1), 405-412.