File this under “you read it first on e-Literate”.
In previous posts from spring 2013 I provided a graphical view of MOOC student patterns, based on observed retention over time as well as differing student types. That graphic was based on anecdotal observations of multiple MOOCs, mostly through Coursera.
A recent study of the edX Circuits and Electronics MOOC includes an interesting chart of student patterns based on actual data analysis of the 155,000 students from the spring 2012 offering of the course. The full report is worth reading, by the way, with some real insights into student patterns.
I took this chart and overlaid it on the MOOC student patterns graphic, scaling the vertical axis to 0–100% of enrollment and the horizontal axis to the start and end of the course.
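For anyone who wants to reproduce that kind of overlay, here is a minimal sketch of the rescaling step. The arrays below are purely hypothetical placeholders, not numbers from the study or my graphic; only the normalization logic is the point:

```python
# A minimal sketch of overlaying two retention charts on common axes,
# assuming two hypothetical series: `model` (the original graphic) and
# `observed` (the edX data). All numbers here are illustrative.
import numpy as np
import matplotlib.pyplot as plt

def normalize(weeks, students):
    """Rescale x to [0, 1] (course start..end) and y to [0, 1]
    (0%..100% of enrollment) so the two charts share axes."""
    weeks = np.asarray(weeks, dtype=float)
    students = np.asarray(students, dtype=float)
    x = (weeks - weeks.min()) / (weeks.max() - weeks.min())
    y = students / students.max()
    return x, y

# Hypothetical data, purely illustrative, not the actual study numbers.
model_weeks, model_students = [0, 2, 4, 8, 14], [100, 55, 35, 25, 20]
obs_weeks, obs_students = [0, 1, 3, 7, 14], [155000, 80000, 45000, 30000, 23000]

plt.plot(*normalize(model_weeks, model_students), label="original model")
plt.plot(*normalize(obs_weeks, obs_students), label="edX observed")
plt.xlabel("course progress (start to end)")
plt.ylabel("fraction of enrollment")
plt.legend()
plt.show()
```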
It’s good to see the overall retention pattern validated by real data analysis, augmenting the original graphic’s model.
lauragibbs says
I like the idea that over time, as more data comes in, we can get a more realistic picture of just what is going on in these classes.
In that Coursera Fantasy-SciFi class, which I think has just had its third iteration, we are getting a good picture of what happens when the professor totally loses interest and students in turn realize that the professor has totally lost interest. So content staling over time, as predicted, is a real problem (unlike textbooks, of course, where students don’t expect to feel “connected” to the author, who is just that, the absent author, not the supposed “teacher” of a class actually unfolding in time). I learned about it from this review at MOOC NEWS AND REVIEWS: https://plus.google.com/111474406259561102151/posts/DdtLtGSPGH2
mgozaydin says
In the case of MOOC degree programs, the pattern will be completely different. I estimate at most 20% dropouts, but also 5,000–10,000 students per course taking it for credit and paying $50 or so per course, with 50–60% credit earners.
jwhitmer says
What’s that they say about “great minds”? I’m not sure the overlay is quite as confirming, but there’s another interesting article from faculty at Stanford about different patterns of MOOC participation/persistence that is similar to your typology. Recasting our understanding of retention and persistence is one of the really interesting things to me about MOOCs, along with all that real-time student data we can use to train machine learning models, of course …
Kizilcec, R. F., Piech, C., & Schneider, E. (2013). Deconstructing disengagement: Analyzing learner subpopulations in massive open online courses. Paper presented at LAK ’13, Leuven, Belgium. http://rene.kizilcec.com/wp-content/uploads/2013/02/Kizilcec-Piech-Schneider-2013-Deconstructing-Disengagement-Analyzing-Learner-Subpopulations-in-Massive-Open-Online-Courses.pdf
Phil Hill says
Laura, I wish I had confidence that MOOC data would continue coming in to give this realistic picture. We seem to get some good qualitative data on a consistent basis now (such as your reviews and the referenced sci-fi review), but quite honestly there should be more quantitative data available. MOOCs were supposed to have the benefit of massive amounts of student transactional data, giving the learning process an information source it has been missing. Unfortunately, public releases of this quantitative data are few and far between:
– This MIT / Harvard study (funded by NSF for $200k)
– The individual school analyses, such as Duke, Edinburgh, etc.
– The Stanford typology study that John mentions
– The Coursera EDUCAUSE Review article
Let’s hope that the MOOC Research Initiative (note: I’m on the steering committee) leads to much better data in December.
Phil Hill says
Muvaffak, I would assume that the for-credit status of MOOCs, as well as skin in the game (fees paid), would change the student patterns, as you describe. The SJSU study is due out next week, which might give some insight.
Phil Hill says
John, I did see that Stanford study and liked the use of clustering. That approach allowed them to follow the data rather than start from strong preconceptions, as my approach did. That said, the patterns they found (auditing, completing, disengaging, sampling) correlate to some degree with mine (observers, active participants, drop-outs, drop-ins). The biggest difference I saw was that their auditing cluster seems to be a combination of my observers / passive participants. But I might want to change my definitions to be closer to the Stanford clusters, which have more validation behind them.
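For readers curious what “following the data” looks like mechanically, below is a minimal sketch of trajectory clustering in the spirit of the Kizilcec et al. paper. The engagement encoding and the data are my assumptions for illustration; only the four-cluster count comes from the study:

```python
# A rough sketch of clustering learner engagement trajectories, assuming
# each student is a per-week vector with a hypothetical encoding:
# 0 = absent, 1 = viewed material, 2 = submitted an assessment.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical data: 1,000 students x 14 weeks of engagement codes.
trajectories = rng.integers(0, 3, size=(1000, 14))

# Four clusters, mirroring the paper's auditing / completing /
# disengaging / sampling prototypes (the cluster count is the only
# thing taken from the paper; everything else is an assumption).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(trajectories)

# Each centroid is itself an average engagement trajectory, which is
# what makes the clusters interpretable as student "types".
for i, centroid in enumerate(kmeans.cluster_centers_):
    print(f"cluster {i}: mean engagement by week ->", np.round(centroid, 2))
```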
Peter Shea (@pshea99) says
Fascinating analysis here, Phil. On the subject of the SJSU study, there seems to be a preview on the Udacity blog. http://blog.udacity.com/2013/08/sebastian-thrun-update-on-our-sjsu-plus.html
The new group of students did much better, in some cases outperforming their classroom counterparts. However, the catch is that the new cohort of students was much stronger than the original: 50% already had college degrees, one in four had bachelor’s degrees, and one in five had more advanced degrees. It’s probably a poor sample from which to generalize conclusions about applying MOOC-like experiences (Udacity’s SJSU courses are clearly not your typical free MOOCs) to the difficulties currently faced by the Cal State system…
I also posted a piece on this topic on the HPLO Blog – http://wp.me/P1SQ8K-5J
Phil Hill says
Peter, thanks for the note and the link to your blog. I covered initial thoughts on the SJSU study here, including a reference to that Udacity blog post. Ironically, the Udacity post had the most data of anything published so far:
https://eliterate.us/sjsu-plus-udacity-pilots-lack-of-transparency/
Peter Shea (@pshea99) says
Wow – excellent analysis on the lack of transparency, Phil… much more thorough than what I had to say!