I’ve been super-busy with a consulting gig over the last few weeks and have fallen off the wagon with my blogging. This is one of the many reasons that I am grateful to have Phil as my prolific yet profound co-publisher and that we have attracted a group of terrific featured bloggers.
Anyway, I thought I would get back to business with a long-overdue post about D2L’s learning analytics product, called “Insights.” There are several pieces to the product, but I’m going to focus on the component that they call the Student Success System. I have blogged from time to time about Purdue’s Course Signals project (now commercialized in an offering from Ellucian) as having set the bar for student retention analytics. More recently, I wrote about Blackboard’s Retention Center, which is clearly following in Purdue’s footsteps. My impression of Retention Center is that it is a reasonable Version 1 product that captures some but not all of the value of Course Signals.
D2L’s Student Success System also follows in Purdue’s footsteps. But rather than simply playing catch-up, I would call it an incremental but meaningful improvement over Course Signals in most aspects. From what I can tell based on initial conversations with D2L about the product details, this now appears to be the system to beat. (I reserve the right to change my opinion based on implementation experiences from clients, which are particularly important for this product.)
Let’s start by reviewing the Purdue model:
- Purdue discovered that student retention and completion could be strongly predicted by just a few generic indicators within the LMS, such as recency of login, participation in class discussions, assignments turned in on time, and overall grades.
- The accuracy of early warning predictors could be improved by taking into account a few longitudinal indicators from the SIS, such as GPA and entrance exam scores.
- Students who are at risk generally aren’t good at knowing when they are in trouble and should seek help.
- When students are given an easy-to-read, automated indicator telling them that they are in trouble and should seek help, based on the data elements described above, at-risk students tend to learn help-seeking behaviors and move themselves out of the at-risk category.
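The model above can be sketched as a simple scoring function. To be clear, this is a toy illustration of the general idea, not Purdue's actual model: the indicator names, weights, and thresholds below are all invented for the sake of the example.

```python
# Toy sketch of a Course Signals-style risk flag. All weights and thresholds
# here are invented for illustration; Purdue's real model is more sophisticated
# and statistically fitted to historical data.

def risk_signal(days_since_login, discussion_posts, on_time_ratio, grade_pct,
                gpa=None):
    """Combine a few LMS (and optionally SIS) indicators into a flag."""
    score = 0.0
    score += 1.0 if days_since_login > 7 else 0.0   # recency of login
    score += 1.0 if discussion_posts < 2 else 0.0   # discussion participation
    score += 1.0 if on_time_ratio < 0.8 else 0.0    # assignments turned in on time
    score += 1.0 if grade_pct < 70 else 0.0         # overall grade
    if gpa is not None:                             # longitudinal SIS indicator
        score += 0.5 if gpa < 2.5 else 0.0
    if score >= 2.5:
        return "red"
    if score >= 1.0:
        return "amber"
    return "green"

# A student who logs in daily, posts often, submits on time, and has a B average:
print(risk_signal(1, 5, 0.95, 85, gpa=3.2))  # green
```

The point of the design is the last bullet above: the student sees the flag, not the underlying arithmetic, and the flag is a prompt to seek help.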
Both Blackboard and D2L have focused mainly on the first two bullet points, while designing their products to be more teacher-focused and broader in purpose than Course Signals. This is a perfectly sensible product design trade-off to make, although it does have some consequences that I will get to later in the post.
There are a few areas where D2L’s product stands out. First, they appear to have put quite a bit of thought and research into their algorithms and have even published an academic paper on their work. (Paywall; sorry.) One of the tricky problems with this sort of analytics product is the balance between predictive power and generalizability. Faculty and students use the LMS very differently from course to course. On the one hand, this suggests that the data indicators of an underperforming student in a comparative literature class might be quite different from those in a physics class, or even from one teacher to another within the same subject. On the other hand, it takes time, expertise, and a lot of data to fine-tune a statistical model and make sure that it has any real predictive power. Trying to develop predictive capabilities for a 30-person course based on a semester or two of data isn’t going to work.
This is genuinely hard stuff. Blackboard’s strategy seems to have been to stick to the least-common-denominator indicators, where we know there is some predictive power across a wide variety of course contexts (particularly if you compare a student’s performance on that indicator versus the class average). D2L, on the other hand, has looked across a broader variety of indicators and strung them together into several clusters of concerns—namely, Attendance, Preparation, Participation, and Social Connectedness. This is not blind data mining; these category names make it clear that D2L is coming to the data with relatively specific sets of hypotheses about what sorts of data would be predictors of student success. But they are precisely that: sets of hypotheses. If I understand their approach correctly, D2L is testing related hypotheses against each other to derive an aggregate predictor for each of these areas. I am not in a position to evaluate how much of a practical difference this approach makes versus Bb’s simpler one in terms of accuracy, but it is certainly a more sophisticated approach. Likewise, D2L’s product can take inputs from the SIS while, as far as I can tell, Bb’s retention product cannot. (Blackboard’s full Analytics product does take SIS data, but there does not appear to be integration with the Retention Center feature of Learn in the current release.)
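One simple way to picture "testing related hypotheses against each other to derive an aggregate predictor" is to weight each candidate indicator by how well it predicted past outcomes, then collapse each cluster of indicators into a single score. To be clear, everything below is my own illustrative guess at the general shape of such an approach: the cluster and indicator names and the correlation-based weighting are assumptions, not D2L's published method.

```python
# Hypothetical sketch of cluster-level aggregate predictors. Cluster names,
# indicator names, and the weighting scheme are all assumptions for
# illustration; D2L's actual algorithms are not described here.

CLUSTERS = {
    "Attendance":    ["logins_per_week", "content_views"],
    "Participation": ["discussion_posts", "replies_received"],
}

def pearson(xs, ys):
    """Pearson correlation, used as a crude stand-in for predictive power."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def fit_weights(history, outcomes):
    """Weight each candidate indicator by its correlation with past outcomes.

    history:  list of per-student dicts of indicator values
    outcomes: list of past course outcomes (e.g., final grades)
    """
    return {
        cluster: {ind: pearson([row[ind] for row in history], outcomes)
                  for ind in indicators}
        for cluster, indicators in CLUSTERS.items()
    }

def cluster_scores(student, weights):
    """Collapse one student's indicators into a single score per cluster."""
    return {
        cluster: sum(w * student[ind] for ind, w in ws.items())
        for cluster, ws in weights.items()
    }
```

The payoff of clustering, as I read it, is interpretability: a teacher sees that a student is weak on "Participation" rather than staring at a dozen raw indicators.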
Another innovative aspect of D2L’s offering is the balance between product and service. One advantage of the sophisticated modeling is that it can be used to create more specific models—maybe not for that 30-person comp lit course, but quite possibly for that survey biology course that has 1,500 students a semester across multiple sections. For a critical bottleneck course, having a more accurate predictive model that helps you actually get more students successfully through the course could be a big deal. But to do that, you need both tools that the school can use to develop the model and the training and support to help staff learn how to do it. D2L is offering both. I am very interested in learning more about the practical ability for schools to put them to use. If this works well, it could be a big deal.
Finally, D2L has put some work into the visualizations to help teachers make sense of the data that they are getting. They provide considerably more information than the red/amber/green lights that Course Signals provides, although at some cost to readability at a glance.
The primary focus here is on giving teachers more actionable information on how their students are doing.
The only piece that’s really missing relative to Purdue is the tested student interventions. Purdue’s product is designed specifically to teach students some meta-cognitive skills. While the raw ingredients are there for D2L’s product to do the same, they do not yet have the refined and tested student-facing dashboards and call-to-action messages that Course Signals does.
So, overall, this looks like a very nice product that adds to the state of the art. But the devil is in the (implementation) details. I’m going to try to talk to some D2L clients who are actually using this product next week when Phil and I are at D2L FUSION.