In my last post, I promised that I would give an update specifically on the state of Blackboard’s learning analytics. Well, here you go. This is a summary of what I learned about their product from a chat with Mark Max, Blackboard’s VP of Learning Analytics, and, to a lesser degree, with VP of User Experience Stephanie Weeks. I wrote about Blackboard’s Retention Center product some time ago. That product (or feature set, since it is free in Blackboard) directly competes with Desire2Learn’s Student Success System. This post is more broadly about their Analytics product suite, which is most directly analogous to Desire2Learn’s Insights product, although it is actually much, much broader in scope.
The short version is this: Blackboard has a very solid and reliable technology base from which they are building their learning analytics. It is easily the most mature platform among the LMS providers from that perspective. What they are a little short on is vision. In other words, they are pretty much the mirror image of Desire2Learn.
iStrategy
In order to understand Blackboard’s analytics position, you have to know a little bit about Mark Max. Mark was a manager at PricewaterhouseCoopers in the 1980s and a finance director at one of the BlueCross BlueShields in the 1990s. Those were pretty much the perfect jobs to get early exposure to analytics technologies (which in those days meant mainly data warehouses and OLAP). In the early days, these technologies were invariably big, complex, expensive, and only used by the largest of companies. The focus of the technologies in those days was on rapid and accurate reporting. Data mining, which is one of the foundational approaches that led to machine learning, was starting to become something that people talked about beyond research circles, but it didn’t really hit its stride until the 2000s.
At the time that Mark co-founded iStrategy in 1999, one of the trends in data warehouses was to build focused, pre-built products that ran a bunch of useful reports out-of-the-box and cost at most 25% of what a big, generic enterprise data warehouse system would cost. There were two basic elements to the value proposition: cost and speed. First, coming in with a more limited product that did exactly (and mostly only) what customers wanted brought the cost down, sometimes by as much as an order of magnitude, thus making it affordable for the first time to organizations that previously couldn’t have considered it. Second, it made implementation much faster and demanded a lot less of the implementing organization. It could be up and running in weeks or months rather than years. Again, the primary value of these warehouses was reporting, by which I mean collating data from different IT systems into timely and accurate reports. For example, if a campus wants to see the average class size by department or subject, or the average in-major GPA by academic cohort, that information takes a lot of time and effort to get without a reporting system, and it is often stale and inaccurate if that system isn’t pulling directly from the various systems of record. This is largely boring stuff that has nothing directly to do with learning, but uninteresting is not the same as unimportant. Many of these reports are necessary to help ensure that the college is healthy and its students successful. And by all accounts, iStrategy did and does this job very well. When I worked at Oracle in the higher education division, that group had a very healthy respect for the company and the product. So when Blackboard acquired iStrategy in 2011, they bought a company with a very mature product and an executive with a proven track record in old-school analytics.
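To make that concrete, here is a minimal sketch of the kind of collation such a warehouse automates, using pandas and entirely hypothetical table and column names (this is not iStrategy’s actual schema):

```python
import pandas as pd

# Hypothetical extracts from two systems of record (a course catalog
# and an enrollment table); a real warehouse pulls these on a schedule.
sections = pd.DataFrame({
    "section_id": [101, 102, 201, 202],
    "department": ["Biology", "Biology", "History", "History"],
})
enrollments = pd.DataFrame({
    "section_id": [101, 101, 102, 201, 201, 201, 202],
    "student_id": [1, 2, 3, 4, 5, 6, 7],
})

# Average class size by department: count enrollments per section,
# then average those counts within each department.
class_size = enrollments.groupby("section_id").size().rename("size")
report = (sections.join(class_size, on="section_id")
                  .groupby("department")["size"].mean())
print(report)  # Biology: 1.5, History: 2.0
```

The point isn’t the dozen lines of code; it’s that the warehouse runs many of these queries against fresh pulls from the systems of record on a schedule, which is what keeps the reports from going stale.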
Learning Analytics
When you talk to Mark about what can be done with learning analytics, he definitely talks in terms that reflect his background. A lot of the ideas that he brings up are correlative, such as the ratio of instructor posts to student posts. It’s all about juxtaposing different bits of information in different ways that make you think. On the other hand, he seems almost dismissive of machine learning techniques, which he referred to several times as “black box analytics.” He talked about the high correlation between student activity in the LMS and performance, and expressed skepticism that any fancy-pants machine learning was going to squeeze a whole lot more predictive juice out of the data than simple correlation would.
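To illustrate what that correlation-first view looks like in practice, here is a toy example with made-up numbers (not Blackboard’s data or methods):

```python
import pandas as pd

# Hypothetical per-student data: LMS activity and final grades.
df = pd.DataFrame({
    "lms_logins":  [3, 12, 25, 8, 30, 15, 5, 22],
    "final_grade": [55, 70, 88, 62, 92, 78, 58, 85],
})

# The "simple correlation" view: a single number summarizing how
# strongly activity tracks performance, with no model training at all.
print(df["lms_logins"].corr(df["final_grade"]))  # Pearson's r, near 1 here
```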
This is a fundamentally different approach from D2L’s emphasis on machine learning. In the latter case, the idea is that if we teach the machines the distinctions that we have figured out through the kind of data juxtapositions that a tool like an OLAP cube allows, and then have the machine do statistical analysis based on those insights across a whole range of data, it may find new and important juxtapositions or relationships that we would miss with the human eye. For example, we may have some insights about how student preparation, effort, and help-seeking behaviors contribute to student success, and we may have some ideas about how those dimensions get represented in the data. But the machine may be able to figure out that, for example, a student’s residential distance from campus is more predictive of her academic success at a particular institution than her student loan load (maybe because of the particular challenges of getting physically to that campus). Or maybe there are three different factors that, when looked at in combination, have most of the predictive value of the student’s background coming into the class. Those aren’t the sorts of learning analytics questions that Blackboard gravitates toward. I don’t want to overstate the case; I have no reason to doubt that Mark and his staff have experience with data mining and similar technologies. But it’s not what they think of as a core value proposition. This mindset shows up in Retention Center, which is clearly a reporting tool rather than the predictive tool that D2L’s Student Success System aspires to be.
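For contrast, here is a minimal sketch of the machine-learning approach on synthetic data, where (by construction) distance from campus drives the outcome and loan load is noise; none of this reflects either vendor’s actual models:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200

# Purely illustrative features, echoing the example above: in this toy
# data, only distance from campus actually affects completion.
distance = rng.uniform(0, 50, n)        # miles from campus
loan_load = rng.uniform(0, 40_000, n)   # dollars borrowed
completed = (distance + rng.normal(0, 10, n) < 25).astype(int)

# Fit a simple model across both features and see which one it weights.
X = StandardScaler().fit_transform(np.column_stack([distance, loan_load]))
model = LogisticRegression().fit(X, completed)
print(dict(zip(["distance", "loan_load"], model.coef_[0].round(2))))
# The distance coefficient comes out large and negative; loan_load stays
# near zero -- the machine surfaces the relationship rather than a human
# having to think to test it.
```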
Speaking of which, I find it interesting that despite these two very different approaches by Blackboard and Desire2Learn, and despite the obvious influence of Purdue’s work on both of them, neither company chose to imitate Course Signals in terms of being student-focused and finely honed to drive course completion. My first instinct was that both companies had the common problem of simply not talking to enough customers, but I no longer believe it’s that simple. Stephanie Weeks told me that Blackboard focused the first release of Retention Center on teachers because “we believe in teachers and trust them to do the right thing with the information we’re giving them.” That’s a fine sentiment, but I suspect that it is not the whole story. I often get asked when I think that LMS providers will build more student-centric systems. My answer is usually something along the lines of, “When students become influential on LMS selection committees.”
That aside, I think we are set for an interesting battle between Blackboard and Desire2Learn in the area of analytics over the next year or two. Both companies are coming to the table with substantial and very different strengths.
lauragibbs says
Thanks as always for the reporting… and, as always, I find this so totally alien to the data approach that I would take if I could wave my magic wand and turn Desire2Learn into a flexible system for gathering data that I would want to know, and for reporting that data both to me and to the students. The most important reporting, in my opinion anyway, is the data that goes straight back to the students, independent of teacher or institutional mediation. Thank you for talking about that issue here. It troubles me A LOT. And for MOOCs, it’s obvious that we have to be student-centric. So maybe that will be an issue that the MOOCs will affect positively (and heck, we should get at least something good out of all the MOOC frenzy).
Here’s my question: for the kinds of classes I teach, it’s obvious to me that the key to student success is ENGAGEMENT (are students bored, or not?) and TIME MANAGEMENT (are students making good choices about how they spend their time, and are they spending enough time?). I’m guessing that the systems can try to do a good job of monitoring time spent, coming up with measures of student time spent on different tasks, proximity of that time to the deadline associated with a task, etc. (although D2L gives me NOTHING like that right now…) – but what about engagement? Has anybody built a system that automatically facilitates student rating of everything they do on a simple scale, with something like “Did you enjoy this assignment?” (no, sort of, yes, very much) and “Did you learn something from this assignment?” (not sure, no, maybe, yes). That’s something I would love to see, and I have not seen it anywhere.
lauragibbs says
P.S. I should add: that kind of data is not so much useful to the students, but it would be incredibly useful to me, the teacher. How can I know where to target my efforts to improve the course and course materials if I don’t know what the students think about each and every item…? And a big part of improving student success in a course is not about changing the students so much as it is about changing the course to make it better! Again, IMHO. 🙂
Education Cowboy says
Michael,
Thanks, as always, for providing such detailed and lucid information on this topic. Also, to Laura, I agree with your comment on the key to student success being engagement. Almost a decade ago, Andy Peterson and I presented a paper on environmental scanning as part of teaching online courses, and the tools we had available to measure and influence real student engagement. I find it somewhat fascinating that we have progressed so far in some ways but continue to struggle in others where this is concerned.
Michael Feldstein says
I’ll start by reposting the reply I gave to Laura on G+:
Rob, I agree that the sort of amazing progress/utter lack of progress split is somewhat head-spinning.
Chris Munzo says
You have to have the LMS to get the embedded analytics. I would like to hear from Blackboard and D2L management about whether the analytics are key selection criteria in winning new clients and keeping existing ones. Or are they just one of many features that are equally weighted in LMS evaluations? In other words, is the inevitable increase in the base price of the LMS worth the investment?
Eilif Trondsen says
Thanks for addressing a topic that obviously gets a lot of interest these days, Michael. And perhaps in a future post, you can take a shot at analytics in a MOOC context, and perhaps review what the LyticsLab.org at Stanford (and similar “labs” that I suspect other institutions are setting up) plan to do, or hope to accomplish, with all the MOOC data they hope to capture.
I was intrigued by the contrasts you presented about the D2L and Bb analytics “world view” and approach, and I am curious about how Instructure/Canvas compares to D2L and Bb in terms of what they offer and how they approach analytics.
I assume you saw the piece by Joshua Kim the other day in IHE on Abilene Christian University and their decision to go with OpenClass. Pearson is of course very focused on analytics, and is working closely with Knewton on the ASU project, but I wonder if OpenClass provides much in terms of analytics?
I look forward to future posts and enjoy everything you and Phil write.
Jon says
Michael,
As always, I appreciate your reporting and look forward to learning more about how all the LMS providers are approaching the Analytics topic.
I am trying to get a grip on exactly what value D2L is bringing to the table. I have been using D2L for the past few semesters. Prior to that, our institution used ANGEL for 6-7 years. To be sure, this was not a popular move at our institution, due to the loss of functionality and the unintuitive user interface many of us power users have discovered. (I will reserve my rant about the gradebook for another day.)
From my experience, the functions provided to faculty via the D2L “Analytics Portal” do not even approach the basic reporting found in ANGEL. I also use Moodle as a training platform. Again, Moodle’s reporting exceeds the capabilities of D2L.
Inspired by your blog post I decided to do a test. Here is what I found:
On the first day of class, I had 3 student emails in my D2L inbox. Yet according to the D2L Analytics reports, none of my students had logged in. I further compared the Class Progress Report with the User Course Access Report from Analytics and again found the same disconnect.
Data integrity and validation are very important issues. If you are going to make decisions based on the data, it better be accurate. Based on these reports, I would be uncomfortable making any kind of decision that would impact my students’ grades or enrollment status. Since I am not an administrator, I cannot speak to the integrity of the data above the “course level,” but it is safe to assume that accuracy begins at the base level.
To be sure, Learning Analytics is becoming one of the most important issues in education. Data is a great and terrible thing depending on how it is used/interpreted.
Here is a beautiful quote to ponder:
“What makes a scientist great is the care that he takes in telling you what is wrong with his results, so that you will not misuse them.” – W. Edwards Deming
I challenge you in future reporting to take us deeper into all of the Analytics products so we can have a better understanding of their applicability.
Thank you for your continued brilliance in reporting.
Jon
Michael Feldstein says
Thanks, Jon. We are definitely interested in reporting on the kind of deep dive product reviews you are asking for, but it is difficult from both time and access perspectives. We’ll keep thinking about how we might accomplish something like this.
Phil Hill says
Jon, I would add that what you describe fits into, and gives additional details on, Michael’s comment that “We heard some hints of data integrity problems as well, but nothing too specific or definitive” (from D2L Analytics Update post).
You also make great points in saying that “Data integrity and validation are very important issues. If you are going to make decisions based on the data, it better be accurate.” followed by “Data is a great and terrible thing depending on how it is used/interpreted.”