In my last post, I promised that I would give an update specifically on the state of Blackboard's learning analytics. Well, here you go. This is a summary of what I learned about their product from a chat with Mark Max, Blackboard's VP of Learning Analytics, and, to a lesser degree, with Stephanie Weeks, VP of User Experience. I wrote about Blackboard's Retention Center product some time ago. That product (or feature set, since it is free in Blackboard) directly competes with Desire2Learn's Student Success System. This post is more broadly about their Analytics product suite, which is most directly analogous to Desire2Learn's Insights product, although it is actually much, much broader in scope.
The short version is this: Blackboard has a very solid and reliable technology base from which they are building their learning analytics. It is easily the most mature platform among the LMS providers from that perspective. What they are a little short on is vision. In other words, they are pretty much the mirror image of Desire2Learn.
In order to understand Blackboard's analytics position, you have to know a little bit about Mark Max. Mark was a manager at PricewaterhouseCoopers in the 1980s and a finance director at one of the BlueCross BlueShields in the 1990s. Those were pretty much the perfect jobs to get early exposure to analytics technologies (which in those days meant mainly data warehouses and OLAP). In the early days, these technologies were invariably big, complex, expensive, and only used by the largest of companies. The focus of the technologies in those days was on rapid and accurate reporting. Data mining, which is one of the foundational approaches that led to machine learning, was starting to become something that people talked about beyond research circles, but it didn't really hit its stride until the 2000s.
At the time that Mark co-founded iStrategy in 1999, one of the trends in data warehouses was to build focused, pre-built products that ran a bunch of useful reports out-of-the-box and cost at most 25% of what a big, generic enterprise data warehouse system would cost. There were two basic elements to the value proposition: cost and speed. Coming in with a more limited product that did exactly and (mostly) only what customers wanted at first brought the cost down, sometimes by as much as an order of magnitude, thus making it affordable for the first time to organizations that previously couldn't think about it. But it also made implementation much faster and demanded a lot less of the implementing organization. It could be up and running in weeks or months rather than years. Again, the primary value of these warehouses was reporting, by which I mean collating data from different IT systems into timely and accurate reports. For example, if a campus wanted to see the average class size by department or subject or the average in-major GPA by academic cohort, this information takes a lot of time and effort to get without a reporting system, and is often stale and inaccurate if that system isn't pulling directly from the various systems of record. This is largely boring stuff which has nothing directly to do with learning, but uninteresting is not the same as unimportant. Many of these reports are necessary to help ensure that the college is healthy and its students successful. And by all accounts, iStrategy did and does this job very well. When I worked at Oracle in the higher education division, that group had a very healthy respect for the company and the product. So when Blackboard acquired iStrategy in 2011, they bought a company with a very mature product and an executive with a proven track record in old-school analytics.
When you talk to Mark about what can be done with learning analytics, he definitely talks in terms that reflect his background. A lot of the ideas that he brings up are correlative, such as the ratio of instructor posts to student posts. It's all about juxtaposing different bits of information in different ways that make you think. On the other hand, he seems almost dismissive of machine learning techniques, which he referred to several times as "black box analytics." He talked about the high correlation between student activity in the LMS and performance, and expressed skepticism that any fancy-pants machine learning was going to squeeze a whole lot more predictive juice out of the data than simple correlation would.
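To make the contrast concrete, the correlation-style approach Mark describes amounts to computing something like Pearson's r between an activity measure and an outcome measure. Here's a minimal sketch; the login and grade numbers are entirely invented for illustration, not real LMS data.

```python
# Sketch of the simple-correlation approach: Pearson's r between
# LMS activity and course performance, on made-up toy data.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

logins = [3, 12, 7, 15, 2, 9, 11, 5]       # hypothetical weekly LMS logins
grades = [62, 91, 74, 95, 55, 80, 88, 68]  # hypothetical final course grades

r = pearson(logins, grades)
print(f"r = {r:.2f}")  # strongly positive in this toy data
```

A single number like this is easy to compute, easy to explain to an instructor, and hard to beat as a first-order signal, which is exactly the point Mark was making.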
This is a fundamentally different approach than D2L's emphasis on machine learning. In the latter case, the idea is that if we teach the machines the distinctions that we have figured out through the kind of data juxtapositions that a tool like an OLAP cube allows, but then have the machine do statistical analysis based on those insights across a whole range of data, then it may find new and important juxtapositions or relationships that we miss with the human eye. For example, we may have some insights about how student preparation, effort, and help-seeking behaviors contribute to student success, and we may have some ideas about how those dimensions get represented in the data. But the machine may be able to figure out that, for example, a student's residential distance from campus is more predictive of her academic success at a particular institution than her student loan load (maybe because of the particular challenges of getting physically to that campus). Or maybe there are three different factors that, when looked at in combination, carry most of the predictive value of the student's background coming into the class. Those aren't the sorts of learning analytics questions that Blackboard gravitates toward. I don't want to overstate the case; I have no reason to doubt that Mark and his staff have experience with data mining and similar technologies. But it's not what they think of as a core value proposition. This mindset shows up in Retention Center, which is clearly a reporting tool rather than the predictive tool that D2L's Student Success System aspires to be.
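The machine-learning approach, by contrast, fits a model over several candidate features at once and lets the fitted weights reveal which ones actually carry predictive signal. Below is a toy sketch of that idea using a hand-rolled logistic regression (no real library, no real data): the records, the distance-from-campus and loan-load features, and the pass/fail outcomes are all invented so that distance happens to be the informative feature, echoing the hypothetical in the text.

```python
import math

# Hypothetical student records, invented for illustration:
# (miles from campus, student-loan load in $k, 1 = passed the course)
data = [
    (2, 30, 1), (5, 10, 1), (3, 45, 1), (4, 20, 1), (6, 35, 1),
    (25, 15, 0), (30, 40, 0), (22, 25, 0), (35, 10, 0), (28, 30, 0),
]

def standardize(col):
    """Rescale a feature to zero mean and unit variance."""
    mean = sum(col) / len(col)
    std = (sum((v - mean) ** 2 for v in col) / len(col)) ** 0.5
    return [(v - mean) / std for v in col]

dist = standardize([d for d, _, _ in data])
loan = standardize([l for _, l, _ in data])
y = [p for _, _, p in data]

# Logistic regression fit by plain full-batch gradient descent.
w_dist = w_loan = b = 0.0
lr, n = 0.5, len(y)
for _ in range(2000):
    grad_d = grad_l = grad_b = 0.0
    for xd, xl, yi in zip(dist, loan, y):
        p = 1.0 / (1.0 + math.exp(-(w_dist * xd + w_loan * xl + b)))
        err = p - yi  # gradient of log-loss w.r.t. the linear score
        grad_d += err * xd
        grad_l += err * xl
        grad_b += err
    w_dist -= lr * grad_d / n
    w_loan -= lr * grad_l / n
    b -= lr * grad_b / n

# On standardized features, the weight magnitudes are comparable, so the
# model itself surfaces distance as the feature doing the predictive work.
print(f"distance weight: {w_dist:+.2f}, loan weight: {w_loan:+.2f}")
```

The point of the sketch is not the algorithm but the workflow: you feed in every feature you have a hunch about, and the fitting procedure, rather than a human eyeballing an OLAP cube, decides which juxtapositions matter.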
Speaking of which, I find it interesting that despite these two very different approaches by Blackboard and Desire2Learn, and despite the obvious influence of Purdue's work on both of them, neither company chose to imitate Course Signals in terms of being student-focused and finely honed to drive course completion. My first instinct was that this was a problem common to both companies of simply not talking to enough customers, but I no longer believe it's that simple. Stephanie Weeks told me that Blackboard focused the first release of Retention Center on teachers because "we believe in teachers and trust them to do the right thing with the information we're giving them." That's a fine sentiment, but I suspect that it is not the whole story. I often get asked when I think that LMS providers will build more student-centric systems. My answer is usually something along the lines of, "When students become influential on LMS selection committees."
That aside, I think we are set for an interesting battle between Blackboard and Desire2Learn in the area of analytics over the next year or two. Both companies are coming to the table with substantial and very different strengths.