A few weeks back, when blogging about Desire2Learn's new Student Success System, a retention early warning analytics product, I wrote,
From what I can tell based on initial conversations with D2L about the product details, this now appears to be the system to beat. (I reserve the right to change my opinion based on implementation experiences from clients, which are particularly important for this product.)…
I’m going to try to talk to some D2L clients who are actually using this product next week when Phil and I are at D2L FUSION.
And then the other day, describing my overall impressions of the FUSION conference, I wrote,
Historically, my take on D2L has been as follows: On the good side, they have coherent product vision and their own take on the LMS space—I have a lot of respect for Ken Chapman as a product guy—and good relationships with their customers….On the bad side, D2L has not always been particularly good at executing technically difficult projects, and they have not always had a good sense of how well they are performing in that regard—what they have achieved, how long it will take them to deliver functionality, how serious the problems are, and so on….
[T]he point is that D2L seems to be making hires that have the potential to shore up their historic weaknesses. In the next 12-24 months, we will see whether these new hires are empowered to make a real difference in D2L’s performance.
It turns out that analytics is a perfect example of the overall problems I described in the latter post: good product vision, serious technical implementation problems that have been understated and probably underestimated, and relatively new technical managers who seem to have some hope of getting the issues straightened out.
The Lack of Insights
In my first post on D2L's analytics product suite, I mostly wrote about the Student Success System (or S3), which is the component that follows in the footsteps of Purdue's Course Signals. But S3 is just the tip of the analytics iceberg. The core system that enables S3 to do its magic is called Insights. Analytics engines generally have three basic functions: (1) getting the data from the source systems, (2) storing the data in a way that makes running queries over massive amounts of information practical, and (3) actually generating useful reports from the data. Each of these functions is hard to do in a performant way, in large part because of the enormous quantities of data that need to be moved and manipulated quickly. All in all, the technical challenges of building a full-blown analytics system are probably an order of magnitude harder than those of building an LMS. (This is a gross overgeneralization, but it's close enough to being true for our purposes.) So it would not be terribly surprising to learn that D2L—a company that has struggled at times with complex technical challenges—would hit some bumps in the road while building out an analytics platform.
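To make those three functions concrete, here is a toy sketch in Python. It is purely illustrative—the schema and event names are my own invention, not anything from D2L—and the point is only to show the shape of the pipeline:

```python
# A minimal, purely illustrative sketch of the three functions described
# above: (1) extract, (2) store, (3) report. This is not D2L's
# implementation; the names and schema are hypothetical.
import sqlite3

def extract(source_rows):
    """(1) Pull events out of the source system (here, an in-memory list)."""
    for row in source_rows:
        yield (row["student_id"], row["event"], row["timestamp"])

def load(db, events):
    """(2) Store events so that queries over large volumes stay practical."""
    db.execute("CREATE TABLE IF NOT EXISTS events "
               "(student_id TEXT, event TEXT, ts TEXT)")
    db.executemany("INSERT INTO events VALUES (?, ?, ?)", events)
    db.commit()

def report(db):
    """(3) Generate a useful report -- e.g., login counts per student."""
    return db.execute(
        "SELECT student_id, COUNT(*) FROM events "
        "WHERE event = 'login' GROUP BY student_id").fetchall()

source = [
    {"student_id": "s1", "event": "login", "timestamp": "2013-07-01"},
    {"student_id": "s2", "event": "login", "timestamp": "2013-07-01"},
    {"student_id": "s1", "event": "login", "timestamp": "2013-07-02"},
]
db = sqlite3.connect(":memory:")
load(db, extract(source))
print(report(db))  # e.g., [('s1', 2), ('s2', 1)]
```

At this scale, each step is trivial. The engineering challenge is doing the same three things over millions of rows, from many source systems, quickly and without errors.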
And they have. Significant ones. Phil and I managed to talk to a few Insights customers, both before and during the conference. What we heard was pretty serious. The main problem isn’t the functional design; it’s the performance. The system is way too slow, both in terms of getting the data to work with and in terms of rendering the reports. (We heard some hints of data integrity problems as well, but nothing too specific or definitive.) The issues are severe enough to go beyond annoyances to the point where they are impacting customer go-forward plans with the product. The consistent story we heard was that customers were having to completely rethink schedules for their analytics-based initiatives, or even put them on hold altogether, because they couldn’t get Insights to do what they needed it to do. I’m not in a position to say that this is the consensus view of all Insights customers—in fact, I doubt that it is—but we got enough from our sample that I’m convinced this is a pattern and not a fluke.
The analytics product team at Desire2Learn acknowledged that there are performance issues. They also provided us with some details about where they believe the problems are and how they intend to address them. According to D2L, the current performance problems are in the first and third parts of the system that I described above. In terms of getting the data from the source systems, Insights' Extract-Transform-Load (ETL) process can take 24 hours or more. This is the area where the company appears to have made the most progress; they claim to have already fixed the problem for the majority of their hosted customers. They also acknowledge significant issues with the reporting system. D2L has told us that they expect to solve the problems in both of these components by the end of the year, and that their next product release will have no new features, focusing solely on performance improvements.
They do not believe that the middle component—data storage—is contributing to the performance problems. That said, they have aspirations a little further down the road to move from a pure data warehouse design to something a little more heterogeneous, in part to make sure that they don't hit a bottleneck in that part of the product going forward. I'm going to try to avoid going down a technical rabbit hole here, but briefly, it's worth noting that there is an old school way of doing analytics data storage (think Oracle) and a new school way (think Google). The old school way is very performant once you get the data into the system, but getting the data in can be slow, and it's also relatively inflexible about letting you ask questions that you hadn't thought of at design time. The new school way can be much more performant at getting at data (particularly unstructured data) from source systems and lets you ask new questions whenever you think of them, but it's often not as performant as the old school way at answering the questions that the particular data warehouse was designed to answer. That's why the two methods are often employed together. Insights is currently pure old school. Adding a new school Hadoop-style component will both help D2L get to the data faster, particularly when dealing with sources like web server log files, and broaden the capabilities of the product.
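For the curious, here is a toy Python illustration of that distinction. Again, this is my own invention, not D2L's architecture: the old school warehouse answers its pre-designed question instantly from a rolled-up table, while the new school approach keeps the raw log lines around and can answer a question nobody anticipated at design time.

```python
# Purely illustrative contrast between the two storage styles described
# above. Not D2L's system; data and questions are hypothetical.
import sqlite3
from collections import Counter

# Old school: the question ("logins per week") was designed in, and an
# ETL job has already rolled the data up into an aggregate table, so
# the answer comes back fast.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE weekly_logins (student_id TEXT, week INT, logins INT)")
db.execute("INSERT INTO weekly_logins VALUES ('s1', 1, 5)")
fast_answer = db.execute(
    "SELECT student_id, logins FROM weekly_logins WHERE week = 1").fetchall()

# New school: keep the raw web-server log lines and ask a brand-new
# question (say, "hits per browser") whenever you think of it -- no
# schema change required, just a scan over the unstructured source.
raw_log = [
    "203.0.113.5 GET /d2l/home Firefox",
    "203.0.113.9 GET /d2l/quiz Chrome",
    "203.0.113.5 GET /d2l/quiz Firefox",
]
new_question = Counter(line.split()[-1] for line in raw_log)

print(fast_answer)   # [('s1', 5)]
print(new_question)  # Counter({'Firefox': 2, 'Chrome': 1})
```

The trade-off is visible even in the toy: the warehouse can only answer questions its tables were designed for, while the raw scan can answer anything but has to touch every line to do it.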
Responsiveness
I am not an expert in the architecture of analytics systems, but everything D2L has been telling us about their plans seems credible to me. The customers we talked to apparently feel similarly. We are not getting the same vibe that we got from Blackboard customers five or six years ago, before Ray Henderson arrived. They are, by and large, willing to cut the company some slack. The main frustration seems to be that they were given the impression that the product is further along than they feel it actually is. They also seem to have a reasonably high level of trust in the product team to work out the issues. One customer specifically mentioned that the product originated before the current product leadership arrived, implying that at least some of the issues the team is sorting through are ones that they inherited. The bigger question is whether the top company leadership will start providing more conservative and realistic characterizations of the state of the product line.
There is a secondary issue that's also worth bringing up. One of the comments we got from D2L's Insights product team is that different customers have very different use cases in mind for the software and therefore very different expectations for the product. I'm sure that's true, but it suggests that D2L could do a better job of focusing on a narrower range of use cases (and customers) and satisfying those first. There are hints of this problem in the design of S3 as well. For example, rather than providing a broad range of visualizations for different purposes, D2L could have made the main goal of the first release to promote help-seeking behavior the way Course Signals does, and spent some of that development effort on creating alert messages aimed at producing measurable changes in retention outcomes. In the current release, they check a lot of boxes in terms of features that could be useful, but I'm not sure that they've quite hit the bull's eye for solving any one specific teacher or student problem.
It would be hard to overstate just how far-reaching the effects of such seemingly philosophical development choices can be. Even the perception of whether performance is "acceptable" is driven by the user's goals. There are plenty of analytics use cases for which working with data that is a day or even a week old is just fine. Retention early warning analytics is not one of them. If you're trying to identify an at-risk student within the first three weeks of class, then being days behind in the data is simply unacceptable. If D2L wanted to focus on delivering the best early warning system in the industry, they could have sacrificed effort put into features or reports and invested more developer time in ETL performance. If they anticipated that ETL performance was going to be a problem in early releases, then they could have focused on solving problems that can be solved with day-old data and improved ETL performance before they attempted early warning analytics.
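To put that point in pseudo-concrete terms, here is a hypothetical sketch (the use cases and thresholds are invented for illustration): the very same ETL lag passes or fails depending entirely on what you are trying to do with the data.

```python
# Hypothetical illustration: whether a given data lag is "acceptable"
# depends on the use case. Thresholds here are invented, not D2L's.
from datetime import timedelta

USE_CASE_MAX_LAG = {
    "semester_planning": timedelta(weeks=1),   # week-old data is fine
    "early_warning":     timedelta(hours=24),  # needs fresh data
}

def lag_acceptable(use_case, etl_lag):
    return etl_lag <= USE_CASE_MAX_LAG[use_case]

etl_lag = timedelta(hours=30)  # e.g., an ETL run that takes 24+ hours
print(lag_acceptable("semester_planning", etl_lag))  # True
print(lag_acceptable("early_warning", etl_lag))      # False
```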
Take all of this as a hypothetical example rather than a real criticism of D2L's actual choices. Prioritization decisions are very complex, and I know very little about the factors that led the Insights team to structure their work backlog the way that they did. The general point is that I see signs that D2L as a company tends to take an enterprise-y approach to product design, and that may have hurt them in this and other cases.
This is pretty typical of the way that big enterprise software companies approach product development; I can't say that D2L is significantly worse in this regard than everybody else. In fact, Blackboard made essentially the same prioritization decision with its own retention analytics feature set. In general, the enterprise-y approach is very good at delivering features but not nearly as good at solving customer problems.

It has been my experience in recent years that the big ed tech companies can gain a lot from the focus that consumer web and mobile tech companies have developed, drawing from Agile software engineering and Lean Startup product design methodologies. While more and more of these companies are adopting aspects of Agile, they tend to miss the fundamental change in design approach that is at the heart of the method. The basic idea is to treat your product designs as hypotheses, test them early and often—very early and very often—and focus on solving one customer problem well before you move on to another one. It's more complicated than that, and there's a lot of craft that goes into executing it competently. But it all starts with a commitment to being relentlessly focused on finding one real, specific, and significant customer problem and studying it until you understand it well enough to solve it well.

Doing this is hard for product teams in big ed tech companies, both because their sales and marketing teams pull them in lots of different directions to try to satisfy a very heterogeneous customer base, and because some of the more successful companies have product visionaries whose instincts have been good enough that they can get away without being so relentlessly empirical and don't realize what they're missing. The good news is that it is very much possible for product visionaries to adopt these new methodologies and still bring their special sauce to the design, although they may have to unlearn some habits that have previously worked for them. Fighting off the sales and marketing beast is harder, particularly in an organization that has an uneven track record of being self-aware regarding the state of its products.
Anyway, I see D2L's analytics situation as a good test case for whether the company is going to be able to listen to its new leaders and shore up its weak points. We should know more about the Insights team's ability to get the performance problems under control in the next six months. And I still believe that S3 promises to significantly advance the state of the art in early warning analytics. So, while the product team has some serious work to do in order to restore customer confidence, the situation could be turned around fairly quickly. The larger question of whether the new hires will be able to help the company's leadership develop the self-awareness they need to take the company to the next level is harder to gauge and may take more time to prove out.
Stay tuned.
Chris Munzo says
Do schools expect to pay extra for these analytics or do they expect the capability to be included in the base price of the LMS? It appears that the development effort to build them is significant.
Michael Gibson says
Seeing well designed predictive student risk models utilise a holistic dataset that incorporates data well beyond the LMS, I’m still a little confused about how the D2L offering can be deployed without a lot of customisation? Or does it simply ignore many, potentially important predictors?
We’re pursuing a ‘home grown’ model, as opposed to a ‘bought’ one (even perhaps in combination with), which we think will produce a much better result.
I wonder how others feel about this question?
Mick