In my post last week on the IMS Global Consortium conference #LILI15, I suggested that LMS usage in aggregate has not improved academic performance and noted that John Baker from D2L disagreed.
He cited internal data showing retention improvements of 25% or more (I can’t remember the exact figure) when clients “pick the right LMS”. After the panel, John acknowledged the correlation / causation issue, but I’d love to see the data backing up this and other claims.
After the conference I did some checking based on prompts from some helpful readers, and I’m fairly certain that John’s comments referred to Lone Star College – University Park (LSC-UP) and its 24% increase in retention. D2L has been pushing this story recently, first in a blog post and then in a paid webinar hosted by Inside Higher Ed. From the blog post titled “Can an LMS improve retention?” [footnotes and emphasis in original]:
Can an LMS help schools go beyond simply managing learning to actually improving it?
Pioneering institutions like Lone Star College-University Park and Oral Roberts University are using the Brightspace platform to leverage learner performance data in ways that help guide instruction. Now, they’re able to provide students with more personalized opportunities to master content and build self-confidence. The results of their student-centered approach have been nothing short of amazing: For students coming in with zero credits, Lone Star estimates that persistence rates increased 19% between spring 2014 and fall 2014 and Oral Roberts University estimates a persistence rate of 75.5% for online programs, which is an all-time high.
Then in the subsequent IHE webinar page [emphasis added]:
The results have been nothing short of amazing. Lone Star has experienced a 19% increase in persistence and Oral Roberts University has achieved a 75.5% persistence rate for online programs—an all-time high. Foundational to these impressive results is Brightspace by D2L—the world’s first Integrated Learning Platform (ILP)— which has moved far beyond the traditional LMS that, for years, has been focused on simply managing learning instead of improving it.
Then from page 68 of the webinar slides, as presented by LSC-UP president Shah Ardalan:
By partnering with D2L, using the nationally acclaimed ECPS, the Bill & Melinda Gates Foundation, and students who want to innovate, LSC-UP increased retention by 24% after the pilot of 2,000 students was complete.
ECPS and the Pilot
For now let’s ignore the differences among 19%, 24%, and my misremembered 25%. I’d take any of those results as institutional evidence of (the right) LMS usage “moving the needle” and improving results [1]. This description of ECPS got my attention, so I did some more research on ECPS:
The Education and Career Positioning System is a suite of leading web and mobile applications that allow individuals to own, design, and create their education-to-career choices and pathways. The ability to own, design, and create a personal experience is accomplished by accessing, combining and aggregating lifelong personal info, educational records, career knowledge, and labor statistics …
I also called the LSC-UP Invitation to Innovate program office to understand the pilot. ECPS is an advising and support system created by LSC-UP, and the pilot was partially funded by the Gates Foundation’s Integrated Planning and Advising Services (IPAS) program. The idea is that students do better when they understand their career choices and academic pathways up front rather than being faced with a broad set of options. LSC-UP integrated ECPS into a required course that all entering freshmen (not transfers) take. Students used ECPS to identify their skills, explore careers, see what those careers would require, etc. While there is no published report, LSC-UP reports a term-to-term persistence increase of 19+% between Spring 2014 and Fall 2014. Quite interesting and encouraging, and kudos to everyone involved. You can find more background on ECPS here.
In the meantime, Lone Star College (the entire system of 92,000+ students) selected D2L and is now using Brightspace as its LMS; however, the ECPS pilot had little to do with LMS usage. The primary intervention was an advising system and course redesign to focus students on understanding career options and related academic pathways.
The Problem Is Marketing, Not Product
To be fair, what if D2L enabled LSC-UP to do the pilot in the first place through some unique platform or integration capability? There are two problems with this possible explanation:
- ECPS follows IMS standards (LTI), meaning that any major LMS could have integrated with it; and
- ECPS was not even integrated with D2L during the pilot.
That’s right – D2L is taking a program where there is no evidence that LMS usage was a primary intervention and using the results to market, strongly suggesting that using their LMS can “help schools go beyond simply managing learning to actually improving it”. There is no evidence presented [2] of D2L’s LMS being “foundational” – it simply happened to be the LMS in place during a pilot that centered on ECPS usage.
To be clear, D2L should rightly be proud of its selection as the Lone Star LMS, and from all appearances the usage of D2L is working for the school. At the very least, D2L is not getting in the way of successful pilots. It’s great to see D2L highlight the excellent work by LSC-UP and its ECPS application, as it recently did in another D2L blog post extensively quoting Shah Ardalan:
Lone Star College-University Park’s incoming students are now leveraging ECPS to understand their future career path. This broadens the students’ view, allows them to share and discuss with family and friends, and takes their conversation with the academic and career advisors to a whole new level. “Data analytics and this form of ‘intentional advising’ has become part of our culture,” says Ardalan. “Because the students who really need our help aren’t necessarily the ones who call, this empowers them to make better decisions” he adds.
LSC-UP is also planning to start using D2L’s analytics package, Insights, and D2L may eventually get to the point where it can legitimately take credit for improving performance.
The problem is misleading marketing. I say misleading because D2L and LSC-UP never come out and say “D2L usage increased retention”. They achieve their goal through clever framing: the stated topic is whether D2L and its LMS can increase performance, and then they share the Lone Star success story. The reader or listener has to read the fine print or do additional research to understand the details, and most people will not.
The higher ed market deserves better.
I Maintain My Position From Conference Panel
After doing this research, I still stand by my statement at the IMS panel and in my blog post.
I answered another question by saying that the LMS, with multiple billions invested over 17+ years, has not “moved the needle” on improving educational results. I see the value in providing a necessary academic infrastructure that can enable real gains in select programs or with new tools (e.g. adaptive software for remedial math, competency-based education for working adults), but the best the LMS itself can do is get out of the way – do its job quietly, freeing up faculty time, giving students anytime access to course materials and feedback. In aggregate, I have not seen real academic improvements directly tied to the LMS.
I’m still open to looking at programs that contradict my view, but the D2L claim from Lone Star doesn’t work.
1. Although my comments refer to improvements in aggregate, going beyond pilots at individual schools, this claim would nonetheless be impressive.
2. Evidence is based on blog posts, the webinar, and articles, as well as an interview with LSC-UP staff; if D2L can produce evidence supporting their claim, I will share it here.