Update 5/17/16: I made a mistake in my math on the UW Milwaukee improvements. The number of A’s and B’s increased 220% (not 163%, as I originally wrote) for “unprepared” students and 170% for “prepared” students. I apologize for the error.
Update 7/23: Read this blog post for D2L’s admission of mistakes and changes to its claims.
At this point I’d say that we have established a pattern of behavior.
Michael and I have been quite critical of D2L and its pattern of misleading marketing, which is harmful to the ed tech community. Michael put it best:
I can’t remember the last time I read one of D2L’s announcements without rolling my eyes. I used to have respect for the company, but now I have to make a conscious effort not to dismiss any of their pronouncements out-of-hand. Not because I think it’s impossible that they might be doing good work, but because they force me to dive into a mountain of horseshit in the hopes of finding a nugget of gold at the bottom. Every. Single. Time. I’m not sure how much of the problem is that they have decided that they need to be disingenuous because they are under threat from Instructure or under pressure from investors and how much of it is that they are genuinely deluding themselves. Sadly, there have been some signs that at least part of the problem is the latter situation, which is a lot harder to fix. But there is also a fundamental dishonesty in the way that these statistics have been presented.
Well, here’s the latest. John Baker put out a blog post titled This Isn’t Your Dad’s Distance Learning Program with this theme:
But rather than talking about products, I think it’s important to talk about principles. I believe that if we’re going to use education technology to close the attainment gap, it has to deliver results. That — as pragmatic as it is — is the main guiding principle.
The link about “deliver results” leads to this page (excerpted as it existed prior to June 30th, for reasons that will become apparent).
Why Brightspace? Results.
So the stage is set – use ed tech to deliver results, and Brightspace (D2L’s learning platform, or LMS) delivers results. Now we come to the proof, including these two examples.
According to California State University-Long Beach, retention has improved 6% year-over-year since they adopted Brightspace.[snip]
University of Wisconsin-Milwaukee reported an increase in the number of students getting A’s and B’s in Brightspace-powered courses by over 170%
Great results, no? Let’s check the sources. Ah . . . clever marketing folks – no supporting data or even hyperlinks to learn more. Let’s just accept their claims and move along.
. . .
OK, that was a joke.
CSU Long Beach
I contacted CSU Long Beach to learn more, but I could find no one who knew where this data came from or even that D2L was making this claim. I shared the links and context, and they went off to explore. Today I received a message saying that the issue has been resolved, but that CSU Long Beach would make no public statements on the matter. Fair enough – the observations below are my own.
If you look at that Results page now, the CSU Long Beach claim is no longer there – down the memory hole[1] with no explanation, replaced by a new claim about Mohawk College.
While CSU Long Beach would not comment further on the situation, there are only two plausible explanations for the issue being resolved by D2L taking down the data. Either D2L was using legitimate data that they were not authorized to use (best case scenario) or D2L was using data that doesn’t really exist. I could speculate further, but the onus should be on D2L since they are the ones who made the claim.
UW Milwaukee
I also contacted UW Milwaukee to learn more, and I believe the data in question is from the U-Pace program, which has been fully documented.[2][3]
The U-Pace instructional approach combines self-paced, mastery-based learning with instructor-initiated Amplified Assistance in an online environment.
The control group was taught traditionally (read that as large lecture classes) in Intro to Psychology.
From the EDUCAUSE Quarterly article on U-Pace, the number of A’s and B’s increased 220% (corrected from the 163% originally cited here) for “unprepared” students and 170% for “prepared” students. This is the closest data I can find to back up D2L’s claim of a 170% increase.
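For clarity, here is how this kind of relative increase is computed. The figures below are hypothetical placeholders chosen only to illustrate the arithmetic – the actual counts are in the EDUCAUSE Quarterly article.

```latex
% Percent increase in the number of A's and B's:
%   increase = (treatment - control) / control * 100%
%
% Hypothetical illustration (not the actual U-Pace figures):
% if 12.5% of conventionally taught "unprepared" students earned
% an A or B, versus 40% of U-Pace students, then
\[
  \frac{40\% - 12.5\%}{12.5\%} \times 100\% = 220\%
\]
% Note: this is a 220% relative increase, not "220% of students" --
% conflating relative change with the raw percentage is an easy
% way to get this math wrong.
```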
There are three immediate problems here (setting aside the exact size of the improvement, which the update above corrects).
Update 7/23: D2L has added a case study showing previous results from U-Pace with 179% improvements.
- First, the data claim is missing the context of “for underprepared students,” who exhibited much higher gains than prepared students. That’s a great result for the U-Pace program, but it is important context to include.
- Second, the program is an instructional change: moving from large lecture classes to a self-paced, mastery-learning approach. That is the intervention, not the use of the LMS. In fact, D2L was the LMS used in both the control group and the U-Pace treatment group.
- Third, the program goes out of its way to call out the minimal technology needed to adopt the approach, even listing Blackboard, Desire2Learn, and Moodle as examples of LMSs that meet its conditions.
This is an instructional approach that claims to be LMS-neutral, with D2L’s Brightspace used in both the control group and the treatment group – yet D2L positions the results as proof that Brightspace gets results! It’s wonderful that the Brightspace LMS worked during the test and did not get in the way, but that is a far cry from Brightspace “delivering results”.
The Pattern
We now have to add these two cases to the Lone Star College and LeaP examples. In all of them, the pattern is the same:
- D2L makes a marketing claim implying that its LMS, Brightspace, delivers results, referring to academic outcomes data without any supporting data or references.
- I contact the school or research group to learn more.
- The data turn out to be either misleading (the treatment is not LMS usage but an instructional approach, adaptive learning technology, or student support software) or just plain wrong (and subsequently taken down).
- In all cases, the results could have been presented honestly – with appropriate context, links for further reading, and an explanation of the LMS’s role. They were not.
- An e-Literate blog post almost writes itself.
- D2L moves on to make its next claim, with no explanation.
I understand that other ed tech vendors make marketing claims that cannot always be tied to reality, but these examples cross a line. They misuse and misrepresent academic outcomes data – whether based on public research or internal research – and essentially take credit for their technology “delivering results”.
This is the misuse of someone else’s data for corporate gain. Institutional data. Student data. That is far different from using overly positive descriptions of your own data or subjective observations. That is wrong.
The Offer
For D2L company officials, I have an offer.
- If you have answers or even corrections about these issues, please let us know through your own blog post or in the comments on this blog.
- If you find any mistakes in my analysis, I will write a correction post.
- We are happy to publish any reply you make here on e-Literate.
1. Their web page does not allow archiving with the Wayback Machine, but I captured screenshots in anticipation of this move. (A short sketch for checking a page’s archive status follows these notes.) [↩]
2. Note – While I assume this claim derives from U-Pace, I am not sure. It is the closest example of real data that I could find, thanks to a helpful tip from UW-M staff. I’ll give D2L the benefit of the doubt despite their lack of reference. [↩]
3. And really, D2L marketing staff should learn how to link to external sources. It’s good Internet practice. [↩]
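A practical aside on footnote 1: for readers who want to check for themselves whether a page has been archived, here is a minimal sketch using the Internet Archive’s public availability API. The target URL below is a hypothetical example, not necessarily D2L’s actual Results page address.

```python
import json
import urllib.parse
import urllib.request

def wayback_snapshot(url):
    """Return the closest archived Wayback Machine snapshot of `url`,
    or None if the page has never been archived (or blocks archiving)."""
    api = ("https://archive.org/wayback/available?url="
           + urllib.parse.quote(url, safe=""))
    with urllib.request.urlopen(api) as resp:
        data = json.load(resp)
    # The API returns {"archived_snapshots": {}} when nothing is archived.
    return data.get("archived_snapshots", {}).get("closest")

# Hypothetical example URL -- substitute the page you want to check.
snap = wayback_snapshot("https://www.d2l.com/results/")
print(snap["url"] if snap else "No snapshot archived")
```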
Andrew McCann says
I’ve always cared deeply about integrity in data analysis. I think the standards are pretty low…even though you couldn’t pick a better leading indicator of critical thinking ability (and the basics of a good college-level education) than parsing this sort of argument.
I left General Electric (in my 20s) partly from frustration that senior management had little ability in this area (Six Sigma etc). Some chart would tick upwards (good!) for one week and happy emails would make the rounds…a downward tick for a week would result in investigative tribunals. Neither up nor down movement was likely statistically relevant.
I’ve also felt the tug of subtly biasing data…and tried to resist.
The empty claims made by D2L and others have little impact, I hope. But they do validate sloppy analysis.
I’m equally concerned with the many, many conference presentations I’ve sat through where presenters make claims based on excruciatingly small sample sizes or shoddy statistics – the kinds of analysis that a college senior should knock out of the park, never mind a PhD.
I’d like to see edtech journalists step up the criticism of this behavior – across the board.
It’s a self-serving request, since such scrutiny will only help to highlight those of us (companies and educators) attempting rigor in our efficacy work. Anyone attending a regional accreditation conference or an assessment track at BbWorld or Sloan or Campus Tech or AACSB will find plenty of material.