I often get asked what themes I see coming out of EDUCAUSE. The last few years, the theme has been “there is no theme,” perhaps reflecting the fact that ed tech hype has been in remission. With the exception of the OPM product category, whose hype cycle from zero to peak to trough may have been the fastest we’ve seen yet, the market seems tired of the endlessly repeating roller coaster ride of hope and disappointment.
This year was a little different. Overall, the conference was the healthiest that I’ve seen it in a while. Attendance seemed to be up. The balance between IT and learning presentations seemed good. There were more vendors on the floor, with more big booths (but no magicians, jugglers, or sword swallowers, thankfully). Overall, the organization seems to be undergoing a revitalization under John O’Brien.
Within that big picture, there was some color. The major textbook publishers had very little presence, perhaps reflecting their current financial state or perhaps indicating that they've figured out the EDUCAUSE crowd isn't really their target audience, for the most part. On the other hand, ERP/SIS vendors were back with bigger booths, including some names I didn't recognize (like Unit4, which apparently has been around for a while but which I hadn't heard of before this year).
There was also a proliferation of small vendors with various flavors of solutions aimed at improving student retention and graduation rates. Each one had its own take on the problem. One had an integration dashboard, pulling in data from the LMS, SIS, and other applications to tell advisors when students might be struggling and need encouragement. Another took a similar approach but focused on giving students nudges directly. Yet another focused on engaging students on social media. And so on.
Whenever I saw one of these companies, I tried an experiment. I'd do a quick scan of their booth posters and video screens. Just enough to get a very rough sense of what they claimed to do. In other words, I looked about as carefully as the typical exhibition floor browser looks. Then I would approach the booth and say, "I have a rough sense of what your product does from what I see on your booth. I have one question for you: How do you know that it works?"
I got a range of different answers.
“Our customers love it.”
Bzzt. Wrong answer.
“We have dashboards that show 67 points where students could be getting bogged down.”
“OK, but how do you know that it works? I mean, have you tested it to see if it actually impacts outcomes?”
“…Well…we have a white paper.”
Pass.
“We have a customer who improved their retention by 60%! But to be truthful, our product was one of a number of initiatives they put into place, so I don’t know how much of that improvement they would attribute to it.”
Close, but no cigar. Points for honesty, though.
“We have conducted multiple randomized controlled trials.”
“Oh, really! What result were you testing against?”
“Oh, right, of course. Year-to-year retention…”
“Thanks, you can stop there for now. May I please have your business card?”
I found one—just one—vendor who appears to be able to deliver the goods on evidence. I wouldn’t have picked up on that from my cursory scan of their booth. They didn’t look any different than any of the others I had spoken to. In fact, if you had asked me which of the products I looked at would have been likely to prove out as effective, I wouldn’t have ranked them near the top of the list. The trick that they advertise most heavily, which is that they repurpose existing advertising mechanisms on popular social media platforms to deliver productive nudges, seems clever but a little disconnected from problems specific to retention. While they mention “behavioral science” on their web site, they don’t provide a lot of detail. That said, if you read their marketing copy carefully, you’ll see that they do provide a rather stripped-down, easy-to-follow explanation of a randomized controlled trial that they conducted.
The name of that company, by the way, is Motimatic. Have you heard of them? Because I haven’t. I’m going to look into them and find out more. Maybe you should too.
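For readers who want a concrete sense of what "testing against year-to-year retention" actually involves, here is a minimal sketch of the standard analysis behind that kind of RCT claim: a two-proportion z-test comparing retention in a randomly assigned nudged group against a control group. To be clear, this is my own illustration, not Motimatic's methodology, and every number in it is made up.

# Hypothetical illustration of an RCT analysis on year-to-year retention.
# All counts below are invented for the example; they are not any vendor's data.
from math import sqrt
from statistics import NormalDist

# Students randomly assigned to receive nudges (treatment) or not (control).
treated_retained, treated_n = 430, 1000   # hypothetical counts
control_retained, control_n = 390, 1000   # hypothetical counts

p_t = treated_retained / treated_n  # treatment retention rate
p_c = control_retained / control_n  # control retention rate

# Two-proportion z-test under the null hypothesis of no effect,
# using the pooled retention rate for the standard error.
p_pool = (treated_retained + control_retained) / (treated_n + control_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / treated_n + 1 / control_n))
z = (p_t - p_c) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

print(f"treatment retention: {p_t:.1%}, control retention: {p_c:.1%}")
print(f"difference: {p_t - p_c:+.1%}, z = {z:.2f}, p = {p_value:.3f}")

The point of the sketch is simply that random assignment plus a pre-specified outcome (here, whether a student is retained the following year) is what separates "we have evidence" from "our customers love it." A dashboard or a white paper, by itself, answers neither question.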
Right now, higher education has very poor signals to quickly distinguish between those vendors who can prove that their products are effective and those that can’t. Between those vendors who are contributing to the general state of knowledge about how to improve student success and those that aren’t. Until we can improve the signal-to-noise ratio, we are doomed to ride the roller coaster from hell that is the hype cycle until we pass out from exhaustion.
I’m going to do something about improving the signal-to-noise ratio, and I will be announcing what that something is within the next week.
Stay tuned.
Gavin Henrick says
Over the years, I have seen some challenges for student success / retention approaches. The first is getting ethical or privacy approval to do decent testing.
The second challenge I have seen with most is getting access to data from multiple different systems needed to have a better understanding.
The third challenge is that the research needs a longitudinal basis to test some of the benefits, so it takes time.
These can be challenges for a number of different ed tech solutions, and they're especially tough for startups / NewCos.
Wayne Parkins says
“We have a customer who improved their retention by 60%! But to be truthful, our product was one of a number of initiatives they put into place, so I don’t know how much of that improvement they would attribute to it.”
As you stated, loved the honesty. That sentence could be from any of the so-called established retention players (Civitas, Starfish, et al.).