In a recent post, I wrote about my experience in the EDUCAUSE exhibition hall looking for vendors with proof that their products actually help students, and how this is an example of a more general problem:
Right now, higher education has very poor signals to quickly distinguish between those vendors who can prove that their products are effective and those that can’t. Between those vendors who are contributing to the general state of knowledge about how to improve student success and those that aren’t. Until we can improve the signal-to-noise ratio, we are doomed to ride the roller coaster from hell that is the hype cycle until we pass out from exhaustion.
I’m going to do something about improving the signal-to-noise ratio, and I will be announcing what that something is within the next week.
Today I am pleased to announce the Standard of Proof ongoing webinar series, which attempts to do just that. The first is scheduled for November, but the plan is to have at least one a month starting in late January, after folks have returned from winter break.
How webinar topics are selected
For many years now, I have turned down requests to facilitate paid webinars because it’s very hard to do them without sliding down a slippery slope and either appearing to be a shill or actually crossing over the line and becoming one. And yet, I would like to give webinars that highlight genuinely good work, and I would like to get paid for my time so that I can afford to do this work.
The Empirical Educator Project (EEP) has given me a novel way to thread the needle. Vendors don’t pay me directly for a webinar. Instead, they sponsor EEP. I vet prospective EEP sponsors—they have to have something to contribute beyond just money—and I also price sponsorship to incentivize active participation and contribution. Standard of Proof webinars are now another incentive for vendors to contribute.
A Standard of Proof webinar is not an automatic benefit of EEP sponsorship. Each webinar is about not a company but a contribution to the work of empirical education. It can be, and often is, novel research conducted by the company that (a) contributes to our knowledge of how to help students succeed, and (b) adds to our knowledge in a way that is generalizable beyond their product or service. In other words, it is a contribution to the commons. But the contribution can also be a project that helps disseminate established research-backed practices or generally promote the culture of empirical education. A contribution should either be free if it has no maintenance costs or offered on a non-profit basis if it does have a cost to maintain.
Contributions also have to have been developed with some meaningful collaboration with or peer review from academics. As with the definition of “contribution” above, I am more interested in the spirit of meaningful collaboration than in setting up narrow rules. But there has to be an academic who was involved enough in the project that they are willing to speak publicly about their participation and their views of the project.
While EEP sponsors are not guaranteed a minimum number of webinars in return for their sponsorship, neither are they limited to a maximum number. If they have made multiple contributions and I have time to cover them in the Standard of Proof schedule, I will. If that means running more than one webinar a month, I would be happy to have that problem.
The end goal here is to help vendors that meaningfully contribute to education differentiate based on those contributions. Providing credible evidence that their solutions support student success is part of that work, but the bar is higher than that. The webinar series title, “Standard of Proof,” refers not only to the products but to the vendors. The series highlights acts of good citizenship that promote empirical education. By raising the profile of these acts and the vendors who perform them, I hope to influence the criteria by which prospective vendors are selected and therefore their incentives for behavior.
Folks, this only works if you and your institutions decide that you will prioritize good behavior—including but not limited to credible proof that a product is effective—in your vendor selection. The most that I can do is help make that good behavior a little more visible.
The first webinar
As I wrote earlier in this post, the first webinar is coming up later this month. I have two more lined up for early next year that I will be announcing at a later date, and more are in the pipeline.
Summer melt: A randomized controlled trial
We’ll be kicking off the series with a randomized controlled trial by AdmitHub and its academic partners to advance our understanding of a critical phenomenon that influences the success of first-generation students.
Folks involved with admissions at access-oriented schools may be familiar with the term “summer melt.” It’s when students are admitted to a university, say they are coming, and then never show up. And it tends to be a first-generation student problem. There are all kinds of logistical and bureaucratic hurdles in getting from admissions to first day of classes that may be harder for first-generation students, either because the processes are not designed to accommodate their situations or because they don’t have anybody in their lives who can help them. Maybe they need an immunization certification and they don’t know how to get it. Maybe they need the signature of a parent who they don’t know how to contact. This is a problem that is often caused by a number of small contributing factors, and it is hard to diagnose because the people who know what kept them from coming to college…never came to college. So having a dialogue with them after the fact is difficult to arrange (not to mention unhelpful for the student, who has already missed an opportunity).
There are a number of products on the market, particularly machine learning-enhanced chatbots, that purport to address the summer melt problem. AdmitHub is one of them. But rather than just claim that they can help, they collaborated with their major customer, Georgia State University (GSU), and an academic researcher from the University of Pittsburgh who studies summer melt, to conduct a rigorous study of the impact of the AdmitHub-enabled interventions on summer melt at GSU. The results of the study? GSU was able to reduce summer melt by 20% using AdmitHub.
In addition to providing credible evidence for the product’s effectiveness, this study’s results and experimental design both contribute to our general knowledge of summer melt. We’ll be talking about that as part of the webinar, which will include the PI, Lindsay Page from Pitt, as well as Tim Renick from GSU, who happens to be a personal hero of mine.