Carnegie Mellon University and Duke University have shared newly available free tools that will significantly lower the barriers to conducting ethical educational research. The two universities contributed the tools through e-Literate’s Empirical Educator Project (EEP), an effort to promote broader adoption of evidence-based teaching practices and foster a culture of empirical education across higher education.
As with all academic research involving human subjects, educational researchers must have their experimental designs approved by their university’s Institutional Review Board (IRB). If a researcher wants to study students or their work, they must explain how they will get the students’ informed consent to participate.
This can be a major barrier that often prevents research from being undertaken. Teaching faculty who may be interested in conducting a study may decide that the bureaucratic burden is more than they can take on. Multiple universities that want to collaborate on cross-institutional studies will have to get approval from each institution’s IRB in an environment where there are no widely adopted standards for reviewing and approving educational research by these bodies. Educational technology companies that want to be more transparent and collaborative with universities about their own research into product efficacy can find the IRB process impractically time-consuming. As a result, far less educational research gets conducted in ways that are both reviewed for ethical practices and shared as credible research that contributes to the state of the art in learning science.
Through e-Literate’s EEP, learning science researchers at Carnegie Mellon and Duke Universities discovered that each institution had developed a solution for part of this problem. Carnegie Mellon University has developed templates approved by their IRB that they estimate will accommodate approximately 80% of classroom research use cases. Meanwhile, Duke University has developed language and a process approved by their IRB for requesting and tracking informed consent from students.
The two universities have released the tools under a Creative Commons Attribution (CC-BY) license and provided “train the trainer” support for the use of their templates and protocols. Together, these contributions could enable many of the educators and product designers who are already conducting informal educational research all over the world to participate in the same sort of social fabric that has enabled communities of researchers in other human sciences to tackle problems from cancer to Alzheimer’s disease.
Jeff Young has a great piece up about this release at EdSurge, and I believe we will see something from The Chronicle in their teaching newsletter on Thursday. I'd like to give you my own take on the reasons why this is an important milestone.
It’s an example of untapped inter-institutional opportunity
Carnegie Mellon and Duke are theoretically peers. To use one very imperfect measure, both are ranked in the top 25 national universities by U.S. News & World Report. More specifically, both are ranked in the top 15 such schools for undergraduate teaching. Both are seriously concerned with improving undergraduate education through research-based practices. And the two institutions have been working on similar yet complementary efforts to make that research easier. You would think that they would have had opportunities to share their work with each other, or at least know what the other is doing.
Before EEP, they didn’t.
On closer examination, the complementarity of the two efforts suggests the kinds of opportunities that higher education is missing. Carnegie Mellon University has one of the broadest, deepest, most impressive, and most historic learning science research programs on the planet. There are only a handful of universities that are even in their league, and Stanford may be the only university that rivals them in depth and breadth. Further, the university has made a commitment in the form of the Simon Initiative to “[p]rovide accessible tools and methods with which any person or institution can adopt and advance CMU’s approach to learning engineering, improving outcomes for their own learners,” globally, in addition to improving teaching and learning at Carnegie Mellon University itself. (Note that this is distinct from but complementary to their Eberly Center for Teaching Excellence, which is similar in purpose to the centers of teaching and learning at many universities.) To get a flavor for who they are and how they think, here’s a playlist of three e-Literate TV videos that highlight a few of their faculty members:
Now, Carnegie Mellon is known more generally for its engineering prowess, so it’s no surprise that the folks at the Simon Initiative talk about “learning engineering.” (The initiative is named after the late Herb Simon, a cognitive scientist and CMU luminary who coined that term.) That mentality, as well as the orientation of an institution known for its learning science research, shows in their contribution. They studied the IRB applications submitted for educational research by their faculty (not just their learning science faculty, but all faculty interested in publishing their research projects), identified common traits, and developed a template that they estimate covers somewhere in the neighborhood of 80% of those projects. They then had their IRB review and approve the template. Now, any educator at CMU who wants to do publishable educational research and can use the template gets their IRB application fast-tracked.
Without taking anything away from Duke’s own learning science research capabilities, one of the things that Duke is among the very best in the world at is making their undergraduates’ educational experience life-changing. As one example, the university has a large number of “professors of practice.” In many big research universities, a tenure track is a kind of death trap. Faculty are worked to the bone for three to five years in the hopes of achieving tenure when, in reality, most of them will be sent packing in the end. And while they are on that treadmill, they have a disincentive to invest too heavily in their teaching lest they neglect the publications and grants that will increase their odds of not being shown the door. That’s not how Duke rolls. Rather than exploiting young faculty by dangling a carrot that will forever be out of reach, they offer many a professor of practice position. Such faculty get basically everything except tenure, including long-term employment, a decent salary scale, and a say in shared governance. But they must not only show that they are excellent educators but also conduct research in education. In effect, rather than “physics” or “art history” being their discipline, it’s “physics education” or “art history education.” I don’t know whether Duke or their faculty would put it quite this way, but I intend it to be a compliment. The point is that they incentivize faculty to become disciplinary experts in supporting their students through evidence-based practices. I have been to one of their teaching and learning conferences and interacted with these faculty members. It was heavily attended and the atmosphere was electric. I have been to many of these kinds of events; yet I have rarely seen faculty who were more actively engaged in asking good, probing questions about teaching practice than I did at Duke.
This different context from CMU’s has resulted in a slightly different approach to the same problem. Like everyone else, Duke’s educators who want to publish their research have to go through IRB approval. And given that Duke has a world-class medical research program, their IRB is both very tough and very focused on privacy concerns. So their Learning Innovation center developed an informed consent tool called WALTer, whose name stands for “We Are Learning Too.” WALTer sits right inside the LMS for any course in which research is being conducted. Faculty who want to conduct such research are led through a decision tree that produces IRB-approved language for informed consent, based on the conditions of the experiment. Having this form in place helps the instructor get faster approval of the project. And once that approval is in place, WALTer helps ensure that students are given the opportunity to provide their informed consent (or not). Since WALTer launched, the Learning Innovation team has seen an increase in the number of faculty expressing interest in conducting educational research and a decrease in the time it takes educational research applications to be approved by Duke’s IRB.
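To make the decision-tree idea concrete, here is a minimal sketch of how a tool like WALTer might assemble consent language from a couple of yes/no questions. To be clear, the questions, branch structure, and consent clauses below are all invented for illustration; the actual WALTer workflow and its IRB-approved language are Duke's, not mine.

```python
# Hypothetical sketch of a WALTer-style decision tree. All clause text and
# question names here are assumptions for illustration, not Duke's actual
# IRB-approved language.

CONSENT_CLAUSES = {
    "anonymized_only": (
        "Your coursework may be analyzed in anonymized form for research "
        "on teaching and learning."
    ),
    "identifiable": (
        "Your coursework, including identifiable records, may be analyzed "
        "for research on teaching and learning. You may opt out at any time."
    ),
    "external_sharing": (
        "De-identified data may be shared with researchers at other "
        "institutions."
    ),
}

def build_consent_text(uses_identifiable_data: bool,
                       shares_outside_institution: bool) -> str:
    """Walk a two-question decision tree and assemble consent language."""
    clauses = []
    # First branch: does the study touch identifiable student records?
    if uses_identifiable_data:
        clauses.append(CONSENT_CLAUSES["identifiable"])
    else:
        clauses.append(CONSENT_CLAUSES["anonymized_only"])
    # Second branch: will de-identified data leave the institution?
    if shares_outside_institution:
        clauses.append(CONSENT_CLAUSES["external_sharing"])
    return " ".join(clauses)
```

The point of the design is that every path through the tree ends in pre-approved language, which is why the IRB can fast-track applications that use it.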
Peanut butter, meet chocolate. CMU’s IRB template and Duke’s informed consent template are completely complementary. But before EEP, the peanut butter was in the fridge and the chocolate was in the drawer. They just didn’t meet.
It’s an example of untapped intra-institutional opportunity
Duke's finding that faculty interest in educational research is up and that approvals are coming faster is illustrative of a larger point. The current institutional structures and processes of colleges and universities are not designed to facilitate the development, testing, sharing, and adoption of evidence-backed practices that improve student success. The IRB process is just one (very painful) example. The disincentive for faculty on the tenure treadmill is another. There are many more.
The question that I have heard asked ad nauseam for years and years now, from a wide range of people, is “How can we make faculty care more about teaching?” That is the wrong question on every level. A better question would be, “How can we design universities so that excelling at supporting student success is less painful and more rewarding?”
Consider just about every hot ed tech trend you can name. Retention early warning analytics. Adaptive learning. Competency-based education. Stackable credentials. MOOCs. All this activity (and money) is swirling around the problem of making it easier for students to learn and succeed at school. Now think about all the hot trends that focus on making it practical and rewarding for faculty to focus their energy and considerable intellectual talents on solving the same problem. Can you name one?
An IRB form for educational research may sound like a small and boring thing, but it is a piece of cultural and institutional infrastructure that makes it a little more practical for motivated faculty to focus energy and intellectual talent on learning how to better support student success. “We Are Learning Too” indeed. This is the kind of work that will make college education better and, in the process, make all of those ed tech tools more useful. A laser scalpel doesn’t do much to promote health unless it’s wielded by a physician who knows how, when, and why to use it.1 Without that knowledge, it risks doing more harm than good.
It’s an example of untapped multi-institutional and institution/vendor collaboration opportunity
So yay for Carnegie Mellon and Duke. What about everyone else?
Well, for starters, they have both contributed their language under a Creative Commons license. (Duke’s WALTer tool is built using a third-party proprietary platform, so contributing the source code wasn’t an option. But somebody else could easily build a tool that supports the workflow in the release document.) So there’s that.
IRBs are notoriously idiosyncratic. (Some might say arbitrary. I’m not saying that. But some might.) So you could see an IRB at an institution that’s very different from CMU or Duke having something like the following argument:
IRB Member #1: Hey, CMU and Duke are super-rigorous research institutions that are way more focused on these sorts of things than we are. If it’s good enough for them, it should be good enough for us.
IRB Member #2: Actually, exactly because CMU and Duke are super-rigorous research institutions that are way more focused on these sorts of things than we are, what makes you think that what makes sense for them will also make sense for us?
These are both reasonable starting positions. The conversation that should flow from this is an examination of the templates in which an IRB member who wants to change a part of them will need to provide a justification for doing so. This is exactly the next step we want to foster for this project, preferably at multiple institutions. Where we’d like to end up is at a toolkit in which an institution of any type can look at variations—and justifications for those variations—provided by peer institutions so that they can adapt and adopt them. We’d also like to supplement what we already have with some more fleshed-out student data privacy guidelines that are, once again, education-appropriate. Data privacy in this sort of research is just as important as it is in, say, medical research. But the specific concerns and methods for dealing with them aren’t necessarily identical. We should be developing a sector-wide consensus on ethical practices that can augment what is already in the Duke and CMU IRB contributions.
This would hopefully help to lower one barrier to conducting educational research everywhere. But it could potentially do much more than that. Under the law, if researchers want to conduct a multi-institutional research project, then each institution’s IRB must approve the project. And since no two IRBs use the same standards and most don’t have particular guidelines for educational research, doing research across two institutions is more than twice as hard. Three is more than three times as hard. Doing large-scale, multi-institution research quickly becomes impossible in most scenarios. Unless you’re a vendor, in which case it is trivial, because you are not required to go through IRB for your own research—as long as you don’t publish it in a journal. So vendors can conduct research on students in multiple institutions for their own proprietary purposes easily, but if they want to do the right thing by going through IRB approval and sharing what they’re learning through peer-reviewed journals, it’s way, way harder.
Imagine if it were easier for everyone to do the right thing and submit an application for IRB approval, knowing that they will be going through a streamlined but academically validated process. Imagine the kind of research opportunities that could open up. Now imagine further if there were some technology infrastructure behind this. Imagine if we could track IRB approval and informed consent across institutions, gather appropriately anonymized student data, and share it among researchers in a repository that is designed to respect the student privacy constraints dictated by the approved IRB applications while giving more researchers access to more research-relevant data—including data that are gathered through student interactions inside vendor tools. Technologically, this is quite practical. The hard part is getting the policy infrastructure solid and widely adopted.
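As a rough illustration of what that policy-plus-technology infrastructure might track, here is a small sketch of a cross-institutional registry that releases student data only when both the approved IRB protocol and the student's recorded consent permit it. Every class name and field below is an assumption of mine; no such shared system exists today.

```python
# Illustrative sketch of cross-institutional IRB/consent record-keeping.
# All names and fields are hypothetical, not a description of a real system.

from dataclasses import dataclass, field

@dataclass
class StudyApproval:
    """One institution's IRB approval of one study."""
    study_id: str
    institution: str
    irb_protocol: str              # e.g. which approved template variant was used
    allows_external_sharing: bool  # did the IRB permit sharing de-identified data?

@dataclass
class ConsentRegistry:
    """Tracks which (pseudonymous) students consented to which studies."""
    # Maps (study_id, student_pseudonym) -> consented?
    consents: dict = field(default_factory=dict)

    def record(self, study_id: str, student_pseudonym: str, consented: bool) -> None:
        self.consents[(study_id, student_pseudonym)] = consented

    def may_share(self, approval: StudyApproval, student_pseudonym: str) -> bool:
        # Data leaves the repository only if the IRB approval permits sharing
        # AND the student affirmatively consented; absence of a record means no.
        return (approval.allows_external_sharing
                and self.consents.get((approval.study_id, student_pseudonym), False))
```

The design choice worth noticing is the conjunction in `may_share`: neither institutional approval nor student consent alone is sufficient, which mirrors how the approved IRB application's privacy constraints would have to govern any shared repository.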
This is what EEP is about. The problem isn’t that higher education is failing to innovate or that professors don’t care about teaching well. The problem is that we are flushing 99% of the existing efforts, potential opportunities, and good intentions down the toilet because we don’t have the cultural institutions and social infrastructure to support and sustain them. But that can change.
Kudos and thanks to the good folks at CMU and Duke. We have more folks working in EEP—from a diverse range of institutions—on a wide range of projects. And we’re still learning how to work together effectively. Expect more to come.
1. Is a “laser scalpel” a thing? I may have just made that up. Anyway, you get the point. [↩]