Paul Fain has written a really good, nuanced article at IHE covering the update that Essex County College gave of their developmental math adaptive learning pilot at a recent conference in Washington, DC. We did a two-part case study on ECC in our e-Literate TV series. The headline results are as follows:
- In the first year, the pass rate was worse than in the traditional classes. (The first semester was “disastrous.”)
- This year—the second year—the pass rate is coming closer to the traditional class but is still underperforming.
- The article seems to imply that students who earn a C in the personalized learning class do better than students who earn a C in the traditional class, but the article is not explicit about that.
There is no magic pill. As Phil and I have been saying all along—most recently in my last post, which mentioned ECC’s use of adaptive learning—the software is, at best, an enabler. It’s the work that the students and teachers do around the software that makes the difference. Or not. In ECC’s case, they are trying to implement a pretty radical change in pedagogy with an at-risk population. It’s worth digging into the details.
Let’s start by reviewing the basics of their situation:
- ECC has a 50% pass rate in their lowest-level developmental math class, and a 50% pass rate in the next developmental math class up. Since a substantial majority of ECC students place into developmental math, a big part of ECC’s college completion problem can be traced to students failing developmental math.
- ECC believes that a big reason they have a high failure rate is that students come into that class with an incredibly wide range of prior skills and knowledge—wide enough that a traditional lecture-based class would not address the needs of a majority of the students.
- They decided to try a radical change in the way the developmental math course was structured.
- Students would work self-paced on a mastery learning curriculum in labs using McGraw Hill’s ALEKS adaptive learning software. Students could ask each other or the roving instructor for help.
- Students also met with a teacher each week, separately from the lab sessions, to report their progress of the week, assess the success or failure of their learning strategies, and set new strategies and goals for the next week.
So why does ECC think that they are not getting the results that they hoped for? Doug Walcerz, ECC’s Vice President for Planning, Research, and Assessment, offered a few observations. From the article:
- “[A]daptive courses provide less ‘accountability.’ That’s because students move through content at different paces and it’s harder to make sure they master concepts by a certain point. ‘There is no classwide mile post.’”
- “[T]he college leaned heavily on graduate students from nearby Rutgers University at Newark and the New Jersey Institute of Technology to teach parts of the adaptive courses during the first year.”
- “‘We underestimated the skill that you would need as a teacher to deliver that content,’ he said.”
- “Faculty buy-in has also been a challenge. In adaptive courses, instructors do not give lectures or teach in the traditional format. Instead, they circulate among students who are working on computer-based courseware, offering help when needed, much like tutors. That feels like a job ‘below faculty status’ for some instructors, Walcerz said.”
Putting this all together, here is what I see:
- ECC is starting with an at-risk population, a large portion of which probably has not been taught good meta-cognitive skills or help-seeking behaviors.
- They are putting those students into a curriculum which, whatever its other virtues may be, puts a higher demand on those meta-cognitive and help-seeking behaviors than a traditional class would.
- The burden of addressing that weakness in the course design falls on the faculty. But ECC has been working with untrained and inexperienced adjuncts—in fact, graduate students—as well as some faculty who may be hostile to the project. (ECC has since moved away from using graduate students, according to the article.)
There may or may not also be problems with the software. For what it’s worth, Walcerz seems to think highly of the software and doesn’t believe that it is contributing to the poor results. Personally, I think the problems with the match between the student skills and the course design are sufficient to explain the poor results. The kind of burden that a self-paced program like this puts on these students is somewhat analogous to the burden that an online course puts on them. We know that the type of population that would be enrolled in a developmental math course in a community college in Newark, NJ typically does not do well in online courses. The difference is that, in ECC’s design, there actually are faculty there to intervene and coach the students personally. It stands to reason that the quality of that coaching would be a critical success factor.
Does this mean that ECC’s approach was a bad idea? I don’t think so. Differentiated instruction is a logical pedagogical response to a heterogeneous class problem. But it can only work in their environment if they have appropriately skilled, trained, and motivated faculty. ECC made substantial investments in software and facilities, but this result highlights the fact that the critical success factor in many cases will be a comparably substantial investment in faculty: appropriate professional development along with a motivating compensation and promotion plan. It sounds like they have come to realize that and are taking some steps in that direction.
Truly effective innovation in education is hard. As Phil likes to stress, it takes both brutal honesty regarding the results and a commitment to iterate when the results are, almost inevitably, not what we hoped for on the first try. A while back, I blogged about an interesting case study at MSU where they did exactly that with a psychology class. If you read the comments thread in the follow-on post, you’ll see that Mike Caulfield brought up a potentially new insight that the course’s DFW pattern may be related to interactions between the course’s absence policy and the blended format. Course problems (and course successes) can be subtle and hard to tease out.
There. Is. No. Magic. Pill.