For starters, I got more clarity on the $1 billion technology investment that we keep hearing about. That’s not just for their LMS. It’s basically their entire learning- and learner-focused technology portfolio. If you think about the strong shift to online education that’s been happening in the for-profit education sector and the fact that Apollo serves over 300,000 students, it makes sense that they would need to make a massive investment in modernizing their IT infrastructure, including the LMS, but also their registrar software, their student/customer tracking software, their data centers—pretty much everything from soup to nuts. A billion dollars is still an impressive amount of money for a company to invest in just about anything, but to put it into perspective, Apollo makes about $4 billion in annual revenues. So an investment of 25% of that over a few years to upgrade their entire mission-critical IT infrastructure sounds forward-looking but reasonable.
We also talked a little bit about Apollo’s learning analytics. Wrubel identified a number of different layers. The first layer is their persistence and retention early warning system. This is the kind of learning analytics that is most widely adopted and well understood. (I have written about Purdue’s Course Signals as an example of this kind of system.) The second layer is cohort matching. Apollo places students into twelve- to eighteen-person learning cohorts. As you might imagine, getting the right mix of students in terms of their skills and abilities can be critical to the success of the cohort. The company has invested in analytics that help them put the right groups together. (By the way, this is one frontier that I think the MOOCs are eventually going to have to tackle. I suspect that one reason Coursera’s peer review functionality has gotten panned, despite a growing body of literature showing that calibrated peer review can work, is that their groups are vastly more heterogeneous than in a traditional university class setting.)
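Apollo hasn’t said how its cohort-matching analytics actually work, but one way to picture the problem is balanced partitioning: given incoming students with some composite readiness measure, assign them to cohorts of twelve to eighteen so that no cohort ends up stacked with only strong or only struggling students. The sketch below is my own illustration of that general idea (the `readiness` score and the snake-draft heuristic are assumptions, not Apollo’s method).

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Student:
    student_id: str
    readiness: float  # hypothetical composite score (placement tests, prior GPA, etc.)

def build_cohorts(students: List[Student], cohort_size: int = 15) -> List[List[Student]]:
    """Partition students into roughly equal cohorts with a balanced skill mix.

    Uses a simple "snake draft": sort by readiness, then deal students out to
    cohorts in alternating order so each cohort gets a spread of scores.
    Illustrative heuristic only, not Apollo's actual algorithm.
    """
    n_cohorts = max(1, len(students) // cohort_size)
    cohorts: List[List[Student]] = [[] for _ in range(n_cohorts)]
    ranked = sorted(students, key=lambda s: s.readiness, reverse=True)
    for i, student in enumerate(ranked):
        lap, pos = divmod(i, n_cohorts)
        idx = pos if lap % 2 == 0 else n_cohorts - 1 - pos  # reverse direction each lap
        cohorts[idx].append(student)
    return cohorts
```

A real system would presumably balance on many more dimensions than a single score (schedules, program, prior online experience), but the shape of the problem is the same.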
The third layer is tracking student progress across the curriculum, looking for the drop-off points. In a way, this is a complement to the retention early warning analytics, but looking at it from a perspective of finding the rough patches that are likely to cause students trouble and sanding them down, as opposed to finding students who are in rough patches and helping them through. Wrubel put a lot of emphasis on completion, and specifically contrasted that emphasis to the high drop-out rate that we see in MOOCs. He also talked particularly about what he called “foundational learners,” which I suppose is a euphemism for remedial learners. “Foundational learners just carry more risk factors coming into the risk factors,” he said. “We have done a lot in this area.” He talked about providing many small doses of remediation over time, as opposed to pulling a student out for an eight-week block of remediation.
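To make that curriculum-level analysis concrete: if you have enrollment and completion counts for each course in a program sequence, the drop-off points are simply the steps where the survival rate falls sharply. A toy version might look like the following (the course names and numbers are invented for illustration; a real system would pull them from the student information system).

```python
# Toy illustration of finding drop-off points in a program sequence.
sequence = [
    ("GEN101", 1000, 870),   # (course, students who started, students who completed)
    ("ENG102", 870, 790),
    ("MTH110", 790, 560),    # a likely "rough patch"
    ("PSY201", 560, 515),
]

for course, started, completed in sequence:
    drop_rate = 1 - completed / started
    flag = "  <-- investigate" if drop_rate > 0.15 else ""
    print(f"{course}: {drop_rate:.0%} of students dropped off{flag}")
```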
Next, he talked about using the technologies and approaches of Carnegie Learning to improve outcomes. I have written previously about the powerful techniques being developed at Carnegie Mellon University and the University of Pittsburgh to identify skill ladders in learning a particular subject and to remediate students as they work their way up those skill ladders. Apollo acquired Carnegie Learning, a commercial spin-off that develops courseware based on this approach. The education giant clearly has big plans for leveraging their acquisition. Wrubel really emphasized the value that Carnegie brings to the table. He scoffed at using Google-like tricks to personalize learning through big data magic (which is very much in line with my recent critique). Instead, he talked about automating the thus-far labor-intensive process of discovering skill maps for different subjects and disciplines. That’s a pretty ambitious and important research project, and I will be curious to see what they are able to accomplish.
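For readers who haven’t followed this line of work: discovering the skill map itself (which skills each problem actually exercises) is the labor-intensive part Wrubel wants to automate. Once a map exists, systems in the Carnegie Mellon tradition typically track per-skill mastery with something like Bayesian Knowledge Tracing and target remediation at skills that stay below a mastery threshold. The sketch below is a textbook BKT update, not anything Apollo has published, and the parameter values are purely illustrative.

```python
def bkt_update(p_mastery: float, correct: bool,
               p_slip: float = 0.1, p_guess: float = 0.2,
               p_transit: float = 0.15) -> float:
    """One Bayesian Knowledge Tracing step for a single skill.

    p_mastery: current estimate that the student has mastered the skill
    correct:   whether the latest practice attempt was answered correctly
    Parameter values here are illustrative, not tuned to any real curriculum.
    """
    if correct:
        evidence = p_mastery * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_mastery) * p_guess)
    else:
        evidence = p_mastery * p_slip
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - p_guess))
    # Account for the chance the student learned the skill during this practice step.
    return posterior + (1 - posterior) * p_transit

# Example: a student starts at 0.3 estimated mastery and gets three items right in a row.
p = 0.3
for answer in (True, True, True):
    p = bkt_update(p, answer)
print(f"estimated mastery after practice: {p:.2f}")
```

The payoff of the model is exactly the kind of “many small doses of remediation” Wrubel described: the system can intervene on a single weak skill rather than pulling the student out for a long remedial block.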
Finally, he talked a little bit about that activity stream work (and patent) that Phil wrote about recently. We were at the very end of our time, so I didn’t get as much on this as I would have liked, but the gist is that the innovation is not so much the idea of an activity stream itself as figuring out which bits of activity stream data are important to which stakeholders. What do the students need to see? How about the teachers? How about the analytics systems? At this point I’m extrapolating (and speculating) from a few remarks, but my sense is that the technology may be more properly thought of as a data bus than as a Facebook-like interface. “Activity stream” in this context refers to the underlying data structures being routed, filtered, and prioritized, but it’s the verbs in that sentence rather than the nouns that Apollo seems to be emphasizing.
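Under that reading, the interesting part is less the stream than the routing layer: each audience (student, instructor, analytics engine) subscribes to the slices of the event stream that matter to it. To be clear, the following is my own speculative sketch of that “data bus” interpretation, not a description of Apollo’s patent or architecture; all of the event types and handlers are made up.

```python
from typing import Callable, Dict, List, Tuple

Event = Dict[str, object]  # e.g. {"type": "quiz_submitted", "student": "s42", "score": 0.8}

class ActivityBus:
    """Route activity-stream events to stakeholder-specific consumers.

    A speculative sketch of the "data bus" reading: the value is in deciding
    which events reach which audience, not in the feed UI itself.
    """
    def __init__(self) -> None:
        self._subscribers: List[Tuple[str, Callable[[Event], bool], Callable[[Event], None]]] = []

    def subscribe(self, audience: str, predicate: Callable[[Event], bool],
                  handler: Callable[[Event], None]) -> None:
        self._subscribers.append((audience, predicate, handler))

    def publish(self, event: Event) -> None:
        for audience, predicate, handler in self._subscribers:
            if predicate(event):
                handler(event)

# Example wiring: students see posted grades, instructors see missed deadlines,
# and the analytics layer sees everything.
bus = ActivityBus()
bus.subscribe("student", lambda e: e["type"] == "grade_posted", lambda e: print("student view:", e))
bus.subscribe("instructor", lambda e: e["type"] == "deadline_missed", lambda e: print("instructor view:", e))
bus.subscribe("analytics", lambda e: True, lambda e: print("analytics feed:", e))

bus.publish({"type": "grade_posted", "student": "s42", "score": 0.8})
bus.publish({"type": "deadline_missed", "student": "s42", "assignment": "wk3"})
```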
I am hoping to have some follow-up conversations with Rob and other Apollo stakeholders and will let you know what I learn.
Robert E. Dratwa says
Great insight, Michael, into some of the initiatives that make the industry leader just that!
phxstudent says
The next time you have the opportunity, you should ask Mr. Wrubel for some specifics on where Phoenix’s investments are actually widely deployed. If you’re an online student at Phoenix, you are using the same basic platform they’ve had in place for the past six or seven years. They launched a social platform a few years ago (PhoenixConnect) that has yet to garner any real interest from students. There’s all this talk about a new learning platform, but it has remained in “pilot” for the past two years, serving only a couple thousand of Phoenix’s 350K students. Phoenix may well be spending hundreds of millions on new technology, but they should be asked why so little of it has trickled out to its students.