Alternate Headline: “Our Long National Nightmare is Over – SJSU and Udacity solve problem of college graduates being able to pass remedial math”
The more I read about SJSU’s announcement of the pilot program, the more troubled I am by the lack of a clear description of the change in student population (I wrote briefly about the change in student populations yesterday). In a nutshell, the spring 2013 pilot was completely different from the summer 2013 pilot in the major demographic variables. That’s good, right, showing that SJSU and Udacity are learning their lessons? That would be good, if SJSU clearly described the student differences and avoided any implication that the numbers could be compared. Further, it would be good to avoid misleading comparisons to face-to-face courses at SJSU.
But that is not what is going on. SJSU, in particular, is going out of its way in its media blitz to compare the spring and summer pilots alongside SJSU on-campus courses. And the strategy is working, based on the articles that came directly from SJSU / Udacity interviews and information releases.
Inside Higher Ed: university officials on Wednesday touted results from the summer cohort as “significantly better”
Chronicle: But now the pilot program appears to be back on course, buoyed by encouraging data from this summer’s trials
TechCrunch: But the university and its platform partner, Udacity, bounced back on their second try, improving students’ outcomes… [snip] Turns out, the failure was premature.
More distressing is that SJSU and Udacity have put out a table, used by most media outlets, that shows direct comparisons.
Here’s the trouble, which I described yesterday. The student populations of these three groups are completely different, to the point where comparisons of pass rates or completion rates should not be made.
Below is my summary of the student demographics based on various interviews and articles.
That’s right – in the summer pilot, 53% of students already had a college degree, 48% with a bachelor’s degree or higher. In the spring, none of the students had a college degree.
Note: there are conflicting reports on the spring pilot demographics. Most accounts show that it was approximately 50% active high school students (many from Oakland) and 50% matriculated SJSU or CSU students. The Wall Street Journal, however, lists the totals as 20% active high school students. What is troubling is that all of these accounts are based on SJSU or Udacity interviews. I have chosen to use the 50% numbers, for two reasons:
- Udacity lists these numbers (50% high school, 50% SJSU) for spring, and Udacity is the holder of the data.
- The actual contract documents called for a 50/50 split with SJSU students (courtesy Ry Rivard at IHE):
In these initial three Courses, each section per Course will have 50 students, for a total of 100 students enrolled for-credit, not including unlimited non-credit students as described in Section 2.2. Half of the for-credit students will be matriculated University students (50 in each Course); the other half will be non-University students (50 students in each Course).
Need More Data
Furthermore, we don’t know the breakdown per course. The remedial math course has the worst pass rates, but does that course have a higher percentage of high school students vs. college students vs. college graduates in either spring or summer? We have no idea.
With the dramatically different student populations, we also need to know who completed vs. dropped out, who passed (C or above) and who failed.
The only viable comparison across all three groups would be for matriculated SJSU or CSU students. That comparison might tell us a lot.
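To make the like-for-like point concrete, here is a minimal sketch (in Python, with entirely hypothetical placeholder records, since SJSU has not released per-student or per-group data) of the kind of subgroup comparison that would actually be meaningful:

```python
# Hypothetical illustration only: the real per-student data has not been released.
# Each record is (term, group, passed). The values below are placeholders, not SJSU data.
records = [
    ("spring", "matriculated", True), ("spring", "matriculated", False),
    ("spring", "high_school", False),
    ("summer", "matriculated", True), ("summer", "matriculated", True),
    ("summer", "college_grad", True),
]

def pass_rate(records, term, group):
    """Pass rate within a single term, restricted to one student group."""
    subset = [passed for (t, g, passed) in records if t == term and g == group]
    return sum(subset) / len(subset) if subset else None

# A defensible spring-vs-summer comparison holds the student population constant:
print(pass_rate(records, "spring", "matriculated"))
print(pass_rate(records, "summer", "matriculated"))
```

The point of the sketch is simply that the comparison has to be computed within the same student group across terms; comparing an overall summer rate (heavy with college graduates) to an overall spring rate (heavy with high school students) tells us very little.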
Effect of Credit, Fees and Proctored Exams
And there is another key issue with this program – it is one of the first attempts to allow credit for a MOOC-style course (although not massive, at least in the spring). The idea is that for-credit students pay $150 per course, and if they get a C or above on a proctored exam, they get academic credit at a CSU campus. This is a bold program pushing the envelope. How does the potential for academic credit affect student performance in a MOOC? How does the $150 (skin in the game) affect student performance, even if the National Science Foundation covered the fees for the spring pilot?
And one other big question to consider: with the opening of enrollment between spring and summer, going from 300 easy-to-identify students to 2091 students mostly from out of state or out of the country, did all of the summer students take a proctored exam?
Change in Retention Rate
We also see that SJSU changed the definition of retention rate, from their post:
The overall retention rate dropped to 60 percent this summer, compared with 83 percent this spring, reflecting SJSU’s decision to be more flexible when students signaled to instructors that they needed to drop the course.
Clearly SJSU allowed more course drops (we don’t know what the policy change was), but in a standard course, once students drop they are not counted in the overall pass rate. So this policy change would change the pass rates, making them appear higher than they actually are. This was noted in the IHE article yesterday.
While student performance is up, the retention rate dropped from 83 percent this spring to 60 percent over the summer, which Taiz [president of the California Faculty Association] said may have inflated the pass rates, as students who would have received a poor grade in a course instead decided to drop it. In comparison, data provided by SJSU showed similar on-campus classes have retained no less than 94.3 percent of students since the 2010 spring semester.
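A toy calculation (with made-up numbers, since SJSU has not published the underlying counts) shows how a looser drop policy can inflate a pass rate that is computed only over retained students:

```python
# Made-up numbers for illustration only; these are not SJSU's actual counts.
enrolled = 100
passers = 30          # assume the same 30 students would pass under either policy

# Spring-style policy: 83% retention -> pass rate computed over 83 retained students
retained_spring = 83
print(passers / retained_spring)   # ~0.36

# Summer-style policy: 60% retention -> the same passers divided by fewer students
retained_summer = 60
print(passers / retained_summer)   # 0.50

# Computed over everyone who enrolled, the rate is identical under both policies:
print(passers / enrolled)          # 0.30
```

In other words, even if exactly the same students pass, letting more of the struggling students drop makes the reported pass rate go up. That is why the retention-rate change matters for interpreting the summer numbers.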
The Biggest Offender: Official SJSU Post
Ironically (or depressingly), the best information comes from Udacity and not from SJSU. Inside Higher Ed, the Chronicle and even TechCrunch have much better descriptions of the student population differences than does SJSU. The only references in the SJSU official announcement to student demographic changes are these nuggets:
This summer, 89 percent of our SJSU Plus students were not California State University students. [snip]
Over the summer, there were many comparisons made between our SJSU Plus and face-to-face courses. What many people failed to realize is this was not an apples-to-apples comparison.
The announcement then goes further, actually calling out lessons learned:
Meanwhile, we would like to share some lessons learned.
Here’s what worked:
Learning by doing works. Online video allows us to stop every few minutes and offer students the opportunity to try what they’ve learned with an online exercise. Instructors have found this so effective that some are incorporating SJSU Plus materials into their campus-based courses.
Student interaction remains strong. Does online learning stifle conversation? We found the opposite. Students are connecting with each other, instructors and instructional assistants through every means available: text, email, phone calls, chats and meetings.
Here’s where we’ve improved:
Students need help preparing for class. With SJSU Plus reaching well beyond the SJSU campus, we are enrolling a growing number of students who are unfamiliar with the demands of college courses. This summer, 89 percent of our SJSU Plus students were not California State University students. So SJSU Plus now offers orientation in various forms in all five courses.
Students need help keeping up. Everyone needs a little encouragement to stay on track. So we’ve added tools that help students gauge their progress and we’re checking in with individual students more often.
We need to communicate better with students. Although SJSU and Udacity try to be as clear as possible with our online instruction, we know we can do better. Student feedback has been immensely helpful in refining SJSU Plus materials. We’re also sending less email and more messages while students are “in class” online.
These findings may have some merit (and in fact should have been understood before designing the courses), but it is premature to declare lessons learned unless the student population is taken into account.
Update: Fixed minor wording in first two paragraphs for clarity; no change in meaning.
mikecaulfield says
And how depressing that this rampant innumeracy is about a math class? I absolutely agree that the level of information and transparency has been appalling and the stenography of the education press even worse. First lessons of stats — Was like compared to like? Not rocket science.
I *do* wonder the extent to which working with private entities creates a situation where we can’t talk about these things in any real way. Every institution that gets into these sort of agreements mentions the research mission, and how experimentation is a core value. But then they refuse to take critical looks at what is going wrong. All this money going into these experiments, but leading to the degradation of educational scholarship. Ugh.
Phil Hill says
Mike, good point about the math class – I had missed that irony.
While I think these types of partnership agreements can be tricky, I would note that the best source of information on this subject has been Udacity’s blog. It may be that there is pressure behind the scenes to be careful with data, but on the surface Udacity does not appear to be holding back data release. But shouldn’t it be the university itself that is most insistent on an accurate portrayal of the numbers?