SJSU has announced via an official blog post that it is studying the results of the SJSU Plus program run with Udacity:
First, news coverage and much commentary have been based on very preliminary and unanalyzed data from a spring 2013 pilot of three SJSU Plus courses with Udacity. We are currently awaiting a more comprehensive National Science Foundation data analysis and report that will be available in August (spring semester courses ended and final grades were submitted only seven weeks ago). We look forward to discussing these results next month.
I mentioned in my post last night that I believe this is a good move.
How often do we get the chance to review the results of traditional college courses and see institutions publicly study learning outcomes in order to improve course effectiveness? This open review is one innovation from the SJSU Plus program that should be extended to other courses.
There are plenty of people who do not share my optimism about the reporting process, partly because they expect that we already know most of the lessons learned and that there will be no real surprises. Indeed, if you read the news coverage of SJSU pausing the Udacity program, several findings are already fairly obvious.
Findings to Date
The following summary draws on several articles and blog posts grounded in first-hand experience.
Finding 1) The program was rushed, and the courses were still being developed when students started them:
IHE: But, because of the haste, faculty were building the courses on the fly. Not only was this a “recipe for insanity,” Junn said, but faculty did not have a lot of time to watch how students were doing in the courses because the faculty were busy trying to finish them. It took about 400 hours to build a course, though the courses are designed to be reused.
Edsurge: The professors creating the curriculum for the program didn’t have much time; they were still writing curriculum when the courses began. “It was really hard on the faculty,” Thrun says.
Finding 2) The course designs did not adequately consider schedules, deadlines and pacing:
Edsurge: Then the unanticipated problems started to crop up. When the courses started, two of the three classes didn’t give students precise deadlines for assignments. “We communicated our expectations poorly,” concedes Thrun. “We had two deadline-free courses. Especially in these classes, students fell behind. That was a mistake,” Thrun declares.
Edsurge: All that said, when students were surveyed, they said the biggest impediment to succeeding in the class was pacing: they just didn’t have enough time. “Sal Khan has strong data that says in math in particular, a more flexible pacing is important for success,” Thrun says. “He’s been preaching go at your own pace and you can turn a C-level student to an A-level student.” Thrun also pointed to Foothill College’s “Math My Way” program which has been able to double its student-pass rates by giving students more time.
Chronicle: “The No. 1 complaint we’re getting is that students need more time, they feel rushed,” said Mr. Thrun.
Finding 3) There was no self-selection support to help students determine if they could succeed in this environment:
Edsurge: The students selected were not “typical” community college students: in many cases, they had already failed a remedial class or a college entrance exam.
IHE: Another factor in the disappointing outcomes may have been the students themselves. The courses included at-risk students, high school students and San Jose State students who had already failed a remedial math course. “We stacked the deck against ourselves,” Junn said of the Udacity partnership.
SJSU blog: Third, it is important to note that at the outset, SJSU made a commitment to working with “at risk” students – many from disadvantaged economic backgrounds; high school students; and students of our own who had struggled with the curriculum (including many who had failed remedial math courses in the past). Without question, these and other factors significantly affect student performance outcomes.
Finding 4) There was not an adequate support structure for students, in terms of mentoring and computer access, which is even more crucial for remedial students:
Edsurge: That push helped in another way: initially many students were unaware of the online tutors (who are real people) who were available online to help, 12 hours a day. But over the weeks, it became clear that the tutoring services were crucial. “The mentoring program was absolutely essential for every student’s outcome,” Thrun says. [snip] Those tutors were so important because many students lacked even elementary-school-level mathematics knowledge, Thrun says. Among the frequently heard questions: How to divide two numbers? How can you subtract a bigger number from a smaller one? Tutors answered questions and continued to reach out to encourage students.
Mercury News: About two weeks into San Jose State’s online education experiment at an Oakland charter school, it became clear that something was wrong. Some of the students in the college’s for-credit math courses weren’t even logging on. [snip] It turned out some of the low-income teens didn’t have computers and high-speed Internet connections at home that the online course required.
Finding 5) The learning platforms were not integrated to give a seamless experience:
Edsurge: When students did get to the online programs, even navigating the computer systems could be daunting. One of the questions that tutors were frequently asked was how to do exponential notation on a computer. And although the curriculum was delivered on Udacity’s platform, assessments were delivered via a separate learning management system used by SJSU. Worse: results on one system took about 48 hours to update on the other, says Thrun.
Why is the Upcoming Report Valuable?
With this many known issues, which have also been documented in several excellent blog posts, why is it valuable for SJSU to wait for the external, NSF-funded report?
In my opinion, there will still be value in A) releasing the full data set and analysis, and B) getting others to understand key points.
Many specialists might understand why the program failed, but many others do not. Campus leaders and policy makers need to learn many of the lessons that are already known in the ed tech / higher ed community, and SJSU Plus will help in this area. Plus, I wouldn’t discount some unexpected findings (e.g., the need for self-pacing in math courses, as mentioned in the student surveys).
In a post from last December I described some of the change dynamics that we’ll face with MOOCs, and I see the SJSU Plus program as an excellent example:
We’re in the chaotic period where system performance is fluctuating wildly, and in many cases the changes brought by MOOCs and other forms of online education actually are harming the output. There are some wins, there are some losses. [snip]
The conversations and decisions about online education are often being driven from presidential cabinets and boardrooms rather than just among specialists. This changing conversation will be frustrating for those who have expertise and deep knowledge in online, but it will also be an opportunity to influence changes that were not possible before.
It is important for some of the new participants (and I would add foundations, state governments, university presidents, etc.) to fully understand what worked and what didn’t work in this application of online education. The official report from SJSU will provide a valuable service to help with this learning process.
There is also value in this very public case establishing a precedent for pausing pilots that don’t work, evaluating results early in the process, adjusting the course and support design based on findings, and focusing on student learning outcomes as the primary measure of success.
Jon K. says
Considering that many students did not have home computers, I wonder whether technical aptitude also factored into the lack of success.
As for the “more time” problem, wouldn’t it be great if a student could self-select the date by which they wanted to complete a course, and then have the computer adjust the deadlines for the assessment milestones accordingly?
Phil Hill says
Jon, I think you’re right: there were technical aptitude problems as well as basic math aptitude problems here. That is part of the reason this student population needs extra support that is well-designed, yet SJSU seems to have ignored the support issue other than relying on Udacity mentors on the phone.
I love the idea of self-selected pacing that would auto-adjust the milestones, especially for math. Great point. A rough sketch of how that scheduling might work is below.
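To make the idea concrete, here is a minimal sketch in Python, under the assumption that each assessment carries an estimated effort weight (the milestone names and weights below are hypothetical, invented purely for illustration). Given a course start date and a student-selected end date, it spaces the deadlines in proportion to cumulative effort:

```python
from datetime import date, timedelta

def schedule_milestones(start, end, milestones):
    """Spread assessment deadlines between a course start date and a
    student-selected end date, in proportion to each unit's estimated effort.

    milestones: list of (name, effort_weight) tuples, in course order.
    Returns a list of (name, due_date) tuples.
    """
    total_days = (end - start).days
    if total_days <= 0:
        raise ValueError("end date must come after start date")

    total_effort = sum(weight for _, weight in milestones)
    schedule = []
    cumulative = 0
    for name, weight in milestones:
        cumulative += weight
        # Each deadline lands at the point in the term proportional to
        # the effort completed so far.
        offset = round(total_days * cumulative / total_effort)
        schedule.append((name, start + timedelta(days=offset)))
    return schedule

# Example: a student picks an end date ten weeks out.
units = [("Unit 1 quiz", 1), ("Unit 2 quiz", 1), ("Midterm", 2), ("Final", 2)]
for name, due in schedule_milestones(date(2013, 9, 2), date(2013, 11, 11), units):
    print(f"{name}: due {due}")
```

A real system would also need to re-plan when a student falls behind, which is exactly the kind of flexible pacing Thrun cites from Khan Academy and Foothill College’s “Math My Way” program.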