As part of our e-Literate TV series of case studies on personalized learning, Michael and I were fully aware that Arizona State University (ASU) was likely to generate the most controversy, given ASU's aggressive reworking of the concept of a modern research university. As we described in this introductory blog post:
Which is one reason why we're pretty excited about the release of the first two case studies in our new e-Literate TV series on the trend of so-called "personalized learning." We see the series as primarily an exercise in journalism. We tried not to hold onto any hypothesis too tightly going in, and we committed to reporting on whatever we found, good or bad. We did look for schools that were being thoughtful about what they were trying to do and worked with them cooperatively, so it was not the kind of journalism that was likely to result in an exposé. We went in search of the current state of the art as practiced in real classrooms, whatever that turned out to be and however well it was working.
In the back-and-forth discussion following the ASU case study release, John Warner brought up a good point in response to my description that our goal was "Basically to expose, let you form own opinions."
@PhilOnEdTech Can't form opinion without a more thorough accounting. Ex. How did you choose students and fac. to talk to?
— John Warner (@biblioracle) June 1, 2015
Let's explore this subject for the four case studies already released. Because the majority of interviewees shared positive experiences, I'll highlight some of the skeptical, negative, or cautionary views that were captured in each case study.
Our Approach To Lining Up Interviews
When we contacted schools to line up interviews on campus, it was natural to expect that staff would tend to put forward the most positive examples of courses, faculty, and students. As described above, we admit that we looked for schools with thoughtful approaches (and therefore courses), but we needed to try to surface some contrary or negative views as well. This is not to play gotcha journalism, nor to create a false impression of equally good / equally bad perspectives. But it is important to capture that not everyone is pleased with the changes, and these skeptics are a good source for exposing risks and issues to watch. Below is the key section of the email sent to each school we visited.
The Case Study Filming Process
Each case study will include a couple of parts. First, we will interview the college leadership—whoever the school deems appropriate—to provide an overview of the school, its mission and history, its student body, and how "personalized education" (however that school defines the term) fits into that picture. If there are particular technology-driven initiatives related to personalized learning, then we may talk about those a bit. Second, we will want to talk to some teachers and students, probably in a mixed group. We want to get some sample reactions from them about what they think is valuable about the education they get (or provide) at the school, how "personalization" fits into that, and how, when, and why they use or avoid technology in the pursuit of the educational goals. We're not trying either to show "best/worst" here or to provide an "official" university position, but rather to present a dialog representing some of the diverse views present on the campus.
Campus Input on the Filming
In order for the project to have integrity, MindWires must maintain editorial independence. That said, our goal for the case studies is to show positive examples of campus communities that are authentically engaged in solving difficult educational challenges. We are interested in having the participants talk about both successes and failures, but our purpose in doing so is not to pass judgment on the institution but rather to enable viewers to learn from the interviewees' experiences. We are happy to work closely with each institution in selecting the participants and providing a general shape to the conversation. While we maintain editorial control over the final product, if there are portions of the interviews that make the institution uncomfortable, then we are open to discussing those issues. As long as the institution is willing to allow an honest reflection of its own challenges and learning experiences as an educational community, we are more than willing to be sensitive to and respectful of concerns that the end product not portray the institution in a way that might do harm to the very sort of campus community of practice that we are trying to capture and foster with our work.
As an example of what "willing to be sensitive to and respectful of concerns" means in practice, one institution expressed a concern that its participation in this personalized learning series not be over-interpreted as a full-bore administrative endorsement of pedagogical change. The school was at the early stages of developing a dialog with faculty on where they wanted to go with digital education, and the administration did not want to imply that it already knew the direction and answers. We respected this request and took care not to imply any endorsement of direction by the administration.
Below are some notes on how this played out at several campuses.
Middlebury College
As described in our introductory blog post:
Middlebury College, the first school we went to when we started filming, was not taking part in any cross-institutional (or even institutional) effort to pilot personalized learning technologies and is not the kind of school that is typically associated with the "personalized learning" software craze. Which is exactly why we wanted to start there. When most Americans think of the best example of a personalized college education, they probably think of an elite New England liberal arts college with a student/teacher ratio of under nine to one. We wanted to go to Middlebury because we wanted a baseline for comparison. We were also curious about just what such schools are thinking about and doing with educational technologies.
Middlebury College staff helped identify one faculty member who was experimenting with technology use in his class and getting some interesting student feedback, which we highlighted in Middlebury Episode 2. They also found two faculty members for a panel discussion, along with two students who had previously expressed strong opinions on where technology does and does not fit in their education. The panel discussion was highlighted in Middlebury Episode 3.
As this case study did not have a strong focus on a technology-enabled program, we did not push the issue of finding skeptical faculty or students; instead, we showed that technology was not missing from the campus conversation about how to improve education.
The administration did express some cautionary notes on the use of technology to support “personalized learning” as captured in this segment:
Essex County College
By way of contrast, our second case study was at Essex County College, an urban community college in Newark, New Jersey. This school has invested approximately $1.2 million of its own money, along with a $100,000 Gates Foundation grant, to implement an adaptive learning remedial math course designed around self-regulated learning. Our case study centered on this program specifically.
Of course, the place where you really expect to see a wide range of incoming skills and quality of previous education is in public colleges and universities, and at community colleges in particular. At Essex County College, 85% of incoming students start in the lowest-level developmental math course. But that statistic glosses over a critical factor, which is that there is a huge range of skills and abilities within that 85%. Some students enter almost ready for the next level, just needing to brush up on a few skills, while others come in with math skills at the fourth-grade level. On top of that, students come in with a wide range of metacognitive skills. Some of them have not yet learned how to learn, at least not this subject in this context.
Given the controversial nature of using adaptive learning software in a class, we decided to include a larger number of student voices in this case study. Douglas Walcerz, the faculty and staff member who designed the course, gave us direct access to the entire class. One class day was turned over to e-Literate TV video production, and we actively solicited students to participate in interviews while the rest of the class watched their peers describe their experiences.
As we did the interviews, almost all students expressed a very positive view of the new class design, particularly the self-regulated learning aspect and the resulting sense of empowerment. What was missing were the voices of students who were not comfortable with the new approach, so on the second day we actively solicited students who could provide a negative view. The result was shared in this interview:
As for faculty, it was easier to find some skeptical or cautionary voices, which we highlighted here.
As described above, our intent was not to present a false balance but rather to include diverse viewpoints to help other schools know which issues to explore.
Arizona State University
At ASU we focused on two courses in particular: Habitable Worlds, highlighted in episode 2, and remedial math (MAT 110) using Khan Academy software, highlighted in episode 3.
We did have some difficulty getting on-campus student interviews because both of these are online courses. For MAT 110 we did find one student who expressed both positive and negative views on the approach, as shown in this episode.
Empire State College
Like ASU, Empire State College presented a challenge for on-campus video production due to the nature of its all-online courses. We worked with ESC staff to line up students for interviews, with the best stories coming from the effects of prior learning on students.
It was easier and more relevant to explore the different perspectives on personalized learning from faculty and staff themselves, as evidenced by the following interview. ESC offered him up, proudly, knowing that he would be an independent voice. They understood what we meant in that email and were not afraid to show on camera the tensions they are wrestling with. Not every administration will be as brave as ESC's, but we are finding that spirit to be the norm rather than the exception.
Upcoming Episodes
It's also worth pointing out the role of selecting colleges in the first place, which is not just about diversity. We know that different schools are going to have different perspectives, and we pick them carefully to set up a kind of implicit dialog. We know, for example, that ASU is going to give a full-throated endorsement of personalized learning software used to scale. So we balance them against Empire State College, which has always had one-on-one mentoring at the core of its design.
Hopefully this description of our process will help people like John Warner who need more information before forming their own opinions. At the least, consider this further documentation of the process. We are planning to release one additional case study, on the University of California, Davis, in early July, as well as two analysis episodes. We'll share more information as new episodes are released.