Many thanks to George Siemens for pointing out SNAPP: Social Networks Adapting Pedagogical Practice. This is something I’ve been urging various LMS communities to build for seven or eight years (since I was involved with dotLRN). SNAPP extracts information from the discussion boards in your LMS course and provides social graphs that help you to visualize interactivity.
It works with Blackboard, ANGEL, Desire2Learn, and Moodle (but not Sakai). The mechanism by which it works isn’t entirely clear from the project site, but it looks like you use a bookmarklet to call a web service hosted elsewhere while browsing the discussion forum you want to analyze.
There’s quite a bit that you can do with a tool like this. Here are some applications that the SNAPP site suggests:
- identify disconnected (at-risk) students;
- identify key information brokers within your class;
- identify potentially high- and low-performing students so you can plan interventions before you even mark their work;
- indicate the extent to which a learning community is developing in your class;
- provide you with a “before and after” snapshot of what kinds of interactions happened before and after you intervened or changed your learning activity design (useful for seeing what effect your changes have had on student interactions and for demonstrating reflective teaching practice, e.g., through a teaching portfolio);
- allow your students to benchmark their performance without the need for marking.
But I think this is just the tip of the iceberg. The reason I first reached for social network analysis way back when was to settle an age-old debate about which discussion board interface is “best.” My contention has always been that different discussion board UIs foster different kinds of conversation. If you want a Q&A-style conversation where people post atomic questions and get a string of direct answers, use a threaded interface. If you want a wide-ranging conversation in which you get a lot of student-to-student interactions, use a flat interface. A social network analysis tool like this one, particularly if it could aggregate data across multiple discussion boards and multiple courses, could help to answer this question as well as others, e.g., what are the differences in student interactions when you use course blogs with comments versus a discussion board?
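To make that concrete, here is a minimal sketch, in Python with networkx, of how the comparison might be quantified. Everything in it is hypothetical: the (replier, replied-to) export format and the sample data are my own invention, not anything SNAPP actually produces. The intuition is that a Q&A-style threaded forum should show star-like hubs and little reciprocity, while a flat forum should show denser, more reciprocal ties.

```python
# A minimal sketch, assuming each forum's posts can be exported as
# (replier, replied_to) pairs. All data below is invented for illustration.
import networkx as nx

def interaction_profile(reply_pairs):
    """Build a directed reply graph and summarize its shape."""
    g = nx.DiGraph()
    g.add_edges_from(reply_pairs)
    return {
        "density": round(nx.density(g), 2),   # share of possible ties present
        "reciprocity": nx.reciprocity(g),     # how much back-and-forth occurs
        "max_in_degree": max(d for _, d in g.in_degree()),  # star-like hubs
    }

# Hypothetical exports from a threaded forum and a flat one.
threaded = [("alice", "prof"), ("bob", "prof"), ("carol", "prof")]
flat = [("alice", "bob"), ("bob", "alice"), ("carol", "alice"), ("alice", "carol")]

print("threaded:", interaction_profile(threaded))
print("flat:", interaction_profile(flat))
```

Run across many forums and courses, metrics like these could turn the “which interface is best” debate into an empirical question.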
We can do yet more with social network analysis if we can shake off the shackles of current-generation LMS architecture and thinking. For example, today’s LMS encourages us to think of every course as atomic, in its own box. We might do cross-course analysis to see, say, trends in which LMS tools are getting used the most, but we don’t look at this wealth of transactional data to analyze students’ cross-course behavior. To the extent that schools do cross-course student performance analytics at all, they generally look at the longitudinal data in their SIS. But we could be investigating, for example, the degree to which students’ educational social networks extend beyond an individual class. Are there students who “travel” together across courses? If so, do they tend to interact with each other more than the average student-to-student interaction rate within particular courses? If so, does that affect their overall performance relative to their peers? If so, can we take measures that encourage students to form these cohorts, and identify students who don’t participate in them as at-risk? And so on. Imagine embedding this sort of analysis into a student’s ePortfolio as well. We could begin to give students an opportunity to show not only what they’ve learned and how much they’ve improved but also how they have learned and improved.
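As a rough sketch of how the “travel together” question might be operationalized: the enrollment records and the threshold of three shared courses below are invented for illustration, but the logic, counting co-enrollments pairwise, is the whole trick.

```python
# A rough sketch of detecting students who "travel" together across
# courses. Enrollment data and the threshold are hypothetical.
from itertools import combinations
from collections import Counter

enrollments = {
    "BIO101": {"alice", "bob", "carol"},
    "CHEM102": {"alice", "bob", "dave"},
    "HIST200": {"alice", "bob", "carol"},
}

# Count how many courses each pair of students shares.
co_enrollment = Counter()
for roster in enrollments.values():
    for pair in combinations(sorted(roster), 2):
        co_enrollment[pair] += 1

# Pairs sharing three or more courses are candidate traveling cohorts.
cohorts = [pair for pair, n in co_enrollment.items() if n >= 3]
print(cohorts)  # [('alice', 'bob')]
```

The natural next step would be to compare interaction counts within these candidate cohorts against the average pairwise interaction rate in each course.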
We also can do more as LMSs get more sophisticated about content ownership, content re-use, and social connections. For example, one instructor re-using another instructor’s learning object can be graphed as a social connection. What can we learn about fostering a culture of re-use by graphing patterns of re-use? To what extent is content re-use driven by relatively anonymous search-type discovery methods, and to what extent is it a social activity?
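Here is one way re-use-as-social-connection could be graphed, assuming the LMS logs re-use events in some form; the event records and names below are hypothetical.

```python
# A sketch of graphing content re-use as a social connection.
# Each event is (re-using instructor, original author, learning object);
# all records here are invented for illustration.
import networkx as nx

reuse_events = [
    ("prof_b", "prof_a", "quiz_cells"),
    ("prof_c", "prof_a", "quiz_cells"),
    ("prof_c", "prof_b", "lab_guide"),
]

g = nx.DiGraph()
for reuser, author, obj in reuse_events:
    if g.has_edge(reuser, author):
        g[reuser][author]["weight"] += 1
    else:
        g.add_edge(reuser, author, weight=1)

# Weighted in-degree identifies authors whose material spreads furthest.
print(sorted(g.in_degree(weight="weight"), key=lambda pair: -pair[1]))
```

Comparing this graph against, say, a co-department or co-teaching graph would begin to answer whether re-use spreads socially or through anonymous search.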
There’s a good deal of social network analysis software out there, both open source and proprietary. I’d love to see LMS development communities tackle this challenge in a more wide-ranging way.
Luke Fernandez says
Bundled in Moodle is a nice little survey tool called the Constructivist On-Line Learning Environment Survey (COLLES) (cf. http://surveylearning.moodle.com/colles/ ). It’s not as sophisticated as SNAPP apparently is, but it’s easy to use, and it’s helped me see whether the levels of interaction I provide in my course are up to my students’ expectations. Here’s Martin’s description of it: http://dougiamas.com/writing/herdsa2002/
Michael Feldstein says
Luke, the two tools are very different and largely complementary. SNAPP gives you a way of visualizing who has interacted with whom, and how often. It graphs the conversations.
Jurriaan van Reijsen says
Thanks a lot for this post. I am happy to finally find a read that combines my work as a Ph.D. researcher (Social Network Analysis) and my work as a consultant (Learning Technology).
My colleagues and I have been doing research in the field of Social Network Analysis, specifically focusing on knowledge flow and (generic) organizational performance. Our tool of choice has always been Cyram’s NetMiner.
The idea of integrating SNA functionality into an LMS is indeed very interesting. It may tell us more about both the behavior and performance of students and about the courses themselves.
However, SNA functionality does not need to be technically integrated into an LMS. As long as one can extract useful data from an LMS (e.g., who chats with whom in a forum, or which students follow the same courses, in order to detect cliques), you’re good to go. The extracted data can be converted into a so-called adjacency matrix (this can be done in, e.g., Excel). Many SNA tools can then simply import the manually generated adjacency matrix and, from it, create sociograms and calculate social network indicators.
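For what it’s worth, here is a small illustration of that workflow, with Python standing in for Excel and for a dedicated SNA tool; the reply data is made up.

```python
# Sketch: extract (poster, replied_to) pairs, build the adjacency
# matrix, and hand it to an SNA library. All data here is invented.
import numpy as np
import networkx as nx

replies = [("alice", "bob"), ("bob", "alice"), ("carol", "alice")]
people = sorted({name for pair in replies for name in pair})
index = {name: i for i, name in enumerate(people)}

# Adjacency matrix: cell [i, j] counts replies from person i to person j.
adj = np.zeros((len(people), len(people)), dtype=int)
for src, dst in replies:
    adj[index[src], index[dst]] += 1

# Most SNA tools import exactly such a matrix; networkx reads it directly.
g = nx.relabel_nodes(
    nx.from_numpy_array(adj, create_using=nx.DiGraph),
    dict(enumerate(people)),
)
print(nx.degree_centrality(g))
```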
That being said, it would of course be a win if such functionality were integrated (automated) in an LMS.
Beyond generating sociograms based on interaction between people, visualizing interaction between objects (e.g., content re-use in an LCMS) is also appealing. Our department at Utrecht University developed a method that visualizes a network of scientific papers based on keywords from paper abstracts. Another league, but the same idea, I guess.
Some things to bear in mind:
– How can we ensure that the measured interaction (e.g., forum posts) is in fact relevant interaction (e.g., that it concerns the course rather than an invitation to play tennis tonight)? The content of the interaction influences, and can falsify, the measured network.
– How do we take privacy into account? Sociograms (visualizations) and SNA metrics (indicators) are great data, but they openly “position” people relative to each other.
I’m looking forward to seeing developments in this field, as LMS/LCMS vendors start to integrate SNA functionality (visualization and metrics) into their products.
Regards,
Jurriaan van Reijsen MSc.
Utrecht University, The Netherlands
Michael Feldstein says
Jurriaan, the reason I would like to see SNA tools built into the LMS itself is that I would like to lower the barrier to entry so that teachers and students can use them without a lot of training or extra software. You raise some excellent points about conducting rigorous SNA research, but I’m looking to start by giving individual students and teachers an opportunity to visualize their own social interactions and draw their own conclusions. This approach would minimize the ethical questions (and, to a much lesser extent, the methodological ones as well). At the same time, if we’re doing the analysis at an institutional level and looking only at academic interactions (which can be narrowly defined as in-course interactions if need be), I’m not sure that the privacy issue is any worse than that of, say, longitudinal academic success analytics like an early warning system. Is the fact that a student tends to respond to the comments of the same two classmates in class discussions across a range of courses (or that she doesn’t respond to other students at all) more private than knowing her performance on individual assignments across the same range of courses?
Michael Moroney says
I have used SNAPP quite a lot over the last year. I have found it useful for getting a quick snapshot of group interaction characteristics in a teacher professional development programme. I also introduced SNAPP to some preservice teachers. Both students and teachers found the software easy to use. The preservice teachers had no trouble exporting SNAPP data to NetDraw and UCINET for further analysis.
I think the challenge with teachers using SNAPP will be helping them to understand the SNAPP metrics and visualisations that are produced. As always, the technology is only part of a solution, and without all the parts (e.g., online screencasts and tutorials for both SNAPP and SNA), the system may not be successfully employed as a formative assessment instrument.
On the issue of research – I do get concerned about the use of digital data without the informed consent of participants. There are some issues I can see that may prevent research from getting ethics committee approval. I am fairly sure research participants generally do not understand how much information they are giving away when they go online.
My experience to date (using web analytics, SNA, and content analysis) is that no single approach has been the final answer, and that a range of approaches that are easy to implement need to be triangulated with one another. For instance, the longest and most detailed online discussion among my inservice teachers was a conversation about where to have lunch. The sociogram looked very good in that case.