D’Arcy Norman started a lively inter-blog conversation like we haven’t seen in the edublogosphere in quite a while with his post on the false binary between LMS and open. His main point is that, even if you think that the open web provides a better learning environment, an LMS provides a better-than-nothing learning environment for faculty who can’t or won’t go through the work of using open web tools, and in some cases may be perfectly adequate for the educational need at hand. The institution has an obligation to provide the least-common-denominator tool set in order to help raise the baseline, and the LMS is it. This provoked a number of responses, but I want to focus on Phil’s two responses, which talk at a conceptual level about building a bridge between the “walled garden” of the LMS and the open web (or, to draw on his analogy, keeping the garden but removing the walls that demarcate its border). There are some interesting implications from this line of reasoning that could be explored. What would be the most likely path for this interoperability to develop? What role would the LMS play when the change is complete? For that matter, what would the whole ecosystem look like?
Seemingly separately from this discussion, we have the new Unizin coalition. Every time that Phil or I write a post on the topic, the most common response we get is, “Uh…yeah, I still don’t get it. Tell me again what the point of Unizin is, please?” The truth is that the Unizin coalition is still holding its cards close to its vest. I suspect there are details of the deals being discussed in back rooms that are crucial to understanding why universities are potentially interested. That said, we do know a couple of broad, high-level ambitions that the Unizin leadership has discussed publicly. One of those is to advance the state of learning analytics. Colorado State University’s VP of Information Technology Pat Burns has frequently talked about “educational Moneyball” in the context of Unizin’s value proposition. And having spoken with a number of stakeholders at Unizin-curious schools, it is fair to say that there is a high level of frustration with the current state of play in commercial learning analytics offerings that is driving some of the interest. But the dots have not been connected for us. What is the most feasible path for advancing the state of learning analytics? And how could Unizin help in this regard?
It turns out that the walled garden questions and the learning analytics questions are related.
The Current State of Interoperability
Right now, our LMS gardens still have walls and very few doors, but they do have windows, thanks to the IMS LTI standard. You can do a few things with LTI, including the following (there’s a rough sketch of what this looks like on the wire after the list):
- Send a student from the LMS to somewhere else on the web with single sign-on
- Bring that “elsewhere” place inside the LMS experience by putting it in an iframe (again, with single sign-on)
- Send assessment results (if there are any) back from that “elsewhere” to the LMS gradebook
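To make that a little more concrete, here is a minimal sketch of the kind of information an LTI 1.x launch carries from the LMS to the tool. The parameter names are standard LTI 1.1 fields, but all of the IDs and URLs below are made up, and the OAuth signing that a real launch requires is left out.

```python
# A rough sketch of the data an LMS sends in an LTI 1.x "basic launch".
# In practice this is an OAuth 1.0a-signed form POST to the tool's launch URL;
# the signing step is omitted here, and every ID and URL is hypothetical.

launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    # Who is arriving (this is what enables single sign-on on the tool side)
    "user_id": "student-42",
    "roles": "Learner",
    "lis_person_name_full": "Pat Example",
    # Where they are coming from (course and placement within the course)
    "context_id": "course-101",
    "resource_link_id": "assignment-7",
    # Where to send grades back, if the tool produces any
    "lis_outcome_service_url": "https://lms.example.edu/grade-passback",
    "lis_result_sourcedid": "course-101:assignment-7:student-42",
    # Where to send the student when they are done
    "launch_presentation_return_url": "https://lms.example.edu/return",
}

# The tool consumes these parameters, signs the student in, and renders its UI,
# either in an iframe inside the LMS or as a full-page destination.
if __name__ == "__main__":
    for key, value in launch_params.items():
        print(f"{key} = {value}")
```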
The first use case for LTI was to bring a third-party tool (like a web conferencing app or a subject-specific test engine) into the LMS, making it feel like a native tool. The second use case was to send students out to a tool that needed full control of the screen real estate (like an eBook reader or an immersive learning environment) but to make that process easier for students (through single sign-on) and teachers (through grade return). This is nice, as far as it goes, but it has some significant limitations. From a user experience perspective, it still privileges the LMS as “home base.” As D’Arcy points out, that’s fine for some uses and less fine for others. Further, when you go from the LMS to an LTI tool and back, very little information is shared between the two. For example, you can use LTI to send a student from the LMS to a WordPress multiuser installation, have WordPress register that student and sign that student in, and even provision a new WordPress site for that student. But you can’t have it feed back information on all the student’s posts and comments into a dashboard that combines it with the student’s activity in the LMS and in other LTI tools. Nor can you use LTI to aggregate student posts from their respective WordPress blogs that are related to a specific topic. All of that would have to be coded separately (or, more likely, not done at all). This is less than ideal from both user experience and analytics perspectives.
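The grade-return piece works the same way in reverse: the tool posts a small XML message back to the outcomes URL it received at launch time. Again, this is only a sketch, with the OAuth signing omitted and the identifiers carried over from the hypothetical launch above; the score is a value normalized to the range 0.0 to 1.0.

```python
# A sketch of an LTI 1.x Basic Outcomes "replaceResult" message, which a tool
# POSTs back to the lis_outcome_service_url it received at launch.
# The request must be OAuth-signed in practice; that step is omitted here.

sourced_id = "course-101:assignment-7:student-42"  # from the hypothetical launch parameters
score = 0.92  # LTI scores are normalized to 0.0-1.0

replace_result_xml = f"""<?xml version="1.0" encoding="UTF-8"?>
<imsx_POXEnvelopeRequest xmlns="http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0">
  <imsx_POXHeader>
    <imsx_POXRequestHeaderInfo>
      <imsx_version>V1.0</imsx_version>
      <imsx_messageIdentifier>example-message-1</imsx_messageIdentifier>
    </imsx_POXRequestHeaderInfo>
  </imsx_POXHeader>
  <imsx_POXBody>
    <replaceResultRequest>
      <resultRecord>
        <sourcedGUID><sourcedId>{sourced_id}</sourcedId></sourcedGUID>
        <result><resultScore><textString>{score}</textString></resultScore></result>
      </resultRecord>
    </replaceResultRequest>
  </imsx_POXBody>
</imsx_POXEnvelopeRequest>"""

print(replace_result_xml)
```

Notice that a normalized score is essentially all that comes back. There is no channel here for the richer activity data (posts, comments, reading behavior) described above, which is exactly the gap the next section is about.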
Enter Uniz…Er…Caliper
There is an IMS standard in development called Caliper that is intended to address this problem (among many others). I have described some of the details of it elsewhere, but for our current purposes the main thing you need to know is that it is based on the same concepts (although not the same technical standards) as the semantic web. What is that? The concept comes from the Man Himself, Mr. Tim Berners-Lee.
The basic idea is that web sites “understand” each other. The LMS would “understand” that a blog provides posts and comments, both of which have authors and tags and categories, and some of which have parent/child relationships with others. Imagine if, during the initial LTI connection, the blog told the LMS what it is and what it can provide. The LMS could then reply, “Great! I will send you some people who can be ‘authors’, and I will send you some assignments that can be ‘tags.’ Tell me about everything that goes on with my authors and tags.” This would allow instructors to combine blog data with LMS data in their LMS dashboard, start LMS discussion threads off of blog posts, and probably do a bunch of other nifty things I haven’t thought of.
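Caliper is still in development, and I am not quoting its actual vocabulary here, but the flavor of the exchange might look something like the sketch below: the blog first describes what it provides, then streams individual events that the LMS (or anything else) can aggregate. All of the type names, field names, and URLs are invented for illustration.

```python
# A hypothetical, Caliper-flavored description of blog activity, expressed as
# JSON-LD-style dictionaries. The vocabulary below (types, fields, context URL)
# is invented for illustration; the real Caliper specification defines its own
# profiles and terms.

blog_capability_announcement = {
    "@context": "https://example.org/learning-vocabulary",  # made-up context URL
    "tool": "https://blogs.example.edu",
    "provides": ["BlogPost", "Comment"],
    "relations": {
        "BlogPost": {"hasAuthor": "Person", "hasTag": "Tag", "hasComment": "Comment"},
        "Comment": {"hasAuthor": "Person", "inReplyTo": ["BlogPost", "Comment"]},
    },
}

# An individual activity event the blog could stream back to the LMS (or to any
# other subscriber) once the LMS has said which authors and tags it cares about.
blog_post_event = {
    "@context": "https://example.org/learning-vocabulary",
    "type": "BlogPost",
    "author": "https://lms.example.edu/users/student-42",
    "tags": ["assignment-7"],
    "published": "2014-07-15T14:03:00Z",
    "url": "https://blogs.example.edu/student-42/my-post-on-caliper",
}

# Because both sides share a vocabulary, aggregation needs no custom glue code:
# e.g., "show me every post tagged assignment-7," regardless of which blog it lives on.
def posts_for_tag(events, tag):
    return [e for e in events if e.get("type") == "BlogPost" and tag in e.get("tags", [])]

print(posts_for_tag([blog_post_event], "assignment-7"))
```

The point is not the particular JSON; it is that once both sides share a vocabulary, the dashboard and aggregation use cases stop requiring one-off, point-to-point integration code.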
But that’s not the only way you could use Caliper. The thing about the semantic web is that it is not hub-and-spoke in design and does not have to have a “center.” It is truly federated. Perhaps the best analogy is to think of your mobile phone. Imagine if students had their own private learning data wallets, the same way that your phone has your contact information, location, and so on. Whenever a learning application—an LMS, a blog, a homework product, whatever—wanted to know something about you, you would get a warning telling you which information the app was asking to access and asking you to approve that access. (Goodbye, FERPA freakouts.) You could then work in those individual apps. You could authorize apps to share information with each other. And you would have your own personal notification center that would aggregate activity alerts from those apps. That notification center could become the primary interface for your learning activities across all the many apps you use. The PLE prototypes that I have seen tried to do a basic subset of this capability set using mostly RSS and a lot of duct tape. Caliper would enable a richer, more flexible version of this with a lot less point-to-point hand coding required. You could, for example, use any Caliper-enabled eBook reader that you choose on any device that you choose to do your course-related reading. You could choose to share your annotations with other people in the class and have their annotations appear in your reader. You could share information about what you’ve read and when you’ve read it (or not) with the instructor or with a Fitbit-style analytics system that helps recommend better study habits. The LMS could remain primary, fade into the background, or go away entirely, based on the individual needs of the class and the students.
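To make the “learning data wallet” idea a bit more tangible, here is a toy sketch of how the permission flow might work. Nothing here corresponds to an existing API or to the Caliper specification; the class, method, and app names are all invented.

```python
# A toy sketch of the "personal learning data wallet" idea: each app must ask
# the student for access to a named scope of data before it can read it.
# Everything here is invented for illustration; no such standard API exists.

class LearningDataWallet:
    def __init__(self, owner):
        self.owner = owner
        self.data = {}    # scope name -> records, e.g. "reading-history"
        self.grants = {}  # app name -> set of approved scopes

    def request_access(self, app, scope, approve):
        """The student sees 'app X wants your reading history' and decides."""
        if approve:
            self.grants.setdefault(app, set()).add(scope)
        return approve

    def read(self, app, scope):
        if scope not in self.grants.get(app, set()):
            raise PermissionError(f"{app} has no grant for '{scope}'")
        return self.data.get(scope, [])


wallet = LearningDataWallet("student-42")
wallet.data["reading-history"] = [{"chapter": 3, "finished": True}]

# A Fitbit-style study-habits app asks for reading history; the student approves.
wallet.request_access("study-habits-coach", "reading-history", approve=True)
print(wallet.read("study-habits-coach", "reading-history"))

# The LMS asks for the same data; the student declines, so the read would fail.
wallet.request_access("campus-lms", "reading-history", approve=False)
# wallet.read("campus-lms", "reading-history")  # would raise PermissionError
```

The interesting design choice is that the student, not the institution, becomes the point of integration; every app, including the LMS, is a peer that requests access rather than a spoke on an LMS hub.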
Caliper is being marketed as a learning analytics standard, but because it is based on the concepts underlying the semantic web, it is much more than that.
Can Unizin Help?
One of the claims that Unizin stakeholders make is that the coalition can accelerate the arrival of useful learning analytics. We have very few specifics to back up this claim so far, but there are occasionally revealing tidbits. For example, University of Wisconsin CIO Bruce Maas wrote, “…IMS Global is already working with some Unizin institutions on new standards.” I assume he is primarily referring to Caliper, since it is the only new learning analytics standard that I know of at the IMS. His characterization is misleading, since it suggests a peer-to-peer relationship between the Unizin institutions and IMS. That is not what is happening. Some Unizin institutions are working in IMS on Caliper, by which I mean that they are participating in the working group. I do not mean to slight or denigrate their contributions. I know some of these folks. They are good, smart people, and I have no doubt that they are good contributors. But the IMS is leading the standards development process, and the Unizin institutions are participating side-by-side with other institutions and with vendors in that process.
Can Unizin help accelerate the process? Yes they can, in the same ways that other participants in the working group can. They can contribute representatives to the working groups, and those representatives can suggest use cases. They can review documents. They can write documents. They can implement working prototypes or push their vendors to do so. The latter is probably the biggest thing that anyone can do to move a standard forward. Sitting around a table and thinking about the standard is good and useful, but it’s not a real standard until multiple parties implement it. It’s pretty common for vendors to tell their customers, “Oh yes, of course we will implement Caliper, just as soon as the specification is finalized,” while failing to mention that the specification cannot be finalized until there are implementers. What you end up with is a bunch of kids standing around the pool, each waiting for somebody else to jump in first. In other words, what you end up with is paralysis. If Unizin can accelerate the rate of implementation and testing of the proposed specification by either implementing themselves or pushing their vendor(s) to implement, then they can accelerate the development of real market solutions for learning analytics. And once those solutions exist, then Unizin institutions (along with everyone else) can use them and try to discover how to use all that data to actually improve learning. These are not unique and earth-shaking contributions that only Unizin could make, but they are real and important ones. I hope that they make them.