This is a version of my recent IMS talk on why the educational software interoperability challenges of the next decade will be different from the ones of the past.
In this post, I explore the relationship between learning engineering and learning design, discuss language as a design artifact, and provide an example of how Caliper could be the centerpiece of a learning engineering process for developing better learning analytics.
LTI 2.0 has failed. This is a great opportunity to take a healthier direction.
Whether you call it NGDLE, an LMOS, a learning platform, or something else, people have been wanting a next-generation post-LMS for a long time. We finally have both the interoperability standards and the market incentives to make it possible—if the LMS vendors are willing to take a risk.
Two years ago, I wrote about how D2L’s analytics package looked serious and potentially ground-breaking, but that there were serious architectural issues with the underlying platform that were preventing the product from working properly for customers. Since then, we’ve been looking for signs that the company has dealt with these issues and is ready to […]
I have been meaning for some time to get around to blogging about the EDUCAUSE Learning Initiative’s (ELI’s) paper on a Next-Generation Digital Learning Environment (NGDLE) and Tony Bates’ thoughtful response to it. The core concepts behind the NGDLE are that a next-generation digital learning environment should have the following characteristics: Interoperability and Integration Personalization […]
During yesterday’s K-20 learning platform panel at IMS Global’s Learning Impact Leadership Institute (the panel that replaced the LMS Smackdown of years past), Scott Jaschik started the discussion by asking, “What is the LMS?” Drawing on my recent complaints about the Saturn Vue that replaced our Chrysler Town & Country, the answer I provided was that […]