The transformation of the higher education LMS market continues, and I expect more changes over the next 2 – 3 years. However, it seems time to capture the state of the market based on changes over the past year or two.
I shared the most recent graphic summarizing the market in mid-2011. As with all previous versions, the 2005 – 2009 data points come from the Campus Computing Project and therefore reflect US adoption at non-profit institutions. This set of longitudinal data provides an anchor for the summary.
The most significant changes over the past two years include the following.
- The data has been adjusted to include international usage and online programs in order to capture the rise of online programs, including MOOCs, as a driving force in the future market. Keep in mind that there is no consistent data set covering the entire market, so treat the graphic as telling a story of the market rather than as a chart of precise data. Sources for this summary include a combination of Campus Computing reports, ITC surveys, company press releases, and extrapolations from Blackboard’s and Pearson’s quarterly earnings. Caveat emptor. Update (12/19): While the data captures a broader set of information than previous US-only data, it is primarily focused on North American institutions.
- There is a new band / category for “homegrown systems” to account for a relatively new trend where organizations, primarily MOOC providers for now, are opting to develop their own learning platform rather than adopt a pre-existing LMS.
- Instructure has established itself not just as a disruptive influence, but as a full-fledged competitor in the market.
- Blackboard changed their strategy, purchased two Moodle service providers (MoodleRooms and NetSpot), and cancelled the end-of-life for the ANGEL LMS.
- Desire2Learn has grown much faster than has been represented by US-only data.
- Pearson eCollege has a much stronger position when considering their market strength in the for-profit sector and with fully online programs.
- The gray band representing pricing has been removed, due to the rise in open source alternatives and changes in market pricing pressures.
Feel free to comment below, at the Google+ post, or by Twitter (@PhilOnEdTech or #lmsgraphic).
Update (10/19): Changed Instructure’s logo to the new Canvas by Instructure branding.
Phil Hill says
Kate (question from Twitter) asks for more information on how I have started to include international data post-2009. Think incremental as opposed to trying to solve the whole problem. I took the base US information, then adjusted each LMS provider based on available public information. For example, Desire2Learn has quite a few Canadian customers that are not typically counted in US-focused surveys. I dug into their number of licenses and number of clients to determine additional growth or market share.
I then used the number of employees, comparing against Blackboard’s approximate # of employees for LMS-related products (the Learn stack). This information was used as a sanity check.
So really an attempt to extend US data to be more international.
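Roughly, the arithmetic looks like this (a minimal sketch in Python; every number below is a hypothetical placeholder, not an actual figure behind the graphic):

```python
# Minimal sketch of the adjustment arithmetic described above.
# All figures are hypothetical placeholders, not the data behind the graphic.

# Base US market-share estimates (percent of institutions), per provider.
us_share = {"Blackboard": 50.0, "Desire2Learn": 8.0, "Moodle": 15.0}

# Per-provider adjustments inferred from public information, e.g.
# Desire2Learn's Canadian clients that US-focused surveys miss.
intl_adjustment = {"Desire2Learn": 3.0, "Moodle": 5.0, "Blackboard": 2.0}

adjusted = {p: us_share[p] + intl_adjustment.get(p, 0.0) for p in us_share}

# Sanity check: compare each provider's LMS-related headcount against
# Blackboard's (the Learn stack) to see whether the adjusted shares
# look plausible in rough proportion.
lms_employees = {"Blackboard": 1000, "Desire2Learn": 300, "Moodle": 12}
for provider, share in adjusted.items():
    ratio = lms_employees[provider] / lms_employees["Blackboard"]
    print(f"{provider}: adjusted share {share:.1f}%, headcount ratio {ratio:.2f}")
```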
I suspect the LMS that would come out larger in market share, if there were more accurate international data, would be Moodle.
Bob Puffer says
The graph is ludicrous in its accuracy. Moodle has 21k sites installed in the United States alone (assuming you’re looking only at North America) after test sites are filtered out, including University of Minnesota and LSU and numerous other major universities, including the largest chunk of the California University System. Desire-2-Learn (who looks on your graph as being right up there with the big guys) now has 700 customers (their data, 2012). “Instructure has established itself as a full-fledged competitor” having signed up 170 clients over the last eighteen months (their data, 2012). And to portray a graph of this nature based on three-year-old key data is to entirely misrepresent the LMS market and ignore the immense changes that have taken place during that period. I read your blog religiously and am surprised at the quality of this offering.
Bob Puffer says
Desire-2-Learn’s employee count has grown orders of magnitude faster than their customer base (NOTE 1 below). A lousy number to extrapolate from, especially when you’re considering open source in the mix. Moodle has around a dozen employees if my memory serves me correctly. They must be one of the smallest (at 68k sites worldwide after filtering for test sites).
NOTE 1: taken from this eLiterate posting: https://eliterate.us/desire2learn-raises-80-million-in-venture-funding-5th-largest-series-a-funding/
Bob Puffer says
Sorry to keep haranguing on this post, but it seems so unlike eLiterate. To extrapolate numbers from 2005-2009 to get three years of numbers is to completely ignore the state budget crunch that has forced the hands of tons of state universities and community colleges (maybe community colleges weren’t considered part of the higher education market; if so, I need to let my colleagues know so they’ll quit being so concerned about all the business community colleges are taking away).
Phil Hill says
Bob, caveat emptor. There is definitely a subjective nature to much of this data, and I’ll let others (including you) judge whether it tells the story accurately.
I will say that Moodle is very difficult to track and compare for several reasons. One is that Moodle is typically the most-often-used secondary system on a campus, whereas this information focuses on the primary LMS. Another is Moodle’s strong international usage. So I will say again that if we had an accurate way to measure worldwide usage in a consistent manner, Moodle would come out larger on this chart. I have anchored to US data and extended it to show a more complete picture.
I am only using the # employees as a sanity check on rough size – not as a determination of market size.
Again, the graphic is there to tell a story on LMS market dynamics. If it doesn’t work for you, that’s fine.
Bob Puffer says
On what basis, other than noisiness in the marketplace (and the ability to attract VC interest), have you judged Instructure to be a full-fledged competitor in the market? I subscribe to their open-source forums and can tell you there’s no community reflected in those at all (2-4 posts/day). 170 in eighteen months?
Phil Hill says
Bob – Utah 109k, Washington SBCTC 220k, Washington (3 unis), USF 47k, UCF 50+k, Maryland, Auburn, UT Austin, etc, etc. Plus, in my consulting I see them competitive in most LMS selection processes (at institutions that are not purely online), backing up the assertion.
Ed Garay says
It’d be interesting to superimpose the 2010 graphic to see how the LMS market share has changed, at least for Desire2Learn, Moodle, Blackboard, Sakai and Pearson Learning Studio.
Good to see Instructure Canvas included, but I wonder why other emerging cloud-based LMS alternatives like Pearson OpenClass, Interactyx Topyx, Lore (formerly CourseKit) and Knoodle were left out. If Pearson’s estimates of OpenClass evaluation and adoption rates are accurate, I would have expected to see OpenClass’ market share visualized as well. Next month, it will be a year since OpenClass was announced with much fanfare. It’d be good to know where they are today.
As for Blackboard, the visual that I’d like to see is the percentage of original Blackboard, WebCT and ANGEL schools that stay with or leave Blackboard. It seems to me that most of the schools that drop Blackboard were original WebCT and ANGEL schools, probably unhappy that their LMS of choice was acquired by Blackboard.
Bob Puffer says
Need to correct an earlier assertion — Moodle has 11k installation sites in the US after filtering for test sites. A useful URL for getting some reliable figures on large-institution Moodle usage is: http://docs.moodle.org/23/en/Category:Installations
Michael Feldstein says
This is a thorny problem. The gold standard for LMS adoption data is Casey Green’s Campus Computing Project survey. He has a solid methodology, consistent questions with longitudinal data, and a good sample size. But there is a trade-off. Casey’s survey is strictly about LMS adoption as the main institutional platform for U.S. not-for-profit schools over a certain size. This approach gives us the best data-driven read we have on the state of the industry but, by itself, it is not going to pick up on things like MOOC trends or international growth. At the other end of the spectrum is the data Bob is citing about Moodle installation sites. This captures some very broad adoption data, but it doesn’t tell you a lot about institutional trends.
What I take Phil to be trying to do here is tell a broad, qualitative story about institutional trends that is more sensitive to leading indicators than we can get from our best quantitative measures, at the cost of certainty and precision. It is a visualization of his judgment based on data rather than a straight-up graph of a data set. If so, then the questions that would make the graph most informative are the ones that zero in on the places where Phil uses other inputs to make a judgment beyond what the Campus Computing data can tell us. For example, it looks like adoption of Instructure is incrementally larger than Sakai, but adoption of homegrown systems (primarily MOOC platforms) is incrementally larger than Instructure. Particularly in the case of the MOOC platforms, where an apples-to-apples comparison is problematic, I’d be interested in hearing more about how those various inputs led to that judgment.
Bob Puffer says
I think that’s a great summation of the deep story. I wish the post had been titled differently than “The State of the Higher Education LMS Market: A Graphical View”. Somehow when I read that headline I think I’m going to receive a precise view, because “state” is used in technology circles to mean something very precise. The word “state” is obviously used this way in many other places, like “the State of the Union”, which also implies “the whole story”. When I saw this article and that it came from eLiterate, I was ready to send it to Diigo because I would be referring back to it frequently. Didn’t pan out that way.
Phil Hill says
Ed, I didn’t include OpenClass, Lore, etc, for two reasons.
– Their model targets individual faculty adoption, rather than institutional adoption. Over time we’ll need to find market adoption data by faculty, course or student. For now, that data just doesn’t exist.
– I have not been able to find any independent confirmation of significant usage for these options (yet). OpenClass shared download data, but I’m focusing on real adoption. If anyone has data on these systems, let me know.
Kate Bowles says
To take up Michael’s point, I think the problem for the rest of us relates to our customary location outside the world of gold standard US data. There’s not much point in us getting worked up about it, especially those of us in very small education markets (there are fewer than 40 universities in Australia, and even adding in the TAFE sector, we have a higher education market that’s appropriate to a population of 21m, and a near duopoly in LMS providers to match).
But as MOOCs appear to be globalising something about education, and something about platforms, it’s worth trying anything to encourage that gold standard dataset, and the thinking that often goes with it, to be a bit less parochial.
I feel that Phil has raised a really important point about the need to try to find ways to accommodate US base data to the changing reality of the world market for LMS providers; surely things will get easier in the future, as we get better at sharing data across educational markets.
Michael Feldstein says
Bob, I’m not suggesting in any way that I think Phil’s judgment is off. To the contrary. I’m not sure why you think that the activity level in Instructure’s open source forums is a good indicator of institutional adoption, but it certainly isn’t a measure that I would use. Rather, my point is that there is an interesting discussion to be had about how we apply judgment to infer the scope of important changes in the market that are beyond the reach of our most accurate survey instruments to measure rigorously.
Phil Hill says
What Michael and Kate said :} (although they were more eloquent)
Michael, you raise a good point about how to apply judgment to limited data sets. I’ll likely put together a post that describes the subjective methods used to create the graphic. The example you raise regarding MOOCs is worth a quick note here.
Online education and the associated ed tech are changing significantly, and apples-to-apples comparisons with new forms such as xMOOCs are quite problematic. Do you base the adoption on the number of institutions adopting xMOOCs as the basis for their online programs? If so, then we probably have 30 – 40 US institutions alone if you count Coursera partnerships as well as edX, which would put MOOCs smaller than Instructure. There is even an argument that this number is generous, as each Coursera school has a dozen or fewer courses in pilot mode.
Or, do you base the adoption on number of students enrolled? If so, then we’re talking 2+ million and growing fast. This measurement has flaws, due to high attrition rates of xMOOC courses.
There is also the issue of homegrown learning platforms. 10 – 15 years ago, a homegrown LMS (or CMS, really) was much more prevalent, but the market consolidated in the early 2000s on Blackboard, WebCT, Desire2Learn, ANGEL, Moodle and Sakai. An interesting story is that, driven partially by MOOCs, the homegrown learning platform is coming back, albeit in different forms. Beyond MOOCs, however, you have University of Phoenix, Rio Salado, and Kaplan all using homegrown LMSs at some level.
In the end, I chose to represent “Homegrown Systems” as the new band, combining xMOOCs with UoP, Kaplan, etc. The size of this combined band, in terms of #s of students, is roughly the same as Instructure (2.5 – 3.0m, not including Cisco NA). The incremental growth, however, has taken place over a shorter period of time.
Although there are strong arguments that I should adjust the homegrown #s down due to attrition, etc, I chose not to do so, in order to highlight the importance of this new trend.
Quite subjective, but this was the basis for the new band that includes MOOCs. Of course, I would be interested in feedback on these assumptions / judgments.
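For concreteness, here is the same judgment expressed as a back-of-the-envelope sketch, using only the rough figures quoted above (nothing more precise is implied):

```python
# Back-of-the-envelope comparison of the two sizing bases discussed above.
# Figures are the rough ones quoted in this thread, not precise data.

# Basis 1: institutions adopting xMOOCs for their online programs.
mooc_institutions = (30, 40)        # rough US count, Coursera + edX partners

# Basis 2: students enrolled.
mooc_students = 2_000_000           # "2+ million and growing fast"
homegrown_band = (2_500_000, 3_000_000)  # xMOOCs plus UoP, Kaplan, etc.

# By institutions, MOOCs come out smaller than Instructure; by students,
# the combined homegrown band is roughly Instructure-sized.
print(f"By institutions: {mooc_institutions[0]}-{mooc_institutions[1]} "
      "US xMOOC adopters (smaller than Instructure)")
print(f"By students: ~{mooc_students:,} in xMOOCs; combined homegrown band "
      f"{homegrown_band[0]:,}-{homegrown_band[1]:,} (roughly Instructure-sized)")
```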
Bob Puffer says
I’d concede there are logistical problems all over with apprehending credible data for a study like this but I’d also contend the best sources have been accorded unfairly low weight.
I’m suggesting that DECISIONS to go with a particular LMS are more valid measures than numbers of students. Large schools are equally capable of making foolish or wise decisions as small schools. Therefore, the valid measure for impact in this arena should be the number of customers.
You’ve included Instructure amidst the open source alternatives. As is well documented by Apache, Linux (most flavors), Moodle and others, the true measure of impact and even viability for an open source product (even a hybrid like Canvas) is community engagement. I therefore believe activity in the open source forums for Instructure is a valid data point in determining their real impact on the LMS market.
As for extrapolation, in 2004FA when we began piloting Moodle they had 3,600 sites. In 2012FA they have 68k sites. Is that growth accurately represented in your chart?
MOOCs should not be on the chart IMO as their valid measurements are entirely different.
Kate Bowles says
While we’re all searching for the apples in the fruit bowl, I think it’s also worth remembering that this bowl has other fruit in it, and a whole lot of non-fruit items. I’m currently closely involved in an institutional transition from one LMS to another, and I’m interested in how even a simple institution like ours will typically be running both the main LMS and a range of other small-scale home-brand installations around the place (what I think of as Moodle-under-the-desk), while an increasing number of Faculty are using social options, especially WordPress, to replace what we would have expected an LMS to do for them.
The issue here is the systematic underdevelopment of social learning tools in most of the LMS products we know. It’s very hard to compete with Facebook for students’ social attention when using an LMS that looks and feels like a spreadsheet (there’s lots to love about Moodle, but right out of the box, it isn’t pretty). So Faculty are moving into what Bryan Alexander’s been calling the “post-LMS” world, and some vendors are looking into leveraging this with their “one professor at a time” strategies. Instructure have really walked the halls on this one.
I’m really curious to know how Phil will tackle this dispersal of options in a future graphic. It’s the heart of the problem of trying to measure LMS use by LMS institutional contracts.
Kate
Phil Hill says
Kate, I’m curious also and have no idea yet.
You make a very good point about the tools, and I definitely think we need to get there (understanding the dispersal of tools and the “post-LMS” world). I have referenced this before, but Patrick Masson and his team at UMassOnline tackled this problem directly – measuring adoption within their institutions (a multi-campus consortium) rather than seeking data from outside. In fact, they built this usage of disparate tools into their NIFTI framework for enterprise software support. More info at our joint webinar for WCET:
http://wcet.wiche.edu/connect/no-lms-rfp
To me, this might make sense: start with institutional data collection to understand tool adoption before looking for larger surveys. This approach might be more effective at identifying the right data to collect, with the surveys / data collection then moving to a cross-institution approach.
Bob Puffer says
I’d challenge whether social tools have a place inside, around or anywhere near an LMS, and perhaps even an institution of higher learning, unless they provide significant and credible peer feedback or integrated opportunities for collaboration. (Too bad Bryan’s not tuned into this discussion — he’d rail on me.) I know this is counter to today’s accepted wisdom, but somebody needs to explain to me the pedagogy inherent in social tools. Anecdotally, our K-12 adopted Edmodo. The kids loved it, the teachers loved it. All through the year there were reports of how well it worked, but at the end of the year when rational people sat down to discern and analyze what had been gained, they had a very hard time coming up with how it contributed to learning in any real way.
Phil Hill says
I understand the need to have healthy skepticism on social tools (or any new tool, for that matter), but two points:
– Social doesn’t just mean social streams and Facebook duplication; there are social aspects to getting students to write and discuss in a social environment (like we’re doing here); and
– While there are questions about the effectiveness and place for social tools, I think the jury is out and there is plenty of room for faculty experimentation and adjustments.
josh coates says
phil, michael, thanks for your work on attempting to graphically define this complicated market based on campus computing data. we all know that this is based on imperfect information, and it’s a best effort estimation of what the market trend looks like. thanks for your efforts.
bob, moodle is a difficult LMS to factor into this kind of report. as you know, even though moodle is “installed under the desk” in thousands of places, it’s the dedicated LMS of only a fraction of that many institutions, which is what this graphic is trying to measure.
i agree with you that MOOCs aren’t the same thing as an LMS and should be accounted for differently.
and for the record, as of today, 20 months after launch, instructure has just over 250 educational institutions under contract, and 160 employees. 😉
Kate Bowles says
On the social side, Rey Junco’s studies of the impact of educationally relevant social media use (FB and Twitter) on grades and engagement are pretty sturdy. Students using public social networks to participate in or extend class activities are simply working in more dynamic and graphically interesting environments than those supplied in their campus LMS, but their function is the same — and more or less the same as discussion in a face-to-face class. This is why both MOOCs and LMS design include forums as standard, even if their design and use is clunky. But it’s the clunky design of these social tools that’s driving Faculty experimentation, I think, and we’re now seeing much more fully featured use of social tools or networks (Ning being a good example) in the all-but-grading model of substitute LMS. The real problem here is that if you’re learning in one system but grading in another — a really common pattern of Faculty practice — where do you park your analytics plug-in, now that analytics are the new black?
Bob Puffer says
I doubt the ability of today’s professor to adequately rein in the use of social media for educational purposes only, so I guess I disagree with the conclusions you’ve drawn from Junco’s research. I see the introduction of social media into the classroom as legitimizing a behavior that Junco clearly concludes negatively impacts grades (hoping we can assume grades somehow correlate to outcomes).
And tag me as skeptical when I’m knee-deep in an environment that seems to proclaim, “as long as it has the term ‘social’ or ‘analytics’ or ‘sustainable’ wrapped around it, you are guaranteed success”.
Phil Hill says
FYI, I just placed a comment thread in ‘pending’ status, based on e-Literate’s editorial policy. The intent for comments is to stay on topic, and we were getting into a ‘which system or community is better’ discussion, when this topic is about market adoption.
As a reminder: “Comments are invited but should be topical and civil. If your comment is judged inappropriate or offensive, it will be deleted. Commenters who are suspected of using fake names to preserve anonymity will be held to a significantly higher standard than those who use their real names. Comments made using fake email addresses will be deleted regardless of content, without exception.”
Bob Puffer says
Phil, is there someplace we can view this “pending” comment thread? I moderate a couple of forums and we’ve recently dealt with similar situations so I’m wondering what type of posting crosses your lines. Maybe it would be enough to see your editorial policies. Thanks.
Michael Feldstein says
The editorial policies are, conveniently, reachable through the top-level menu page on this blog entitled “Editorial Policies.”
cullenhoover says
In short, this says that everyone is improving their strategies and techniques, and I think this should be done to provide better flexibility to users. As education is changing, the LMS should reach higher levels too.