This article is cross-posted to the WCET blog.
After billions of dollars spent on administrative computer systems and billions of dollars invested in ed tech companies, the U.S. higher education system is woefully out of date and unable to cope with major education trends such as online & hybrid education, flexible terms, and the expansion of continuing and extended education. Based on an investigation of the recently released distance education data for IPEDS, the primary national education database maintained by the National Center for Education Statistics (NCES), we have found significant confusion over basic definitions of terms, manual gathering of data outside of the computer systems designed to collect data, and, due to confusion over which students to include in IPEDS data, the systematic non-reporting of large numbers of degree-seeking students.
In Fall 2012, the IPEDS (Integrated Postsecondary Education Data System) data collection for the first time included distance education – primarily for online courses and programs. This data is important for policy makers and institutional enrollment management as well as for the companies serving the higher education market.
We first noticed the discrepancies based on feedback from analyses that we have both published on the e-Literate and WCET blogs. One of the most troubling calls came from a state university representative who said that the school has never reported any students who took credit-bearing courses through its self-supported continuing education program. Since the school did not include these enrollments in its reporting to the state, it did not report them to IPEDS. These were credits toward degrees and certificate programs offered by the university and therefore should have been included in IPEDS reporting, based on the following instructions:
Include all students enrolled for credit (courses or programs that can be applied towards the requirements for a postsecondary degree, diploma, certificate, or other formal award), regardless of whether or not they are seeking a degree or certificate.
Unfortunately, the instructions call out this confusing exclusion (one example out of four):
Exclude students who are not enrolled for credit. For example, exclude: Students enrolled exclusively in Continuing Education Units (CEUs).
How many schools have interpreted this continuing education exclusion to apply to all continuing education enrollments? As an initial check, we contacted several campuses in the California State University system and were told that all IPEDS reporting is handled at the system level. With the introduction of the Fall 2012 distance education changes, Cal State re-evaluated whether to change its reporting policy. A system spokesman explained:
I’ve spoken with our analytic studies staff and they’ve indicated that the standard practice for data reporting has been to share only data for state-supported enrollments. We have not been asked by IPEDS to do otherwise so when we report distance learning data next spring, we plan on once again sharing only state-supported students.
Within the Cal State system, this means that more than 50,000 students taking for-credit, self-support courses will not be reported; in fact, this student group has never been reported.
One reason for the confusion, as well as for the significance of this change, is that continuing education units have moved past their roots of offering CEUs and non-credit courses for the general public (hence the name continuing education) and taken up a new role of offering courses not funded by the state (hence self-support). Since these courses and programs are not state funded, they are not subject to the same oversight and restrictions as their state-funded equivalents, such as caps on tuition per credit hour.
This situation allows continuing education units at public schools to become laboratories and innovators in online education. The flip side is that, given the non-state-funded nature of these courses and programs, schools appear not to be reporting these for-credit enrollments to IPEDS, whether or not the students were in online courses. However, the new distance education questions may actually trigger changes in reporting practices.
Do Other Colleges Also Omit Students from Their IPEDS Report?
Given what we learned from the California State University System, we wanted to find out whether other colleges were having similar problems reporting distance education enrollments to IPEDS. WCET conducted a non-scientific canvassing of colleges to get their feedback on what problems they may have encountered. Twenty-one institutions were selected through a non-scientific process of identifying colleges that reported enrollment figures that seemed incongruous with their size or distance education operations. See "Appendix: Methodology" for more details.
From early August to mid-September, we sought answers regarding whether the colleges reported all for-credit distance education and online enrollments for Fall 2012. If they did not, we asked about the size of the undercount and why some enrollments were not reported.
Typically, the response included some back-and-forth between the institutional research and distance education units at each college. Through these conversations, we quickly realized that we should have asked a question about the U.S. Department of Education’s definition of “distance education.” Institutions were very unclear about what activities to include or exclude in their counts. Some used local definitions that varied from the federal expectations. As a result, we asked that question as often as we could.
The Responses
Twenty institutions provided usable responses. We agreed to keep responses confidential. Table 1 provides a very high-level summary of the responses to the following two questions:
- Counts Correct? – Do the IPEDS data reported include all for-credit distance education and online enrollments for Fall 2012?
- Problem with “Distance Education” Definition? – Although we did not specifically ask this question, several people volunteered that they had trouble applying the IPEDS definition.
Table 1: Counts for Institutional Responses
| Response | Counts Correct? | Problem with "Distance Education" Definition? |
| --- | --- | --- |
| Yes | 11 | 3 |
| Maybe | 5 | 5 |
| No | 4 | 12 |
Of those that assured us that they submitted correct distance education counts, some also reported having used their own definitions or processes for distance education. This would make their reported counts incomparable to the vast majority of other reporting institutions.

One institution declined to respond. Given that its website advertises many hundreds of online courses, the distance education counts it reported would lead us to believe that it either: a) under-reported, or b) averages one or two students per online class. The second scenario seems unlikely.
Findings
This analysis found several issues that call into question the usability of IPEDS distance education enrollment counts and, more broadly and more disturbingly, IPEDS statistics in general.
There is a large undercount of distance education students
While only a few institutions reported an undercount, one was from the California State University System and another from a large university system in another populous state. Since the same procedures were used within each system, there are a few hundred thousand students who were not counted in just those two systems.
In California, students enrolled through Continuing Education (self-support) units have never been reported to IPEDS. A source of the problem may be in the survey instructions. Respondents are asked to exclude: “Students enrolled exclusively in Continuing Education Units (CEUs).” The intent of this statement is to exclude those taking only non-credit courses. It is conceivable that some might misinterpret it to mean excluding everyone enrolled through the campus's continuing education division. What was supposed to be reported was the number of students taking for-credit courses, regardless of which college or institutional unit was responsible for offering the course.
In the other large system, out-of-state students are not reported because their enrollments do not receive funding from state coffers.
It is unclear how large the undercount would be if we knew the actual numbers across all institutions. Given that the total number of “students enrolled exclusively in distance education courses” for Fall 2012 was 2,653,426, an undercount of a hundred thousand students from just these two systems would be roughly a 4% error. That percentage is attention-getting on its own.
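For reference, the arithmetic behind that rough 4% figure, treating the round 100,000 figure as illustrative:

\[ \frac{100{,}000}{2{,}653{,}426} \approx 0.038 \approx 4\% \]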
The IPEDS methodology does not work for innovative programs…and this will only get worse
Because its courses have as many as 28 start dates, one institutional respondent estimated an undercount of approximately 40% in its reported enrollments. A student completing a full complement of courses over a 15-week period might not be enrolled in all of those courses on the census date. With the increased use of competency-based programs, adaptive learning, and innovations still on the drawing board, it is conceivable that the census dates used by an institution (IPEDS gives some options) might not serve every type of educational offering.
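As an illustration of the mechanism only (the dates, course length, and enrollment pattern below are hypothetical, not taken from any respondent), a single census snapshot sees just the courses active on that date:

```python
# Illustrative sketch: hypothetical term with rolling course start dates.
# A census-date snapshot counts only the enrollments active on that one day.
from datetime import date, timedelta

TERM_START = date(2012, 8, 27)                    # assumed 15-week fall term start
CENSUS_DATE = TERM_START + timedelta(weeks=3)     # assumed early-term census date
COURSE_LENGTH = timedelta(weeks=5)                # assumed 5-week accelerated courses

# One hypothetical student taking three 5-week courses back to back.
enrollments = [
    {"course": "A", "start": TERM_START},
    {"course": "B", "start": TERM_START + timedelta(weeks=5)},
    {"course": "C", "start": TERM_START + timedelta(weeks=10)},
]

active_at_census = [
    e for e in enrollments
    if e["start"] <= CENSUS_DATE < e["start"] + COURSE_LENGTH
]

print(f"Enrollments over the full term:     {len(enrollments)}")
print(f"Enrollments visible on census date: {len(active_at_census)}")
# Output: 3 total enrollments, but only 1 visible on the census date.
```

In this toy case the snapshot captures one of three enrollments; how large the undercount is at any real institution depends on its particular mix of start dates, course lengths, and census-date options.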
The definition of ‘distance education’ is causing confusion
It is impossible to get an accurate count of anything if there is not a clear understanding of what should or should not be included in the count. The definition of a “distance education course” from the IPEDS Glossary is:
A course in which the instructional content is delivered exclusively via distance education. Requirements for coming to campus for orientation, testing, or academic support services do not exclude a course from being classified as distance education.
Even with that definition, colleges faced problems counting ‘blended’ or ‘hybrid’ courses. What percentage of a course needs to be offered at a distance for it to be counted in the federal report? Some colleges had their own standard (or one prescribed by the state), and the percentage required to label a course “distance education” varied greatly. One reported that it included all courses with more than 50% of the content offered at a distance.
To clarify the federal definition, one college said it called the IPEDS help desk. Even after escalating the issue to a second-level manager, it was still unclear on exactly how to apply the definition.
The Online Learning Consortium is updating its distance education definitions. Its current work could inform IPEDS on possible definitions, but it probably contains too many categories for such widespread data gathering.
There is a large overcount of distance education students
Because many colleges used their own definition, there is a massive overcount of distance education, at least relative to the current IPEDS definition. This raises the question: is the near-100% standard imposed by that definition useful in interpreting activity in this mode of instruction? Is it the correct standard, given that no one else seems to use it?
Addressing the anomalies makes IPEDS reporting burdensome, or the problems are ignored
In decentralized institutions, or in institutions with “self-support” units that operate independently from the rest of campus, data systems are often not connected. These institutions also face reconciling differing “distance education” definitions at the same time. One option for institutional researchers is to knit together numbers from incompatible data systems and differing definitions, often by hand. To their credit, institutional researchers overcome many such obstacles. Whether through misunderstanding the requirements or lacking the capacity to perform the work, some colleges did not tackle this burdensome task.
Conclusions – We Don’t Know
While these analyses have shed light on the subject, the biggest finding is that we do not know what we do not know, which brings to mind former Secretary of Defense Donald Rumsfeld’s famous rambling:
There are known knowns. These are things we know that we know. We also know there are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are ones we don’t know we don’t know.
The net effect is not known
Some institutions reported accurately, some overcounted, some undercounted, some did both at the same time. What should the actual count be?
We don’t know.
The 2012 numbers are not a credible baseline
The distance education field looked forward to the 2012 Fall Enrollment statistics, with their new distance education numbers, as a welcome baseline for the size and growth of this mode of instruction. That is not possible, and the problems will persist with the 2013 Fall Enrollment report when those numbers are released. These problems can be fixed, but it will take work. When can we get a credible baseline?
We don’t know.
A large number of students have not been included on ANY IPEDS survey, EVER.
A bigger issue for the U.S. Department of Education goes well beyond the narrowly focused issue of distance education enrollments. Our findings indicate that there are hundreds of thousands of students who have never been reported on any IPEDS survey ever conducted. What is the impact on IPEDS? What is the impact on the states where large numbers of students were systematically underreported?
We don’t know.
Who is at fault?
Everybody and nobody. IPEDS is faced with institutional practices that vary greatly and often change from year to year as innovations are introduced. Institutional researchers are faced with reporting requirements that vary by audience: state oversight agencies, IPEDS, accrediting agencies, external surveys and ranking services, and internal pressures from marketing and public relations staffs. They do the best they can in a difficult situation. Meanwhile, we are in an environment in which innovations may no longer fit into classic definitional measurement boxes.
What to expect?
In the end, this expansion of NCES data through the IPEDS database is, in our opinion, a worthwhile effort, and it should lead to greater use of real data to support policy and market decisions. However, we recommend the following:
- The data from the Fall 2012 and Fall 2013 reporting periods will reflect significant changes in methodology at participating institutions. Assuming that definitions improve over time, reporting methodology will continue to change at least through Fall 2015. We therefore recommend that analysts and policy-makers not put too much credence in year-over-year changes for the first two or three years.
- The most immediate improvement available is for NCES to clarify and gain broader consensus on the distance education definitions. This process should include working with accrediting agencies, whose own definitions influence school reporting, as well as leading colleges and universities with extensive online experience.
Appendix: Methodology
The Process for Selecting Institutions to Survey
The selection process for institutions to survey was neither random nor scientific. We undertook a multi-step process to identify institutions that might have had problems reporting distance education enrollments, with the goal of identifying twenty institutions to canvass. The steps included:
- A first cut was created by an “eyeball” analysis of the Fall 2012 IPEDS Fall Enrollment database to identify institutions that may have had problems in responding to the distance education enrollment question.
- Colleges were included if they reported distance education enrollments that did not appear to be in line with the size of the institution (e.g., a large institution with very low distance education enrollments) or with what we knew about their distance education operations.
- Special attention was paid to land grant colleges as they are likely to have self-funded continuing or distance education units.
- Institutions in the California State University system were excluded.
- This resulted in a list of a little more than 100 institutions.
- The second cut was based upon:
- Including colleges across different regions of the country.
- Including a private college and an HBCU as indicators of whether this problem might be found in colleges from those institutional categories.
- Twenty institutions were identified.
- In side discussions, a distance education leader at a public university agreed to participate in the survey. This brought the total to twenty-one institutions.
Questions Asked in the Survey
- Do the IPEDS data reported include all for-credit distance education and online enrollments for Fall 2012?
- If the IPEDS data reported does not include all for-credit distance education and online enrollments for Fall 2012, approximately how many enrollments are under-counted?
- If the IPEDS data reported does not include all for-credit distance education and online enrollments for Fall 2012, why did you not report some enrollments?