Turning Around International Comparative Indicators

We have a habit of taking international comparisons of various aspects of higher education that are produced in—to put it gently—dubious ways, and delighting in our terrible and/or falling position. It’s time to cease and desist this self-flagellatory habit. Even rhetorically, as a goad to improve, the statements have been uttered so often that they have lost all meaning and effectiveness. They are simply part of an empty liturgy.
It’s time, instead, to ask some serious questions about the types of indicators being produced and cited, the quality and presentation of these indicators, the taxonomies and aggregations on which they are based, their inclusions and exclusions, and whether other indicators could guide us better.

It’s time, too, to put the stage lights on the critical background tapestry of demography, since comparing the progress of national systems absent trends in population is like flying blind.

We fly that way all the time.

So let’s start with demography—and your fourth-grade mathematics. When countries are compared in terms of participation in higher education, degree completion, and proportions of students following paths of science, technology, engineering and/or mathematics, the issue of supply—current and future—is critical to the interpretation of what we see. In case you haven’t noticed, by 2025, Japan is on track to lose a quarter of its youth population, Korea 20%, Russia a third, Poland close to 40%. The Japanese have already merged 11 universities in anticipation, the Russians have threatened regional universities with the same fate, and the Koreans are in line to close more than 100 postsecondary institutions.

Now for your fourth-grade math: What happens to a fraction, hence a percentage, when the denominator falls dramatically and the numerator declines at a more modest rate (oh, you need a demographic lesson here: the numerators of specific behaviors in a population always lag trends in the denominator)? The proportion of Japanese, Korean, Russian, etc. youth participating in higher education will rise to stratospheric levels without any system reforms whatsoever.
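A toy calculation makes the arithmetic concrete. The figures below are invented for illustration; they are not actual population or enrollment data for any country:

```python
# Hypothetical figures only: a shrinking denominator with a slower-shrinking
# numerator pushes the participation rate up, no reform required.

population_2010 = 1_000_000            # youth population (invented)
enrolled_2010 = 500_000                # enrolled in higher education (invented)

population_2025 = population_2010 * (1 - 1/3)   # population falls by a third
enrolled_2025 = enrolled_2010 * (1 - 0.15)      # enrollment lags: falls only 15%

print(f"2010 participation rate: {enrolled_2010 / population_2010:.1%}")  # 50.0%
print(f"2025 participation rate: {enrolled_2025 / population_2025:.1%}")  # ~63.7%
```

Nothing about the system changed; the rate jumped nearly 14 points because the denominator collapsed.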

Note that, among the 31 OECD countries, only four show fertility rates at or greater than replacement and net migration greater than 4%. The U.S. is one of them, and it also shows the fourth-highest projected growth in the 25- to 34-year-old population to 2025 (and more than 75% of that growth will be Latino and Asian). Now, fourth-grade math again: What happens to a fraction, hence a percentage, when the denominator rises faster than the numerator? The proportion of our youth population participating in higher education will decline without any system reforms whatsoever.
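The same toy arithmetic, run in the other direction with equally invented figures, shows the rate falling:

```python
# Hypothetical figures only: the denominator grows faster than the numerator,
# so the participation rate falls, again with no change in the system itself.

population_2010 = 1_000_000            # youth population (invented)
enrolled_2010 = 500_000                # enrolled in higher education (invented)

population_2025 = population_2010 * 1.20   # population grows 20%
enrolled_2025 = enrolled_2010 * 1.08       # enrollment grows only 8%

print(f"2010 participation rate: {enrolled_2010 / population_2010:.1%}")  # 50.0%
print(f"2025 participation rate: {enrolled_2025 / population_2025:.1%}")  # 45.0%
```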

Bottom line: In terms of population ratio indicators of participation in higher education, the U.S. is going to look worse in comparative data by 2025, no matter what we do. It’s all in the pipeline now. Comparative graduation rates will follow suit—and for the same reasons.

But let’s turn to what are known in international circles as “cohort survival rates.” We get these from OECD’s Education at a Glance in one of the most statistically outrageous tables (A4.1) you will ever read in a publication from a putatively respectable organization. What one sees here are what we would call “graduation rates” for 24 countries, and of course the U.S. looks terrible at 56%, compared to Finland at 72%, for example. What OECD tells you only in an online “Annex” that nobody reads is that the U.S. rate is the only one of the 24 that is confined to graduating from the first institution of attendance, and that if a system graduation rate were used, ours would be 63%. OECD then tries to minimize this better news in the footnote by judging the starting date for our system completion data of 1995-96 to be “older,” even though the same starting years are OK for Denmark and Sweden in table A4.1. If I judged OECD’s presentation of U.S. data to be purposefully prejudicial, I would be kind.

Now, there are a lot of other problems with this table, but one leaps off the page right away: Nowhere does OECD tell the reader for how many years the student cohorts were tracked. It turns out that that wonderful 72% graduation rate for Finland is based on a 10-year tracking; the Netherlands’ 65% is based on seven years, as is France’s 64%. Our true matching percent of 63% is based on six years. Ask yourself a simple question: Are there any real differences in this beauty contest?
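To see how mechanically the tracking window drives these numbers, consider a toy completion model. Every parameter here is invented; nothing comes from OECD data:

```python
# Toy completion model, invented parameters: 40% of a cohort finishes by
# year 4, and 15% of the remaining non-completers finish each year after.
# The same system yields very different "graduation rates" depending on
# how long the cohort is tracked.

completed = 0.40                 # cumulative completion at year 4 (invented)
rates = {4: completed}
for year in range(5, 11):
    completed += 0.15 * (1 - completed)   # 15% of the rest finish this year
    rates[year] = completed

for year in (6, 7, 10):
    print(f"{year}-year tracking: {rates[year]:.0%}")
# 6-year tracking: 57%, 7-year tracking: 63%, 10-year tracking: 77%
```

Under these invented assumptions, one and the same system posts a 57% rate on a six-year window and a 77% rate on a ten-year one. Much of the spread in table A4.1 could be nothing more than the clock.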

While the answer is a no-brainer, the more basic critique is that, by excluding the temporal reference point for “survival rate,” this table is unacceptable for publication under anyone’s statistical standards! Of course, we don’t care because we love to be told how badly we do compared with others. I have never seen this table challenged in either public statements or the academic literature in the U.S.

Are there other, and potentially more enlightening, comparative indicators addressing common challenges in higher education among advanced, postindustrial democracies? May I suggest two easy ones: inclusion and system flexibility. What do they mean?

We have different terms for “inclusion,” but it translates as participation in higher education by all segments of a population, if not in direct proportion to their share of secondary school graduates, then as close as you can get. What populations? Not merely by race/ethnicity or family income (the reflex ways we deal with the question), but also by isolated rural residence, by combinations of community economy and housing stock, by disability, by parents’ highest level of education. All of these terms of analysis come from other countries, and it’s about time we tried some of them out. Geodemographic analysis will take us a long way toward targeting populations: We will at least know where to drive our car when we go out to address the challenges. Inclusion is a policy objective of just about every OECD country.

System flexibility is related to inclusion because it translates into metrics of flow-volume: the movement of students into and through higher education by nontraditional means and paths. Some of these are obvious; some less so: part-time status, online delivery, assessment of prior experiential learning, re-entry bridge programs for adults, better use of what other countries call “short-cycle” degrees (our associate degrees) as system pivot points. Comparative indicators on this policy playing field will encourage the sharing of strategies and interventions. That, too, will be a lot more productive than beauty contests.

_______________________________________________________________________

Cliff Adelman is a senior associate with the Institute for Higher Education Policy (IHEP) who served nearly 30 years as a senior research analyst at the U.S. Department of Education. Adelman’s monograph, “The Spaces Between Numbers: Getting International Data on Higher Education Straight,” can be found in both short and extended versions at www.ihep.org/research/GlobalPerformance.cfm

