Discussions of the problematic future of higher education were already an exploding industry before COVID-19, producing more to be read than anyone could possibly keep up with. Their main audience was academic administrators and a few faculty, worrying where their institutions and careers were headed, and wanting guidance in strategic decision-making—helping to identify not only where they actually were and were going, but also where they might want to go. Experiments were everywhere, momentous decisions were being made, and there were no signs of any problem-solving consensus.
Into that pre-coronavirus maelstrom came Bryan Alexander’s Academia Next: The Futures of Higher Education (Johns Hopkins UP, 2020). Alexander, whose doctorate is in English literature, took care to detail his qualifications and previous experience in futurist studies, and is described on the flyleaf as “an internationally known futurist, researcher, writer, speaker, consultant, and teacher,” currently senior scholar [adjunct] at Georgetown University, founder of the online “Future of Education Observatory,” and author of The New Digital Storytelling: Creating Narratives with New Media and Gearing Up for Learning Beyond K-12.
We note the plural “Futures,” which is commendable because Alexander addresses the wide variety of institutions, from major research universities and state university systems to community colleges and the full range of private liberal arts colleges, each group with its own distinctive future. Alexander’s stated preference for the word “forecast,” as with weather, over “prediction,” as with science, is also appropriate. The method and structure of his book are presented as conventional futurism: to identify “trends,” from them to artfully project multiple “scenarios,” and from those to draw conclusions. This is clearly not science—but more about methodology to follow.
The strongest part of the book is the first, which exhaustively details “trends,” or more accurately “innovations,” for whether they are actually historical “trends” is not critically addressed. Moreover, nothing is said about the central issue of scholarship itself, widely recognized as a major problem—for example, the obsolescence of traditional (mostly 19th century) multiversity academic disciplines in this century, and the innumerable searches for new strategies and structures. The temporal range of Alexander’s forecasting vision is short: 10 to 15 years; even so, the imagined “Scenarios” section suffers from rhetorical excess and a lack of carefully analyzed pathways telling us how the innovations might become “trends,” and how those might become “scenarios.” The weakest part is the last, which purports to offer conclusions but fails to connect today to tomorrow, or to reach any very helpful conclusions at all.
Subverting all this, however, are two fundamental flaws, which the book shares with conventional futurist methodology: first, its tacit assumption that historical change is a consistently evolutionary process; and second, the lack of a precise understanding of historical causation.
Futurist studies arose as a field in the last half of the 20th century, a relatively stable postwar historical period. Alexander’s assumptions reflect this: “In general, the future never wholly eradicates the past. Instead, the two intertwine and influence each other.” This approach is less well suited, and sometimes not suited at all, to periods of revolutionary change, especially if that is widespread and accelerating, as it is today.
Stable periods of history, whether in particular fields (e.g., sciences, technologies, business, scholarship, higher education) or in general, derive their order from paradigms—that is, established models governing mature fields of activity. Revolutionary change occurs when a paradigm is overthrown or replaced by unordained means, producing an alternative, incompatible one—in politics, for example, by an unconstitutional change in the constitution of a polity. This distinctive kind of historical change—“paradigm shift”—usually concerns only individual fields, but in the 21st century we happen to be living in a highly exceptional entire period of paradigm shifts, powered by the revolution in information technology (IT)—computers and the internet. In higher education, paradigms were shifting even before the pandemic, already invalidating forecasts.
Periods of paradigm shift
Periods of paradigm shift are rare—by my count only four in 2,500 years of Western history. The first was the rise of Classical Western civilization itself, extending roughly from Periclean Athens to the fall of Rome—about 1,000 years. The second was the rise of medieval Christian civilization extending from there to the Renaissance and Reformation—another 1,000 years. The third was the “early-modern” period from the Renaissance to the Enlightenment (also incidentally driven by an IT revolution—Gutenberg’s), including the scientific revolution, global discoveries, the emergence of nation-states and secularization—about 300 years, codified by the familiar 19th century formulation that Western history had three main periods: ancient, medieval and modern.
Today, however, we are entering a fourth great period—signaled by the ubiquity of paradigm shifts and the fundamental issues they are raising, for example, with AI, robots and what it is to be human. The character of our new age is not yet defined, as it is still taking shape, but it may become relatively established in only decades, owing to the vastly increased and accelerating power of technology. In short, even before the pandemic, higher education as an emphatically information-intensive field was undergoing its own IT-revolutionary paradigm shifts, amid other paradigm shifts all around it. For such a period, conventional futurist methodology and forecasting are not well suited; Alexander’s book is unaware of all this.
Causation: how it works
A second fundamental flaw is revealed by the book’s tendency to skip over transitional processes—how innovations become trends, trends yield scenarios, and scenarios reach conclusions. We are not told how these happen, or how they work as historical bridges. Nor are the transitions informed by any disciplined understanding of causation, both as a phenomenon and as an instrument of influence or management. The lack of thought about causation is understandable because it is common even among historians, who tend to be more empirical than theoretical because history is so complex. Nonetheless, a deeper and more precise understanding may clarify this discussion.
Consider: Everything and everybody in the world is an element in history—participating in events and developments, which are what historians study. Each is defined by a limited range of possible roles or activities, to which it is inclined to be conducive, exerting influence. Chairs, tables, boats, tools, chickens, etc., are known by us according to what they are and do, both actually and potentially. They both exist and are potentially conducive to qualifying or influencing the circumstances of the world around them.
Combinations of historical elements therefore also have limited ranges of mutual cooperation—where their respective potentials and influences overlap, and to which they are mutually conducive. Mutual influences—alliances, collaborations, cooperations—are generally more powerful than individual influence. People and institutions are more powerful together than apart. A chair and table in the same room with a person are more likely to be used together than separately or not at all.
Therefore, when combinations occur in time and place, the probabilities that their mutual influences will actually happen increase, other things being equal. This is significant for leadership and management, because it means that by intentionally combining elements—“piling up the conducives”—we can increase our influence on events, promoting and helping to cause certain intended results to happen.
Causation in history may therefore be defined as the “coincidence of conducive conditions,” which produces the result studied or sought.
There are several fairly obvious caveats, however: a) elements and combinations vary in power and potential; and b) elements and combinations thereof can be partially or totally opposed to each other as well as mutually reinforcing. History and its study are extremely complex.
Therefore every historical event or development results from complex combinations of influential factors—causes, qualifiers and impediments. Historians identify and describe the activities and influences of various factors in order to illuminate and explain how events and developments happened. Planners, strategists and managers can likewise identify and use the relevant factors, to make desired events happen, to produce desired results—piling up the conducives and qualifiers, and eliminating, neutralizing, or avoiding the impediments, while ignoring the immaterial. Current events in our country and in higher education offer rich examples for this.
In the midst of one or more paradigm shifts, strategic and tactical planning are further complicated by the fact that the normal processes of change are themselves being violated—avoided, transformed and superseded. Thomas Kuhn, who coined the term “paradigm shift” with reference to the Copernican Revolution in science, believed that the results of such shifts are impossible to predict until late in the process—often too late for management. We should also acknowledge that the complexity of history has not yet been reduced to systematic scientific understanding; the study and understanding of history is still more an art than a science.
Higher education in crisis
But now let us consider the already deeply problematic crisis of early 21st century higher education, into which came coronavirus—a universal disrupter par excellence, leaving no institution or custom unchanged, imposing radical doubts about the future, and in particular forcing re-inventions of traditional practices under new and still unsettled constraints.
There is a key difference between the pre-corona paradigm shifts and those imposed by COVID-19: Whereas the former are reconstructive, driven by the overwhelming power of the IT revolution in every information-intensive field, COVID-19 is an entirely destructive phenomenon, offering no constructive alternative to its victims. What happens when two transformative “conducives”—one constructive, one destructive—collide, especially in an age of paradigm shifts?
So far, the combined effects have been mixed—containing both constructive and destructive parts, as the two forces increasingly coincide. Certainly the rapid and forceful push of often-recalcitrant faculty into socially distanced online instruction is an acceleration of a clearly developing trend under the new IT; but as its effects ramify throughout the problematic business models, residential systems, admissions processes, courses, curricula and even architecture of diverse colleges and universities, academic administrators have no reliable idea yet what or how viable new institutions might rise from the rubble.
Education vs. training
We need to be clearer than we have been about what values and issues are at stake. Not so long ago, back in the day when I was a student, we had a clear distinction between “education” and “training.” The former referred to the ancient tradition of liberal education, whose focus was self-development, for human fulfillment. Training, by contrast, was the development of technical knowledge and skills, with a focus on professional employment. “Higher education” came after school education, to prepare students for who they would become as human beings in later life; training prepared students for what they would become professionally in jobs and careers—what occupational and societal roles they would play. Undergraduate years were to be devoted to “higher education” and postgraduate studies to focus on professional technical training—law, medicine, architecture, business, research, teaching, etc.
That paradigmatic distinction and practice has obviously been blurred since then by commercialization. Soaring tuition costs and student loan indebtedness, tied ever more closely to preparation for future jobs and problematic careers in an increasingly “gig” economy, have forced the flow of student enrollments and funding away from liberal education and the humanities toward more immediately practical and materialistic courses, disciplines, curricula and faculty jobs. This has led students and their parents to see themselves as retail consumers, calculating cost-effectiveness and monetary return-on-investment in the training marketplace. Terminology has followed, so that gradually “higher education” and “training” have become virtually synonymous, with training dominant.
The forced mass movement to online learning and teaching involves radically different participation, financing and business models. It is increasingly clear that their concurrence and connection with artificial intelligence, big data and the gig economy—and with course offerings often segmented for practical convenience—has been building an extremely powerful “coincidence of conducives” that might complete the transit from education to training that has been going on for the last half-century. If so, this could spell for all practical purposes an end to higher education for all but a few very wealthy institutions, administering the coup de grace to the moribund traditions of higher education.
This paradigm shift has operated to the detriment of both education and training, but more dangerously for education. Recent surveys have shown that from 2013 to 2019, the portion of adults regarding college education as “very important” declined from 70% to 51%; a majority of younger adults ages 18 to 29 now consider getting a job to be the primary purpose of earning a college degree, and they, who are purportedly its beneficiaries, are also the most likely to question its value. Moreover, because online instruction is better suited to training than to education, institutions of higher education face stiff competition in credentialing for jobs from specialized for-profit corporations and from employers themselves—in effect shoving colleges and universities aside, rendering their dominance in the crucial years of early adult maturation superfluous and obsolete.
In short, the “coincidence of conducive conditions” for the demise of what used to be called “higher education” is now actively in place, and with the power of the pandemic behind it, the timing is ripe. Reversal is now impossible. We need to ask whether survival is still possible, and if so, how to cause it—how to identify and mobilize sufficient counter-conducives and qualifiers at least to avoid destruction and to achieve some sort of synthesis of both training and education.
The range of possibilities and probabilities is huge, far wider than can be summarized here. But one strategic possibility might be opportunistically to take advantage of the universal disruptive flux as opening up previously foreclosed possibilities—specifically, to reinstitute the traditional distinction between training and education and to combine both at the college level in courses and curricula. The value of the traditional definitions is that they constitute an inherently complementary and mutually reinforcing pair—developing both who and what students will necessarily become for the rest of their lives. How to combine them will be an unavoidable faculty responsibility, empowered and reinforced by administrative reforms in financial and business models. The result would constitute a radical re-invention of colleges and universities, featuring a rebirth, at long last, of humanistic higher education.
George McCully is a historian, former professor and faculty dean at higher education institutions in the Northeast, then professional philanthropist and founder and CEO of the Catalogue for Philanthropy.