Academic Disciplines: Synthesis or Demise?

By George McCully


Current anxiety over the values and directions of what we used to call “higher education” has rich and complex roots in the past, as well as problematic branches into the future. A crucial and core aspect of the subject not yet adequately understood is the structure and strategy of scholarship itself, and its future.

Forty-five years ago, in the heyday of “multiversities” lauded in books by presidents Clark Kerr (UC Berkeley) and James Perkins (Cornell), I wrote an article for the Journal of Higher Education entitled “Multiversity and University.” It contrasted the two models of scholarship and contended that, whereas the multiversity’s academic disciplines are each internally rigorous as scholarship, the multiversity taken as a putative whole had never been defended as scholarship and could not be so defended, because it is not scholarship. The disciplines arose and came together by historical accidents, not by intentional, systematic, scholarly or philosophical design.

They arose in the early modern period of Western history—the 15th to 18th centuries, with the Renaissance, Reformation, Scientific Revolution, Absolutism and Enlightenment—arguably the first “Age of Paradigm Shifts” in every field, significantly driven by Gutenberg’s IT revolution in printing. Each of the various modern disciplines created its own vocabulary and conceptualization, which were based on analyses of contemporary events and developments, and—this is crucial—were exclusively specialized. Scholarship is always necessarily specialized—it examines the world in detail. What is distinctively modern about the multiversity is that its specializations exclude other subjects—studying each one (e.g., economics, politics, astronomy) separately, to the exclusion of others, in various languages that are mutually incompatible and incommensurable. Collectively, modern academic disciplines imply that scholarship at its highest levels describes the world as if it were fragmented, in separate silos. This structure and strategy of knowledge, inquiry and education played a leading role in producing modern secular Western civilization.

Its long-term effects have been profound. Exclusive specialization was originally intended only to separate each field from religion in a period of religious wars. The cumulative effect—coincidental and inadvertent—was that the disciplines also excluded each other, eroding our sense of reality as a coherent whole, which it actually is. This also gradually undermined authentic liberal education, which seeks self-development in wholeness of life. In the multiversity, “higher education”—advanced self-development—has devolved, as we see today, into advanced technical training—information and skills development. As such, it leads to lives fragmented accordingly—even divided against themselves. Translated into public policy in the real world, the disciplines’ exclusions feed back as problems—in the early ’70s Journal of Higher Education article, the prime examples were our failures in Vietnam and the deepening ecological crisis caused by technology ignoring ecology. In sum, the flaws of fragmented scholarship have made us prone to problems at strategic levels in modern culture—in knowledge, education, public policy, and personal values—owing to the unattended gaps among the disciplines.

Needless to say, however, the article’s fundamental critique raised no noticeable dust. Basically no one cared—in part, no doubt, because they had been trained not to care about the whole. But because the assertions were true, it should not be surprising that today we are compelled to return to the subject by a new set of historical circumstances and trends in this second Age of Paradigm Shifts, also propelled by an IT revolution—this time of computers and the internet.

It may help to recap the history of how and why the tradition of university or universal learning was superseded. Basically, medieval civilization broke apart as printing enabled the flood of new information in all fields to be much more rapidly and broadly shared, setting new standards in the sciences and scholarship. The Reformation and Wars of Religion encouraged scholars and scientists to dissociate their work from the contending universal religious doctrines and authorities. The best-known examples are those of the Scientific Revolution in astronomy, physiology and other physical sciences, which became increasingly empiricist in protective isolation from Classical and Christian authorities and dogmas. The flood of biological discoveries from the New World, of flora and fauna previously unknown and thus without symbolic significances, freed “natural history” from medieval natural philosophy and theology. In the social sciences, Machiavelli gave birth to political science by asserting that the application of traditional Christian values to questions of “how to maintain the State” in Renaissance Italy would likely fail, so that, to be successful, rulers should focus exclusively on power relationships. Rampant monetary inflation spreading throughout Europe in the 16th century, initially thought to be caused by sinful covetousness, was shown by Sir Thomas Smith to result from the sudden huge influx into the European economy of gold and silver bullion from the New World. Juan Luis Vives, the Spanish humanist living in Northern Europe, pioneered modern sociology by analyzing permanent poverty in Bruges, modern psychology in his advocacy of women’s education, and a secular understanding of current events based on the Stoic categories of concord and discord. Humane letters addressed an increasingly bourgeois secular society, and rationalist and empiricist philosophy sought autonomous grounding. By the 18th-century Enlightenment, intellectuals were consciously seeking secular alternatives to medieval universal values based on theology. A symbolic example is that “philanthropy”—the “love of what it is to be human”—became a central value in ethics, especially in forward-looking Scotland and America.

The cumulative result of all these paradigm shifts was the disintegration of what had been a university encyclopedia (etymologically, enkyklios paideia: “universal” or “all-embracing” learning) of scholarship and culture. The various disciplines, to their credit, were freshly and hugely productive; they gradually hardened and were drawn into academic institutions. By the end of the 19th century, they had become a standardized structure of separate parts with no integrating whole. To be sure, outside and on the periphery of academe, there were significant exceptions and even resistance to the disintegrating academic trend—by Alexander von Humboldt, George Perkins Marsh, Charles Darwin, Emerson, Thoreau, Poe, Henry Adams, William James, Ernst Haeckel and many others. The term “multiverse,” coined by William James and taken up by Adams, described the emerging pluralistic view of reality. In 1963, Clark Kerr coined the term “multiversity” to describe the heterogeneity of branches within single academic institutions, and lauded its intellectual dominance in American society. In 1966, James Perkins echoed his enthusiasm. The 1973 Journal of Higher Education article cited above was, therefore, a radically non-conforming view.

But the subsequent history of the multiversity has not been a continuing success. By the early ’90s, Jaroslav Pelikan’s The Idea of the University: A Reexamination asserted that colleges and universities were in “crisis.” The political and cultural turmoil in academe of the late ’60s and early ’70s rudely deposed both Kerr and Perkins. The business model of higher education became increasingly dysfunctional,[1] with runaway costs mainly for ballooning administrations, declines in public funding, inexorably growing reliance on underpaid “adjunct” faculty, a decline in tenured faculty ratios, and students graduating with enormous loan indebtedness. Students and their parents have become highly critical, seeing themselves as exploited consumers buying academic credentials on unfavorable terms for short-term, unreliable job markets. Thus, to the intellectually weak organization of learning is now added an institutionally and financially weak infrastructure, making the whole system more vulnerable in a rapidly transforming world. There is even evidence of increasing scholarly and professorial unease—e.g., the widespread increase in attempts to reconnect the disciplines in “interdisciplinary” and “multidisciplinary” studies; the AAC&U’s promotion of “integrative learning”; Northeastern University’s new “humanics” curriculum; Arizona State University’s experiments in replacing the academic departmental structure with integrative fields of study addressing real-world problems; and Georgetown University’s Center for New Designs in Learning and Scholarship, among others.

Moreover, six powerful factors—“conducive conditions”—fundamentally challenge today’s multiversity structure of academic scholarship:

First and most powerful is the continuing Information Technology (IT) revolution, which arose outside and independently of the multiversity in the late ’90s, and has been transforming the content, management and communication of information in all fields. Because the multiversity consists of information and depends on information technology, scholarship and teaching are being thoroughly—broadly and deeply—affected.

Second is a part of that revolution, namely, the explosion of sheer data—recorded and collected facts—to be analyzed. The most prominent expression of this is so-called “Big Data”—datasets so large and complex that ordinary software, even running massively parallel on hundreds or thousands of servers, cannot manage them. Over 94% of all data is now estimated to be stored digitally, much of it with open access, usable by anyone, anywhere, at virtually no cost. Adequate management will require, and thus evoke, new technology and methods of analysis, some already existing, more yet to be developed.

The data explosion is subversive of multiversity disciplines because it comes from, and is about, the real world, which is not divided into separate parts conforming to academe’s conventions. Big Data is not separated out into silos. When it becomes manageable with more powerful technology, the exclusionary fallacies of academic silos will be further illuminated, calling into question the entire multiversity structure. Professors will have to retool their work.

Third is personnel—the huge increase and surplus of qualified researchers forced to work outside academe. Doctoral degrees awarded today far exceed academic and research job openings. Fewer than half of those earning science or engineering doctorates gain jobs directly using their training. In the most popular fields, like biomedicine, fewer than one in six join a faculty or research staff. Every year the market tightens, while federal research grants are flat or declining. The American Academy of Arts and Sciences reports the same for the humanities—the number of doctorates awarded rises annually, while the number of job openings declines.

Fourth is a knowledge explosion produced by the first three factors. What do these highly trained and underemployed people do with their skills? Some find gainful semi-relevant employment in industry, which lies outside academic disciplinary restrictions; many take advantage of computers and the internet to do independent research, translating data into knowledge, largely freed from academic constraints. The volume of qualified research has far exceeded the capacity of traditional print publishing in books and periodicals, so the surplus finds expression in many forms in universally accessible and even peer-reviewed spaces on the infinitely capacious internet. The bottom line is that the total output of research from all practitioners, significantly empowered by the IT revolution, now far exceeds the capacity of our academic and commercial information infrastructure to absorb and use it, much less to govern its content and formats. A crisis in knowledge management has already begun.

Fifth, which might administer the coup de grâce to the multiversity, is future IT. The successors to today’s digital computers are now being developed outside academe by leading global corporations and governments: “quantum computing”—machines whose quantum bits (“qubits”) give them exponentially greater processing power, already capable of operating 50,000 times faster than today’s equipment, and soon to reach 100,000. The new technology has already run two million quantum programs to test, and to write papers on, theories that we never before had the processing power to prove. New machines create new fields, which are not retrofitted into academic departmental straitjackets but are free to roam and graze among the masses of new Big Data, to solve real-world practical problems such as climate change and overpopulation. This will render exclusive specialization obsolete.

Sixth is the real-world environment of academic infrastructures, which is enhancing the power of the first five disruptive innovations. Our world is transforming at an accelerating pace, propelled by developing technology. Higher education is held accountable to the outside world more than ever in today’s monetized consumer economy, in which academic credentials are bought with loans that the jobs they promise must repay. A telling example is the revolution in AI—artificial intelligence—which can already drive cars and trucks and make homes and other devices “smart,” self-regulating and intercommunicating, and which will certainly transform higher education. IBM CEO Ginni Rometty predicts that all jobs will be augmented by AI, requiring constant new learning and adaptation by jobholders. Therefore what today’s students need is not just information transfer, as in the traditional multiversity, but learning how to teach themselves, with online and accessible “lifelong learning systems” enabling constant retraining and upgrading of knowledge and skills—even (best case scenario) self-development.

The disruptive innovation of Massive Open Online Courses (MOOCs) is no longer experimental; students can gain academic credits for approved courses taught by experts from anywhere in the world, both inside and outside academe. Some of those courses are organized by conventional disciplinary categories, but many are not; they address real-world subjects and are accorded academic credits for business reasons. Other innovations—e.g., experiential learning, civic engagement—are moving in the same direction, from inside the academy out into the real world, signaling that conventional academicist categories are increasingly felt to be unrealistic.

These six factors—the IT revolution, the data explosion, the researcher surplus, the knowledge explosion, future technology and the transforming real-world environment of scholarship—are radically more powerful than their counterparts in the first, early-modern Age of Paradigm Shifts, to which the emerging disciplines were originally attuned. Ours is a second Age of Paradigm Shifts, powered by the second IT revolution. Scholars then were concerned with the Classical distinction of humans from animals; today we are concerned to distinguish humans from machines.

We know that technological revolutions are inexorable and unavoidable; they must be accommodated. The entire set of academic disciplines, describing the world in separate parts by exclusive specialization evoked by actual conditions in the early modern period, is now antiquated and needs to be transcended by another innovative set, similarly evoked. To be sure, traditional subjects still exist—economies, polities, societies, cultures, physical sciences, etc.—for which deep expertise is always needed, but they can no longer be considered autonomously. What needs to change are the interstices. We need now to describe the world systematically, as computers will press us to do, but in realistic terms as a coherent whole—which science assumes. We may also hope our new learning will be firmly humane, distinguishing us from our artificially and massively intelligent machines. Colleges and universities, which have a special commitment to human values, would do well to assume leadership roles in accomplishing this.

George McCully is a historian, former professor and faculty dean at higher education institutions in the Northeast, then professional philanthropist and founder and CEO of the Catalogue for Philanthropy.

[1] From 1980 to 2017, combined tuition and fees at four-year public colleges increased by 319%; from 2007 to 2016, state spending per student declined by 18%; from 1993 to 2009, administrative positions at colleges and universities grew by 60% (U.S. Department of Education) while tenured faculty positions grew by 6% (Bloomberg). The average administrator salary is $90,760 a year; the average faculty salary (not including adjuncts) is $79,424, about the same as in 1970 when adjusted for inflation.

