NEBHE’s annual Leadership Summit, scheduled for this October, poses the question, “How Employable Are New England’s College Graduates, and What Can Higher Education Do About It?”
The Summit will address numerous well-chosen, timely questions in and around this topic, predicated on the assertion that “New England employers consistently claim that they can’t find sufficient numbers of skilled workers—especially in key tech-intensive and growth-oriented industries like information technology, healthcare and advanced manufacturing.” The strategic questions are, “Is higher education to blame? Are our colleges and universities still operating in ‘old economy’ modes, in terms of services, practices and strategies for preparing students for career transitions and employability?” And, “Can New England’s colleges and universities be the talent engine that they need and ought to be?”
The following addresses the factual premises, their historical context, and strategic issues, in a constructive attempt to clarify and enrich the discussion at the Summit.
First, as to the facts on employability: it is commonly believed, but incorrect, that today’s college graduates suffer high unemployment; a recent study found that, compared to other age and education cohorts, they actually have the lowest unemployment rate, about 2%. Moreover, today’s job market operates on a new model of employment: the so-called “on-demand” or “gig” economy of short-term jobs, perhaps interspersed with spells of underemployment. The Summit therefore needs to begin with everyone on the same page with current data on unemployment, employment and underemployment.
As to history, this whole discussion arises from the confluence of two massive trends: the information technology (IT) revolution and the soaring, excessive costs of college attendance.
The IT revolution, as we all know, is transforming all areas of life and enterprise at an accelerating pace. It is now in what Steve Case, founder of AOL, calls its “Third Wave,” progressing rapidly from the “internet of things” to the “internet of everything.” As this revolution has gained speed and momentum, technological turnover has accelerated and pervaded job markets, so that everyone now has to scramble to keep up. A large part of employers’ difficulty in filling jobs with suitably skilled employees is a side-effect of this acceleration, one that has become the new normal in high-tech businesses. That will not change, and it cannot be blamed on colleges and universities; whether they can realistically be expected to do anything meaningful about keeping up with and advancing it is an open question.
The concurrent soaring of college and university costs, and the huge loans taken out to help cover them, have made parents and students increasingly concerned about affordability, student indebtedness and practicality. This has had commercializing effects on college and university cultures, in which students and their parents increasingly see themselves as consumers purchasing credentials for continued financial support and jobs. Simultaneous grade inflation, the reduction of onerous study workloads, anxiety over what professors want rather than what students should want for themselves, excessive grade-consciousness, and questioning whether the investment is worthwhile all tend to boil down to a single anxious question: whether the investment will lead to a steady job that will enable paying off the loans.
The combination of these two trends creates the dangerous situation we have today. If the job market is in constant, rapid and accelerating turnover, so that jobs and even careers become short-term investments in ephemeral results for both employers and employees, and if the culture of colleges and universities is commercialized, operating as an investment in job security, how can colleges and universities, as relatively sluggish institutions already behind the curve, possibly now be expected to provide rapid-turnover kinds of training for rapid-turnover jobs? Even if they succeed in training students for today’s job market, that same training will become obsolete tomorrow, and what will the investment have been worth then? How can New England’s colleges and universities, caught in this crunch, be presumed to have any real or viable “need” or obligation to be “the talent engine” for current or future job markets?
Here it is strategically useful to distinguish clearly between “education” and “training.” “Training” is knowledge and skills development, and is the focus of this discussion; “education” is self-development, which is what our colleges and universities were created to do, as in the Classical tradition of liberal education. Education certainly includes training, but is both broader and deeper, intensely personal and social, focusing on the cultivation of values. Education is more about who, and training more about what, students are and will become in their subsequent lives and careers.
It has long been conventionally accepted that the mission of “higher” education in colleges and universities, as distinct from that of schools, is to bring training in disciplined scholarship to bear on the cultivation of personal values, as in liberal education. This is not something that goes in and out of fashion with changes in economies or technologies. The training function does need to keep current with useful knowledge and skills in fast-changing technologies and job markets, and that challenge of staying au courant is real, but it remains subordinate to the permanent and characteristic mission of higher education.
Here, modern technology itself can help. Training these days is done most productively and efficiently by computers and the internet, as MOOCs have demonstrated. Obviously the employers who are complaining about the technical preparedness of prospective hires know best what training (knowledge and skills) they want those new hires to have. They are also, however, in the best position to provide it themselves. Case (incidentally, a graduate of Williams College), in his book The Third Wave, put it succinctly: let higher education develop character, which, he advises, is what innovating entrepreneurs should be looking for in hiring, and let businesses then train for the specific skills they currently and prospectively need. MOOC-style courses could be the instrument of choice for such training; highly flexible and productive, they can be quickly developed by anyone for any subject and trainee population, at minimal cost, and readily superseded as needs change.
Can colleges and universities help address this employment problem generated by the technological revolution? Yes. At tolerable cost to themselves (perhaps supported by the businesses that, after all, need the workforce), they might incentivize such training with limited credits toward degrees for MOOC coursework; they might provide various certifications apart from degree credits for MOOC students. They might open room and board facilities to MOOC enrollees, especially in summer or other off-season months, at least partially supported by the businesses needing them. They might provide MOOC trainees a range of supplementary educational support services through adjunct faculty. Adjuncts might assist with running MOOCs, and businesses might have their MOOC instructors appointed as adjunct members of the faculty, if the cost-sharing could be worked out.
There is a wide variety of facilitating and affiliating options for training, short of undertaking full responsibility for it. But in this whole context, the suggestion that New England’s colleges and universities should assume, or be expected to assume, responsibility for supplying technically prepared employees to businesses is close to absurd and dead on arrival.
George McCully is a former historian, professor and faculty dean at higher education institutions in the Northeast, and subsequently a professional philanthropist and founder and CEO of the Catalogue for Philanthropy.