Cracks in the bedrock: Can U.S. higher education remain number one?
Clara M. Lovett
The 20th century marked the ascendancy of American higher education to first place in the world. What made it so was a unique convergence of excellence in teaching and research, access for large numbers of students regardless of income or social class, and institutional pluralism. The strength and flexibility of U.S. higher education played a key role in the successes of what historians have called “the American century.”
Graduates of American research universities were at the forefront of the exploration of space, the conquest of disease, the green revolution, and the computer revolution. Graduates of hundreds of colleges provided the skills to build the largest and strongest economic system in the world. Others provided the intellectual and moral leadership for revolutionary advances in civil rights, public policy, and the arts.
At the beginning of the 21st century, however, this bedrock of American achievement reveals alarming cracks. To preserve the elements that made American higher education so successful in the 20th century (academic excellence, access, and institutional pluralism), we will need to apply the best remedies available in the decades ahead. For the cracks in the foundation are not superficial. They are getting ever larger as the tectonic plates of our society continue to shift underneath the foundation.
Many stresses have grown out of the increasingly mass nature of higher education. As recently as the 1970s, graduation from high school was considered the educational high point, and the end point, for most young men and women. Today, we expect all young people to graduate from high school (though reality still lags behind expectations, particularly for members of minority groups). Graduation from a two-year vocational or technical college was an end point for adults in need of job retraining or an upgrading of skills. Today, the community colleges still perform that function, but they have also become critical gateways to baccalaureate education. The baccalaureate degree used to be the educational end point for a privileged elite. Today, in some states most high school graduates can expect to earn a baccalaureate degree at some point in their lives. And advanced degrees have replaced the baccalaureate as gateways to leadership positions in most professions.
In the last quarter of the 20th century, several economic and cultural trends converged to generate greater demand for postsecondary education. In some sectors of the economy, especially manufacturing, energy, and communications, technological innovations improved productivity and quality but also required a more skilled workforce. In all sectors, employers began to rely more heavily than in the past on formal educational credentials as a shortcut to hiring from large applicant pools and sometimes also as a shield against legal challenges to their decisions.
In the 21st century, formal education at all levels plays a role in our society and economy comparable to the role of roads and waterways in the early 19th century, transcontinental railroads after the Civil War, and interstate highways and airports after World War II. Education has always been and remains today a powerful way to shape minds and improve the quality of individual lives. But it has also become an essential part of the infrastructure necessary for collective progress. The changed expectations of millions of citizens in turn are changing the meaning and definition of educational credentials. The respective responsibilities of high schools, community colleges, and universities also are changing dramatically, requiring a re-examination of education policies and underlying funding patterns.
In almost every state, public colleges and universities are locked in a perennial and stultifying pursuit of operating and capital funds to meet the increasing demands on them. They look to local and state governments for relief from the fiscal pressures they experience. Much has been written about these pressures. They result from the convergence, not unique to higher education, of the rising cost of preparing skilled and specialized personnel, the need to develop and maintain essential infrastructure, and the increasing expectations of students and communities. Most of the time, public relief from these pressures is not forthcoming because governments themselves are struggling to meet rising expectations and greater obligations in other sectors, especially health care, even while yielding to public sentiment against tax increases.
The largest and best-known public universities seek relief in successful fund-raising campaigns. But for every one public university that can mount a billion-dollar campaign, there are scores of others that must set more modest goals. Hundreds of public two- and four-year colleges cannot play in this arena at all.
Unable to cope with fiscal pressures and frustrated by lack of positive responses from elected officials, many public-sector presidents and chancellors cite the obligation of students and their families “to do more for their education.” That is, these leaders call for higher tuition as a means of protecting at least one element of higher education’s foundation: academic excellence. These calls come from the heart, out of genuine concern that chronic fiscal problems are eroding the quality of teaching and research. Unfortunately, increases in tuition and other charges undermine another element of the foundation, access for lower- and middle-income students.
The situation is different, but not dramatically different, in the private sector of higher education. Most private institutions have reached, or are about to reach, their “price elasticity threshold.” Like their public counterparts, they have reason to worry about lowered academic quality and diminished access if they cannot continue to raise tuition. They are also especially vulnerable to the erosion and potential loss of institutional pluralism, the third crucial element in higher education’s success in the 20th century.
In the short term, the wealthiest private institutions are able to avoid these dangers. Some are steadily increasing tuition, even if it means sacrificing the social or ethnic diversity of their student bodies. A few are choosing to tap their large endowments to hold the line on tuition for all applicants and subsidize the neediest ones. Most private colleges and universities do not have the luxury of large applicant pools or large endowments. Yet the private sector’s tuition-dependent “have-nots” are precisely the institutions that sustain America’s educational pluralism. They are the church-affiliated, the historically black, the single-sex, the experimental colleges, the specialized graduate schools, and the like. The deterioration of many such institutions would impoverish our higher education system and with it the pluralistic society it serves.
Another threat to higher education’s foundation emerged in the 1980s, but went unnoticed, at least initially. The threat came from the reluctance of traditional colleges and universities, both public and private, to accommodate the needs and preferences of new, nontraditional participants in higher education. The prosperity of the 1990s masked the depth and breadth of this problem. Well-employed adult students, or in some cases their employers, stepped into the gap and chose to pay relatively high tuition to new institutions able to provide good instruction and mentoring, along with convenient schedules. The University of Phoenix and other nontraditional institutions recruited working adults as students, making sure they did not have to leave well-paying jobs to attend college.
Through the 1990s, the success of these new educational providers provided a safety valve for traditional colleges and universities. It relieved enrollment pressures at some campuses; it sheltered many more colleges from politically unpopular reorganizations of schedules or programs. It certainly added a new twist to the tradition of institutional pluralism. An economic downturn, however, might quickly turn the safety valve into another pressure point. This might happen if large numbers of students now served by institutions like the University of Phoenix returned to lower-cost traditional institutions, many of which are ill-prepared to meet their expectations.
The challenges we face today result from our collective successes since the end of World War II. Especially in the past 30 years, we have experienced what happens in most societies when a product, service, or institution originally created for the benefit of an elite group becomes sought after by much larger groups of people. Historically, the transformation has led to one of two possible outcomes. In some cases, over a period of time the elite chose voluntarily to abandon products, services, or customs that became available to “the common people.” Such was the case, for instance, for high-heeled shoes, once a status symbol for men and women of the court in 17th-century France. The product fell out of favor when members of the bourgeoisie began to copy the once-exclusive patterns.
In other cases, the elite continue to use a product or service used by larger groups, provided special luxury options remain available. For instance, in early 20th century America, mass marketing of Henry Ford’s Model T did not end the romance of the elite with automobiles. Instead, the automobile’s social status was maintained through a plethora of luxury options offered by producers. So also with higher education. In the 1970s, our society moved away from the notion that a college education was a hallmark of elite social and economic status. The system expanded to accommodate millions of students from varied backgrounds. But “luxury options,” distinguishable by high admission standards and high tuition, remained available alongside the educational Model Ts.
The democratization of higher education, however, required more than voluntary choices or market options. It required direct or indirect public subsidies to institutions and students. For the past 30 years, campus leaders and policymakers have debated the appropriate forms and the extent of public subsidies, generally concluding that subsidies are a good and necessary thing and that they should be larger. Today, however, the state of public finances at all levels, the widening gap between the cost of attendance and the means of lower-income families, and the needs of lifelong learners suggest that it is time to change the terms of the debate. It is time to focus on the resources we have, including public funds for instruction, research, and financial aid, and to rebuild the foundation of our higher education system on different premises.
Viewed historically and also in light of organizational theory, our higher education system is arranged along a continuum, with a cluster of cottage industries at one end and a cluster of large industrial-age enterprises at the other. The system is basically a hybrid of these two historical traditions. Higher education’s information technology infrastructure, designed to communicate with colleagues and students, teach courses online, or do research, is impressive, but still peripheral to the functioning of the system. To rebuild a foundation that will assure academic quality and access and protect institutional pluralism, we need to find ways to transform the hybrid into an information-age system.
It is easy to identify and explain those characteristics of our higher education system that can be traced back to cottage industries in the pre-industrial age. Even in the largest and most complex universities, the “master craftsmen,” such as the senior tenured faculty, control the processes of knowledge production and dissemination, in the jargon of organizational theorists, “the core functions” of higher education.
In recent years, consumerism has made inroads into the culture of many campuses, sometimes limiting the freedom of the master craftsmen to shape the enterprise. But the basic processes are still in place, supported by academic tradition and often protected also by statutory or contractual arrangements. At the smaller undergraduate institutions and in some graduate schools, the pre-industrial system continues in the relationship between masters and apprentices. Where classes are small and opportunities for teamwork abound, students still learn at the feet of the masters. This tradition has a powerful hold on higher education, as anyone who has browsed through glossy print catalogs (and more recently, videos and Web pages) can attest.
In recent decades, many institutions deviated from this cherished ideal, but did not abandon it. In the public sector, political pressures made it impossible, or at least unwise, to cap enrollments or adopt stringent admission requirements. In the private sector, all but the wealthiest institutions grew in size and diversified their missions in order to cover rising costs. Incrementally, and mostly against the wishes of the master craftsmen, institutions adopted some of the characteristics of industrial-age production.
Standardized curricula, large lecture classes, graduation requirements based on seat time and credit hours completed rather than on the mastery of subjects and tasks–all these were rational responses to large enrollments. However, these responses were never legitimized by campus leaders, governing boards, or funding agencies. Rather, they were portrayed as necessary evils, temporary fixes to get institutions through periods of rapid growth or financial stringency while awaiting the return of good times. In this climate of self-deception and inertia, most colleges and universities passed up opportunities to capitalize on the positive aspects of industrial-age production. For instance, instead of using the master craftsmen to create high-quality departmental or college-wide curricula and then supervising the pedagogical training of their apprentices (junior faculty, part-time instructors, and graduate assistants), most institutions left the apprentices in charge of interpreting course requirements and teaching undergraduates.
Institutions continued to offer as many programs and courses as possible, with the honorable intent of pleasing faculty and students, but with little attention to effective use of faculty or to weeding out egregious redundancies across the curriculum. Some years ago, for instance, a business dean who was preparing his college for accreditation by the American Association of Colleges and Schools of Business showed me the results of his analysis of all course syllabi and textbooks used by his faculty. By a conservative estimate, most topics were repeated two or three times across the required core courses.
Little effort was made to find partners that offered complementary program strengths. Even within “systems” governed by one board of trustees, individual campuses resisted integration and collaboration in favor of self-contained development. In doing so, colleges and universities protected to some extent the status and prerogatives of their own master craftsmen. They failed, however, to focus their resources around those programs and activities in which they could excel. They ended up with the worst features of mass production without reaping the benefits of quality and cost control.
Among the sectors of our society and economy, agriculture most closely resembles higher education’s hybrid form of organization. In agriculture, however, family farms, the cottage-industry equivalents, are rapidly disappearing. Industrial-age production based on large vertically and horizontally linked enterprises is clearly winning the day. At the close of the 20th century, the ascendancy of public “megaversities” led some observers to believe that higher education would inevitably go the way of agriculture. Currently, however, a more valid organizational comparison might be drawn between higher education and the restaurant industry.
The pre-industrial aspects of restaurant management are well-known and often romanticized, especially in the image of the creative and despotic master chef surrounded by a gaggle of apprentices and servants. That sector of the industry, however, survives by catering to wealthy customers. Another sector specializes in serving the mass market, at much lower cost and perhaps more efficiently. Restaurant chains, classic examples of industrial-age production, turn out meals of predictable, even quality, available at thousands of locations at affordable prices. Each sector of the restaurant industry meets market needs, but no one would argue that they offer the same dining experience.
A very similar situation has existed in higher education for several decades. “Boutique” colleges (selective, small, and expensive) continue to thrive today as they did when only the elite attended college. As a society, we did not question the right of these institutions to pursue their missions and to charge whatever their markets would bear. But we worried about the prospects for less wealthy or less well-prepared students, for time-bound and place-bound students with limited choices, for minorities once shut out of the system.
With the expansion of higher education enrollments in the 1970s, we resigned ourselves to think that quality was for those who could afford elite colleges; everyone else might find a place somewhere in the system, but would have to settle for the educational equivalent of Pizza Hut. It was difficult to find a middle ground between the high-quality, customized experiences offered by educational cottage industries and the cheaper, mass-produced experiences offered by educational factories.
The situation could be very different today. Provided we have the will to embrace change, we have knowledge and resources never before available that can enable us to avoid this Hobson’s choice between quality and access, a choice alien to our belief in an intellectual meritocracy and to our democratic political culture. We can work to transform our colleges and universities into the best possible information-age organizations, transcending both our roots in the pre-industrial age and our half-hearted adaptations of factory-style production.
Successful information-age organizations, whether corporate or not-for-profit, can be identified by how they set their goals and how they use technology. These organizations identify a limited number of products or services (“core functions,” in the language of business) that set them apart from market competitors or other organizations, and they focus their resources on the pursuit of excellence in those areas. While restricting, by design, the range of activities they undertake, these organizations try to expand their ability to reach potential customers or clients anywhere. A robust technological infrastructure is employed in all aspects of their work, from internal and external communication to production of goods or delivery of services.
What might such a transformation look like in higher education? Individual campuses would no longer try to be all things to all people. In urban areas of the country, consortial arrangements could enable each participating campus to earn distinction (and attract enrollments) in selected program areas. To some extent, this is already happening, for instance, among the senior colleges of the City University of New York and between public and private colleges in several other urban areas around the country. Urban consortia could offer degree programs in traditional formats, for students able to live on or commute to a campus, as well as online for time- and place-bound individuals. In rural areas, where consortial arrangements might not work, institutions also could develop selected program strengths while serving as portals and points of contact for online programs originating elsewhere.
The use of consortia, where possible, and of the online capabilities already available at many institutions, could lead to improved program quality, more creative use of personnel, especially faculty, and some savings in the cost of support services, from procurement and maintenance to libraries. However, this approach does require campuses to abandon customs and practices dating back to the pre-industrial age, especially the practice of doing everything in-house.
In the past 20 years, colleges and universities have formed partnerships with their peers and with corporations to perform support functions, such as building maintenance or campus dining services. Today, much greater resources are available for the asking, and not only in support services. Within a year, for instance, science and engineering course content will be available universally and at no charge from the Massachusetts Institute of Technology, through a bold new initiative, OpenCourseWare (OCW), funded by the Hewlett and Mellon Foundations. Access to such content will not entitle users to earn MIT degree credits. And, initially at least, the interactive features of OCW will be available only to campus-based students. However, the ability to access course content developed by world-class experts potentially can enrich the learning opportunities for millions of students at all types of institutions well beyond the boundaries of the United States.
Especially if other elite universities choose to follow MIT’s example, initiatives like OCW have tremendous potential to help institutions and their faculties. As the new resources become available, campus leaders and their governing boards will bear the responsibility of deciding how and when to use them. They might, for instance, decide to use and adapt high-quality course content and materials available online at little or no cost, instead of doing all the work in-house. This might be a good way to enable full-time faculty to spend more time teaching and tutoring students, thus enhancing academic quality and at the same time reducing the need for temporary or part-time faculty.
As information-age organizations, colleges and universities could also free themselves from the least attractive aspects of their industrial-age past. On most campuses, automation of non-instructional functions, such as course registration and financial transactions, already offers users convenience and predictable quality. More extensive use of information technology to provide students with asynchronous access to course content, exercises, simulations, and so forth would relieve faculty of the burden of teaching large classes and, again, allow more time for tutorial or small-group interactions.
Finally, colleges and universities could move decisively to dismantle industrial-age, Taylorite measures of time-on-task and productivity, such as semesters, credit hours, and graduation requirements that are only marginally related to mastery of subject matter and its applications. These late-19th-century constructs could be replaced by flexible entrance and exit procedures and by competency assessments available to students anytime their mentors judged them ready to be assessed. Validated assessments are already used in professional fields of study, such as nursing, accountancy, and engineering, because they are necessary steps on the road to licensure. Boards of education in many states are developing similar assessments for people entering the teaching profession. Assessments of general education/liberal arts competencies are less commonly used, but they have been developed by a number of traditional colleges and by nontraditional institutions like the British Open University and Western Governors University.
The transition from status quo to information-age organizations would benefit higher education by enhancing quality, expanding access, and protecting pluralism. It would not eliminate the need for public subsidies of institutions and students. The transition, however, would require colleges and universities to make a much sharper distinction than they make now between their core functions and the constellation of ancillary functions now encrusting the core like barnacles on the hull of a ship. As elements of this country’s 21st-century infrastructure, higher education’s core functions (teaching and research) have a compelling claim to support from the public purse. This is not the case for services and amenities that students and communities might access off-campus or electronically, such as full-service campus health clinics, entertainment events for the general public, athletics, and financial and legal counseling.
The application of organizational theories and information technologies largely invented on American campuses in the 20th century now affords us the opportunity to rebuild the foundation for higher education. But who will lead higher education through the necessary transformation?
A senior executive recruiter recently told me, “No institution can rise above its board.” I believe that he is right and that the observation holds true for systems, not just individual campuses. The main responsibility for leading higher education’s transition to new ways of producing and disseminating knowledge lies with this country’s boards of trustees. They need to understand that the cracks in higher education’s foundation are deep, that the ground has been shifting underneath the foundation, that the old approaches to funding problems (such as begging and cajoling the politicians, soaking the students, and buying fewer paper clips) no longer work.
In the private sector, trustee leadership can facilitate the creation of collaboratives and consortia for the sharing of faculty and physical resources to an extent that was unheard of, indeed impossible, before the diffusion of information technology. For example, colleges and universities bound together by the Catholic intellectual tradition could also be bound together by shared academic programs and support services, regardless of geographical location or size. Initiatives of this type would obviously benefit the most financially vulnerable colleges, thus protecting institutional pluralism, but would also help the stronger ones, by allowing them to specialize and focus their program strengths.
In the public sector, it is imperative that system boards be willing to manage their systems rather than look the other way as every campus grows up to resemble every other campus. Trustees have a responsibility to help campus and external communities understand the challenge of transformation, especially the importance of focusing resources and efforts on core functions and of capitalizing on opportunities for online learning and shared programs. Most importantly, of course, trustees can make their mark by choosing and supporting campus and system leaders who understand what needs to be done. Finding those leaders may turn out to be the hardest part of a hard job.
Clara M. Lovett is president emerita of Northern Arizona University. She served as president from 1994 to 2001 and previously served as provost at George Mason University and dean of Arts and Sciences at George Washington University.
COPYRIGHT 2002 Heldref Publications