Great expectations: content, communications, productivity, and the role of information technology in higher education – Cover Story
Kenneth C. Green
Are you old enough to remember the slide rule? If handed one today, could you use it to perform simple math problems, let alone complex calculations?
Early in the 1970s, calculators replaced slide rules as the math tool of choice for scientists and engineers. Calculators are more accurate, easier to use, and generally do far more than slide rules. After a few years on the market, calculators also were less expensive than slide rules. Some instructional time was saved because it was easier for students to learn to use calculators than slide rules. For colleges and universities, the transition costs from slide rule to calculator were low because students purchased their own. And the transition decision was simple: a newer, more effective technology replaced an older – now less useful – tool. It was not necessary to conduct time-consuming and expensive cost-benefit analyses or learning-outcome evaluations of slide rules versus calculators.
Calculators – both the ubiquitous inexpensive products and the high-end, programmable models widely used by scientists, engineers, and financial analysts – provide an interesting case study of a compelling technology that helped change the way many professionals work. Moreover, this technology, by general consensus, helped its users become “more productive”: the personal productivity value of the calculator was readily apparent. However, this technology changed only a small part of what colleges taught and how they taught it; indeed, the changes applied primarily to specific disciplines, not the whole academic enterprise. Advocates did not believe or claim that higher education as a whole, or individual colleges or universities, had become more “productive” because more students and faculty were using calculators.
The case study of the calculator suggests several major points about the integration of technology in education.
First, the most compelling technological innovations do not require extensive analysis or evaluation before they become widely adopted and integrated into academic work. The calculator, and especially the programmable calculator, was just such a compelling technological innovation.
Second, the universe of beneficiaries must be carefully identified in any and all discussions about impacts, including productivity. For example, a new technology may offer significant productivity gains for individual students or faculty without affecting institutional productivity; alternatively, technology may be used to increase the productivity of administrative operations without having any impact on instruction.
Third, compelling technology may – or may not – have dramatic consequences for the curriculum. Neither calculator advocates nor calculator vendors made great claims for how these products would change engineering education. But, clearly, the programmable calculator has contributed to some important changes in mathematics, engineering, and technical education over the past two decades. Not all of these changes were initially identified or anticipated, and not all were perceived as beneficial, such as the decline in estimation skills that occurred as calculators replaced slide rules. (Similarly, the computer spreadsheet also has had important impacts on management education, but advocates made no such great claims. Here too, many of the impacts and benefits were not initially identified or anticipated.)
Finally, the experience with the calculator is unusual. The slide rule – like the book, blackboard, and lecture – was an accepted standard. But unlike the book, blackboard, and lecture, the slide rule was quickly replaced by a compelling new technology – the calculator. There are few, if any, similar examples, even given the great claims for computers, video, and information technology today. For better or worse, the book, blackboard, and lecture continue to dominate instruction.
Education always seems attracted by the light – by the promise and potential – of technology. From film in the ’20s to television in the late ’50s, computers in the ’80s, and now information technology in the ’90s, there have always been great expectations that new technologies would soon enhance learning and instruction.
In the ’80s, during the much-discussed “microcomputer revolution” in higher education, the computer emerged as a personal tool for writing in all disciplines, financial analysis in business, statistical applications in the social sciences, etc. Students, faculty, and institutions purchased desktop systems by the truckload. Engaging applications (graphics, digital imaging, desktop publishing, electronic mail, multimedia), falling prices, and increased power and convenience brought the desktop and notebook computer to thousands of academics who never previously thought of themselves as “computer users.” Most would agree that modest productivity benefits emerged as growing numbers of faculty transferred much of their work from secretaries, mainframes, and minicomputers to desktop systems and word processors.
Midway through the ’90s, however, colleges and universities confront a second major phase of this “revolution” – a shift in emphasis from the computer as a desktop tool to the computer as the communications gateway to colleagues and “content” (databases, image and text libraries, video, and more) made increasingly accessible via computer networks. Technology advocates are fond of describing a future “information-rich” environment that will support instructional and scholarly activities in new and exciting ways.
At the same time, the rising financial pressures confronting higher education also have focused attention on the promise of technology to improve productivity in higher education. The stated hope is that computing and information technologies will yield new levels of institutional and instructional “productivity.” The stated expectation is that the infusion or integration of new technologies into instruction will, at minimum, maintain and ideally enhance student learning while significantly reducing instructional costs.
But will information technology lead to the kinds of productivity gains and associated cost savings touted by its most ardent advocates? Alas, not soon, we conclude, and certainly not soon enough for the parties eager to control instructional costs or for the evangelists who promise that information technology will revolutionize learning.
However, careful review suggests that we may expect major, substantive benefits from more widespread academic uses of information technologies – in the areas of content, curriculum, and pedagogy. Further, the demands and expectations of students and faculty for information technology are increasing the pressure on colleges to make it readily available. Finally, changing demographics already have increased the number of students who can benefit from new “distance education” applications of technology.
Reduced to the core issue, the “technology yields instructional productivity” advocates are eager to demonstrate that information technology will a) allow the same number of faculty to teach more students at the current (or at an enhanced) level of learning or b) allow campuses to serve the same number of students with fewer faculty and with no loss in learning (either what is learned or the number of students who learn it).
Clearly, technology has brought both enhanced productivity and reduced costs to some parts of higher education. Like many corporations, campuses routinely and effectively use technology in many administrative areas. As in the corporate domain, computers have improved productivity related to a wide range of data management and transaction processing activities: personnel files, course schedules, library catalogs, budgets and accounts receivable, student transcripts, and admissions information. Moreover, in some parts of the faculty domain, technology has truly helped to increase productivity and reduce operating costs. Indeed, a generation of faculty has come into academic positions with little or no secretarial assistance from their departments or institutions: they have computers to prepare their own class materials, course syllabi, conference papers, grant proposals, manuscripts, and other documents. As yet, however, relatively few would claim – even a dozen years into the “micro” revolution – any real gains in instructional productivity. In that realm, as ever, we’re still left with the “promise” of technology.
To understand why technology’s promise still outpaces its performance in education, it’s instructive to examine the wider literature on technology and productivity, most of it based on the experience of corporations and not-for-profit organizations. “Implementation cycle” research points to four stages of information technology integration – numbered 0 through 3 below – that occur over years, not weeks or months:
Stage 0. Some planning, investigation, and experimentation. Recognition that the leading competition has already started to use technology. Recognition by some individuals that they can do some of their work better and faster if they use the most widely available functions of a desktop computer. A decision is made to permit small groups to go ahead (or to ignore the fact that they already have done so).
Stage 1. A few years of marked increase in planned capital investment for individual workers/professionals and surprising increases in operating expenses – with little reduction in other expenses. Additionally, there are unanticipated but significant delays in implementing some of the most “obvious” applications. The organization also slowly begins to accomplish some tasks never before attempted and experiences a modest gain in the scale or scope of new activities.
Stage 2. A few years of readjustment where costs and annual investments in technology stabilize while capacity continues to grow and new functions are developed and implemented. (Or, the organization rejects “automation” and/or leaves the business that was being automated.)
Stage 3. Several years where the organization achieves new levels of efficiency and effectiveness – but the organization is no longer really in quite the same “business” it was in the beginning. No one seriously asks if technology increased productivity compared with the “old” ways of working, because the organization is no longer pursuing the old objectives and no longer works in the old ways. No one seriously considers abandoning the technology because it has become inconceivable to accomplish what is now being done without it.
In administrative and other areas where educational institutions engage in functions similar to corporations, colleges and universities can adopt techniques already well-developed for business and move very rapidly through (perhaps even skip) the earliest stages.
In instructional areas, however, the information technology decisions of colleges and universities are more decentralized than in corporations; moreover, the core academic functions are quite unlike those in business. The organizational implementation cycle is more complex and educational organizations are likely to move through the stages even more slowly than industrial organizations of the same size.
On the academic side, most colleges and universities are somewhere in Stage 1 – spending money. Just the same, new technologies that may offer great potential for educational applications continue to arrive – each year, each month, and sometimes each week. And with each major new technology, institutions and departments must again revisit plans and move through the same stages.
Higher education has much to learn about the turnover of technologies and how to move decisively into Stages 2 and 3 cited above. We also need better metrics and models to measure the costs and benefits of technological innovation on instruction. All the while, the current technology infrastructure at most institutions is so taxed and underfunded that campuses are stretched thin just to support the “early adopters” – the first wave of students and faculty drawn to desktop computing and information technology resources. (For a more detailed analysis of issues related to enabling a wide range of faculty beyond the self-starting “early adopters” to use new applications of information technology in their teaching, see “Stuck at the Barricades” by William Geoghegan in this issue of Change. Also, see Steve Ehrmann’s article, “Asking the Right Question,” summarizing many of the problems that emerge when traditional cost-benefit models are applied to higher education.)
Ungrounded (and unchallenged) information technology partisans will mislead campus leaders when they underestimate the real costs, complexity, and duration of the successful implementation process. Without understanding these cycles and their costs, campuses will neither recognize nor attain the full benefits that technology might offer – to students, to faculty, to curriculum, and to institutional effectiveness. The consequence: campuses will be stuck in Stage 1 or 2, never achieving the gains available in Stage 3.
We cite as one example of this problem the costs of “installing a campus network,” an issue that most campuses have experienced or will soon confront. At face value, the costs appear to be those of running wires into offices, dorms, libraries, and classrooms. But the additional and very real implementation costs include additional equipment, initial user training, continuing user support, and software licenses. Moreover, while the initial installation cost looks like a simple capital expenditure, the technical maintenance and user support costs are a continuing expense that over a few years can easily dwarf the initial expense for installing the wire.
This example mirrors a recent conversation with a college president: the network installation estimates he received accurately stated the costs of “running the wire,” but what he needed for financial and curriculum planning were the total costs over the first three years. Yet the estimate on his desk did not include any of the ancillary or support costs. When he eventually got the full costs, they were almost triple the original estimate.
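The arithmetic behind such surprises is easy to sketch. The figures below are hypothetical – not the president’s actual numbers – and simply show how continuing support costs can push a three-year total to roughly three times a wiring-only estimate.

```python
# Hypothetical three-year cost sketch for a campus network project.
# All dollar figures are illustrative assumptions, not actual campus data.

wiring_estimate = 500_000  # the "running the wire" quote on the president's desk

annual_costs = {
    "additional equipment (hubs, routers, upgrades)": 100_000,
    "technical maintenance": 75_000,
    "initial and continuing user training": 40_000,
    "continuing user support staff": 75_000,
    "software licenses": 35_000,
}

years = 3
ongoing = sum(annual_costs.values()) * years
total = wiring_estimate + ongoing

print(f"Wiring-only estimate:        ${wiring_estimate:>12,}")
print(f"Ongoing costs over {years} years:   ${ongoing:>12,}")
print(f"Three-year total:            ${total:>12,}")
print(f"Total vs. original estimate: {total / wiring_estimate:.1f}x")
```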
There is another lesson from the corporate experience that is also important for higher education. The successful integration of information technologies is almost always associated with significant structural change – the kind of change that educational institutions routinely resist. In contrast to the pace of corporate restructuring in the United States over the past five years, structural change in education occurs slowly, incrementally, and over a period of many years – decades. Indeed, it is well known that the collegial decision-making process works far better at preserving culture and knowledge than at responding quickly to new technologies and changing environmental issues. Yet given the pressures currently confronting educational institutions – for accountability, quality, cost control, productivity, and organizational efficiency – colleges and universities may have arrived at the moment when they must shift to accommodate change, not just preservation. At that point they may become poised to reap the productivity gains – administrative and academic – on a scale that information technology has helped deliver in the corporate sector.
Some information technology advocates, such as computer center directors and faculty who are strongly committed to applications of a specific technology, may argue with this critique. Some will point to individual cases in specific disciplines where technology helped to increase productivity and/or reduce costs. Some will argue that successful structural changes are already under way, citing distance education as an important example.
But these examples by and large do not address the core, campus-based instructional activities of most faculty at most institutions. Rather, the oft-cited examples of successful integration and potential productivity gains (or effective cost control) typically involve small programs, “early adopter” faculty, and units that receive special support to “make things work.” We have yet to hear of an instance where the total costs (including all realistically amortized capital investments and development expenses, plus reasonable estimates for faculty and support staff time) associated with teaching some unit to some group of students actually decline while maintaining the quality of learning.
Given these somewhat uncertain benefits, why do (or must) colleges and universities invest in information technology, even if the claims for productivity are elusive or just simply a long way off?
There are several compelling reasons why institutions will have to make continuing and significant investments in information technology. They generally fall into three categories: competitive position; teaching, learning, and curriculum enhancement; and student preparation for the labor market.
Competitive Position. Growing numbers of college-bound students come to campus with computer skills and technology expectations. In the fall of 1994 over half (55 percent) of all entering college freshmen reported having had at minimum a half-year of “computer science” or some form of formal computing or technology instruction while in high school (A. W. Astin et al. The American Freshman: National Norms for Fall 1994, Los Angeles: Higher Education Research Institute, UCLA, 1994). Several recent consumer studies suggest that one-fourth to one-third of American households now own a computer and more than 40 percent of recent computer sales in the United States have been to homes rather than small businesses and large corporations.
Consequently, colleges and universities must invest in computers and information technology if only to persuade their potential clientele that the institution provides the information technology resources increasingly available elsewhere – in homes and high schools, and at competing institutions. In contrast to the explosive growth in home computer sales over the past year, many students will arrive on campus to find old – and in too many cases, antiquated – computers in campus labs and clusters: the computers students have at home, in their dorm rooms, and in off-campus apartments will be newer and more powerful than the systems available to them on campus.
But the network and online information resources now drive much of campus computing. So in this domain, the continuing institutional investment will focus heavily on the network, particularly “plug and play” options that allow mobile users to connect devices to the campus network at a variety of locations – libraries, labs, dorm rooms, offices, etc. The old competitive reference points that used to distinguish between institutions – the numbers of science labs and library books – are being replaced by a new one: the information resources and tools available to students, as reflected by a) the number of locations on and adjacent to campus that support mobile computing and network access; and b) the kinds, quality, and currency of digital resources available online via the campus library or information services center.
What’s the evidence of this transition? Simply spend a few minutes with a standard college guidebook to see the kinds of information these resources now provide about the campus computing and information technology environment. These competitive issues also apply to institutional efforts to recruit and retain faculty.
Teaching, Learning, and Curriculum Enhancement. There is impressive evidence that information technology can be used to enhance courses, curriculum, and student learning. One of the best articles on this topic is “The Technological Revolution Comes to the Classroom,” by Robert Kozma and Jerome Johnston (Change, January/February 1991). They present compelling evidence, drawn from a number of disciplines and a variety of campuses, about the role of information technology as a catalyst for (or enabler of) the qualitative enhancement of the learning experience. Kozma and Johnston identify seven ways, summarized below, that computing and information technology can be used in the transformation of teaching, learning, and the curriculum:
1) “From reception to engagement. The dominant model of learning in higher education has the student passively absorbing knowledge disseminated by professors and textbooks….With technology, students are moving away from passive reception of information to active engagement in the construction of knowledge.”
2) “From the classroom to the real world. Too often students walk out of class ill-equipped to apply their new knowledge to real-world situations and contexts. Conversely, too frequently the classroom examines ideas out of the context of gritty real-world considerations. Technology, however, is breaking down the walls between the classroom and the real world.”
3) “From text to multiple representations. Linguistic expression, whether text or speech, has a reserved place in the academy. Technology is expanding our ability to express, understand, and use ideas in other symbol systems.”
4) “From coverage to mastery. Expanding on their classic instructional use, computers can teach and drill students on a variety of rules and concepts essential to performance in a disciplinary area.”
5) “From isolation to interconnection. Technology has helped us move from a view of learning as an individual act done in isolation toward learning as a collaborative activity. And we have [also] moved from the consideration of ideas in isolation to an examination of their meaning in the context of other ideas and events.”
6) “From products to processes. With technology, we are moving past a concern with the products of academic work to the processes that create knowledge. Students…learn how to use tools that facilitate the process of scholarship.”
7) “From mechanics to understanding in the laboratory. The scientific laboratory is one of the most expensive instructional arenas in the academy. It is costly to maintain…and to provide supervision to student scientists. It is also limited as a learning experience. So much time is required to replicate classic experiments…that there is little time left for students to explore alternative hypotheses as real scientists do” (pages 16-18).
There are many ways that information technology can enhance the undergraduate curriculum and the student learning experience. The key issue, of course, is the effective use of information technology resources as tools to support instruction and learning outcomes.
There are now good examples to document the successful use of computer software to improve the quality of learning and teaching in each of the categories described above. Kozma and Johnston report from their work with the faculty who received national recognition from the EDUCOM/NCRIPTAL Higher Education Software Awards Program that most award winners needed five to seven years to develop their own instructional applications. While students in their classes benefited significantly from efforts of individual faculty to develop new instructional software, Kozma and Johnston report that overall, there was minimal dissemination and adoption of this work: comparatively few students, courses, or other institutions outside of the faculty developers’ classes ever benefited from their efforts.
Let’s focus for a moment only on the classes where faculty used the EDUCOM/NCRIPTAL award software to support instruction. Were it possible to accurately calculate (or even estimate) the increases in student learning linked to these instructional resources, the “productivity gains” for individual students would probably produce impressive numbers. Students in these classes generally were not required to pay additional fees or invest much additional time, but they were enabled to learn more – to learn it faster, better, and more comprehensively. Improved outcomes, divided by stable costs, generated increased productivity.
However, the calculation will not show such gains for any broader universe – even at the level of the individual course. This second set of calculations would have to include total development costs – the faculty time and related institutional support. This more accurate assessment of total costs dramatically reduces the cost-benefit ratio. Moreover, the development cost per student remains quite large when the total is spread over only a few hundred or even a few thousand students.
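A back-of-the-envelope contrast makes the point. The figures below are purely hypothetical and are not drawn from the EDUCOM/NCRIPTAL awards; they simply show how development costs change the ratio.

```python
# Hypothetical contrast between "learner productivity" and course-level productivity.
# All figures are illustrative assumptions, not data from the awards program.

# The individual student's view: outcomes improve while the student's own costs stay flat.
learning_gain = 1.25         # assume students learn 25 percent more with the software
student_cost_change = 1.00   # no extra fees, little extra time
student_productivity = learning_gain / student_cost_change   # 1.25 -> a 25 percent gain

# The course-level view: the same gain, but development costs now enter the denominator.
development_cost = 250_000        # assumed faculty time plus institutional support
students_served = 1_000           # assumed enrollment over the software's useful life
baseline_cost_per_student = 400   # assumed instructional cost per student before the software

cost_per_student = baseline_cost_per_student + development_cost / students_served
course_productivity = learning_gain / (cost_per_student / baseline_cost_per_student)

print(f"Individual productivity ratio:   {student_productivity:.2f}")
print(f"Cost per student: ${baseline_cost_per_student} -> ${cost_per_student:.0f}")
print(f"Course-level productivity ratio: {course_productivity:.2f}")
```

With these assumed numbers, the individual ratio improves by a quarter while the course-level ratio falls below 1.0 – the pattern described above.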
Labor-Market Preparation. There is no question that technology skills will be essential in ever-increasing portions of the labor market of the 21st century; the use of computer and other information technologies is becoming prevalent across all fields and occupations. Consequently, colleges and universities would be doing a major disservice to their students if they failed to provide appropriate opportunities (including structured curricular experiences) to develop and enhance information technology skills as part of the undergraduate experience.
In this context, one of the best statements about information resources comes not from an occupational task force or industry group, but from the American Library Association:
To be information-literate, a person must be able to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information…. Ultimately, information-literate people are those who have learned how to learn. They know how to learn because they know how knowledge is organized, how to find information, and how to use information in such a way that others can learn from them. They are people prepared for lifelong learning, because they can always find the information needed for any task or decision (Final Report, American Library Association Presidential Committee on Information Literacy, January 1989).
Information access, or information literacy (to use the ALA term), will be vital for the growing cadre of knowledge workers and professionals in the coming century; consequently, the challenges information technology poses cut across all academic disciplines and all occupational and professional fields. It is an issue higher education institutions across the United States cannot ignore – but one that many faculty have no idea how to address and for which few teaching materials have been designed. It is an area where communication, cooperation, and collaboration among faculty, faculty support staff, and librarians will be essential.
So there are good reasons – other than “institutional productivity” – for colleges and universities to invest in information technology for teaching and learning. These reasons, described above, can be explained in terms of productivity gains for individual faculty and, especially, for individual students. But these gains do not offer the kind of progress described earlier in Stage 3; and many colleges and universities confront pressures that force them to seek much greater gains (and the accompanying reduced costs).
So why are institutions not making more progress toward Stage 3?
Infrastructure and limitations in user support are the central issues that prevent colleges and universities from reaching Stage 3 in the educational use of information technology. Most campuses have barely begun to provide the capital investment – computers, telecommunications links, adequate technical support staff – required to support significant gains based on the effective integration of information technology. One example: when user support levels (personnel and dollars) are compared to widely cited standards for corporations, colleges and universities are often running at one-half to one-fifth, or less, of recommended levels. Some observers may (incorrectly) cite this as productivity, a case of doing more with less; rather, it is important evidence that many things are probably not being done well or right, or at all. Additionally, it highlights the often hidden, often unacknowledged – but nonetheless real – costs of user support as a key component of the overall costs of the institutional investment in information technology.
Moreover, many senior campus officials view the technology infrastructure – equipment, software, and support personnel – as a black hole for money. They also often view information technology as a centralized service (similar to the library) that is an easy target for budget cuts in times of financial difficulty. Additionally, technology resources are expensive yet have a short half-life, often less than 15 months. Most campuses do not have an amortization plan for acquiring and retiring needed equipment and software that becomes obsolete quickly. (See “Paying the Digital Piper,” by Kenneth C. Green, in this issue of Change.)
Many campuses fit a pattern that leads to a crisis in this area. Increasing investments (campus dollars or external grants) support the expansion of the information technology hardware base on campus – most recently, the campuswide network. Little attention is given to the accompanying increased demand for technical support personnel who keep the new additions functional. Meanwhile, more students and faculty notice the arrival of the gizmos, widgets, and stuff; they begin to ask “How can I use that new jack in the wall? When can I use this Internet I hear so much about? Who will train me to use Mosaic? How soon before I can bring this into my classroom?” These questions may be asked at the same time institutional financial pressures have led to a hiring freeze or staff reductions. Not only do faculty who depend on technical support staff for basic services find support less accessible, but faculty eager to explore information technology applications in their instructional and scholarly work also experience problems because the support staff who handle technical issues or training simply are not available.
So what happens then when institutional pressure increases to support distance education and other pedagogical and content changes? The need for additional faculty support services to facilitate these major transitions not only increases and becomes still more varied, but it often is recognized too late. Pedagogical change enabled by technology requires development services that help faculty understand, adapt, and adopt new teaching approaches. As many of the new pedagogical approaches rely on new ways for students to access new kinds of information resources, increased and more sophisticated library support services also are required.
The all-too-likely and unfortunate outcome: user support declines just as faculty interest and aspirations reach “critical mass.” The hopes for engaging, significant, and exciting changes in academic activities – enabled by technology – are unfulfilled as the gap widens between the level of support services needed and the level available. The institution remains stuck in Stage 1 or 2.
There is another important dimension to this discussion. Amidst all the conversation about using information technology to enhance instructional productivity, the “client’s” perspective seems missing. How much and what kinds of information technology do our clients – students – view as essential, beneficial, and/or convenient? And how much and what kinds of information technology do they view as serving faculty or institutional interests, rather than their own?
The 15 million students/clients enrolled in U.S. colleges and universities represent many different markets for educational training and services, ranging from full-time freshmen and medical students to part-time students in community colleges and in MBA programs. Instructional models and technologies that are appropriate for – and effective with – some populations or in some disciplines may not work well for others.
Some interesting innovations such as Mind Extension University (MEU) use cable to bring college courses into homes at all hours. MEU students can even tape the lectures for viewing at a more convenient time, much the way many traditional students might copy reserve reading assignments. But even with dramatic growth, MEU serves a very small percentage of the clientele for higher education in the United States. And part of MEU’s costs are leveraged because it distributes content – video courses – developed by faculty based at traditional campuses across the country. For MEU’s clients, cable offers added value: traditional content plus significant convenience. Similarly, for MEU’s suppliers (participating faculty and institutions), MEU’s distribution agreements also represent added value in the form of new markets and revenue that do not compete with core clients.
But the information technology that adds value for MEU’s clientele can mean inappropriate, impersonal (and potentially overpriced) instruction for an 18-year-old college freshman or an executive MBA student. In short, information technology is not a “one-context-serves-all” solution. Yet the kinds of productivity gains or cost savings promised by advocates require a mass, rather than a unique, application of the resource.
Those who believe that technology provides the “silver bullet” for productivity and quality should look at General Motors during the early ’80s. Seeking a quick fix to quality and productivity problems, GM invested heavily to bring technology to its manufacturing plants. By one estimate, GM spent $50 billion on automated assembly lines and robots. A decade later, GM could report only marginal gains in quality and productivity; moreover, GM continued to lose market share and fall behind on quality compared to its Japanese and U.S. competitors.
In contrast, a visit to Honda, Toyota, and Nissan plants in the United States, as well as to Ford’s assembly lines, reveals only average use of robotics in manufacturing and assembly – yet quality and market share have improved for these companies.
What happened? Why did GM fail to realize real benefits from its massive investment in technology? The problem was that GM poured money into technology but paid little attention to the overall design and manufacturing process: the new technology on the assembly line could not resolve problems in the product and in the workforce. The GM experience, widely cited in Total Quality Management circles, highlights the role of technology as one of many tools, rather than the tool, to enhance quality and improve productivity. At GM – as in higher education – the underlying process and employee participation are important: training, teamwork, design, client needs and requirements, and support services all affect outcomes – be it car quality or classroom learning.
The real long-term academic benefit of information technology will be what it brings to pedagogy and the curriculum – additional resources that enhance both the instructional tools used by faculty and the learning experience of students. Ample evidence documents the benefits for the learning experience. Technology provides access to image databases (satellite photos of the cosmos or the California coastline); statistical databases (such as Census data) that students can use for class projects; remote libraries (which supplement resources available from campus facilities); and more.
It is in this context that the information technology implementation efforts now under way at colleges and universities across the country pose two great risks:
1) Many institutions will follow GM’s path by focusing on technology, with inadequate attention to other components of the learning process. This will lead to marginal (if any) gains, great individual and organizational frustration, and ultimately, to unrealized potential.
2) Only a few institutions will have the vision, financial and personnel resources, and the commitment necessary to achieve the educational potential of information technology – providing access to superior learning options for students and new levels of faculty productivity.
Despite the time and money invested to date, colleges and universities are still in the “flat part of the learning curve” in the area of educational uses of information technology. Institutions, departments, and faculty are still experimenting with using familiar technologies in new and different ways, with both traditional and new clientele. Additionally, past experience suggests that new technologies always generate unanticipated applications – and benefits. In other words, the wisest technology advocate or planner cannot anticipate all the ways that new technologies might be used to enhance instruction and scholarship.
Colleges and universities still have much to learn about how to develop a new information technology infrastructure that provides instructional and curricular benefits. We must measure our great aspirations and institutional investments against what information technology can really provide, not what we hope (or fear) it might do.
For today and into the realistic future, most students at most colleges will continue to pursue campus-based and classroom-defined educational experiences. Technology notwithstanding, instruction in virtually all these classrooms will continue to depend primarily on faculty.
Clearly, information technology can support changes in the traditional faculty role. Growing numbers of faculty are gaining experience using the Internet for collaborative discourse in their disciplinary specialties and for pursuing their own research agendas; however, very few have begun to train their students in these same skills. (In many instances, students are training the faculty.) As noted above, navigating the network now provides access to resources previously unavailable to most students (and to many faculty). This shift reflects a dramatic change in the kinds of learning resources that can come into the classroom – dramatic shifts in the content available for instruction. And content – the material used to stimulate learning, analysis, synthesis, integration, and mastery – combined with new modes of communication, provides the new foundation for education’s great aspirations for information technology.
What information technology does best – or will do better as it improves – is deliver content and provide access to information and to other people. It allows students and faculty to find and manipulate information, to take new meaning from it, and to have new (learning) experiences. In the near term, however, the demand for faculty guidance and intervention – for faculty mentoring – is much more likely to increase than to decrease. Students will continue to need faculty to provide the conceptual framework and motivation that enable them to seek and integrate new information. They also will need someone to introduce them to the most effective ways to approach the Internet for the purposes of acquiring information for a particular academic discipline.
But what, then, about the new Grail of productivity? Unfortunately (or, fortunately!), there is little if any evidence that information technology will reduce faculty involvement in instruction (that is, reduce the cost of instruction) in the next few years. Technology enhances the content of the curriculum, the materials that faculty use as a catalyst for learning; it also enhances the options for communication with and among students. And technology can help faculty adjust the syllabus to respond to changing issues and environmental opportunities, be these the shifting maps of Eastern Europe or the human genome. Admittedly, there are exceptions, most of which center on “skill-heavy” applications: the use of computer-based instruction for introductory logic and other courses has effectively altered some components of the course syllabus and reduced the need for faculty involvement in some instances – but these courses are still quite the exception after 20 years of demonstrated effectiveness.
Perhaps the changing demographics of higher education’s clientele – the growing population of non-residential, part-time, older students – will continue to make distance education an attractive option. Consequently, some instructional uses of information technology will be more widely sought and acceptable. But, over the next decade, higher education will be shaped by two demographic forces: rising numbers of baby boomers who are coming back to campus for additional education, coupled with a tidal wave of the children of the baby boomers (Boom II), making their first appearance on campus. Parent and child will not necessarily want, need, or appreciate the same kind of instructional methodology.
So what do we advocate and what can we hope for? What are reasonable aspirations for information technology if campuses cannot realistically expect their technology investments to reduce the costs of instruction in the next few years?
We suggest that each college and university engage in an institutionwide planning initiative that looks carefully at the ways information technology can be used most effectively to improve teaching and learning. This review must also include a careful assessment of the full costs (hardware, software, faculty time, support services, etc.) and potential benefits of various alternatives. A promising framework for setting and achieving realistic goals in this arena recognizes four basic categories of information technology benefits that differ significantly in the kinds and levels of faculty support services required:
1) Personal and institutional administrative productivity;
2) Enhancing traditional teaching;
3) Changing pedagogy; and
4) Changing content.
The first category, personal productivity, reflects information technology applications that have no direct effect on the teaching or learning experiences of current faculty or students. Word processing, spreadsheets, and electronic mail are good examples of individual productivity. So, too, are automated course registration systems. These are examples of productivity gains that have minimal impact on what or how things are taught and learned. Faculty support services needed in this category are primarily technical – largely introductory training plus hardware and software maintenance.
Campus efforts to enhance traditional teaching (the second category) include the broad effort under way at campuses across the country to “wire” classrooms with computers and projection devices. Showing a rotating three-dimensional image of a molecule or a cross-section of a human brain adds to the quality of content and communication in class without necessarily changing the underlying teaching approach or curriculum. Faculty support services needed in this category are primarily technical: introductory training and hardware maintenance, plus discipline-specific or course-specific guidance about the availability and acquisition of computer-based materials.
Changing pedagogy, the third category, draws on many issues raised by Kozma and Johnston, cited above. Good examples here include supplementing science experiments with computer-based simulations (such as A.D.A.M.) or the use of networked computers to exchange drafts and editorial comments in English composition courses. Additionally, this category might also include distance education initiatives, which must acknowledge the difference between students being present in a conventional classroom and participating via telecommunications. Faculty support services needed in this category include the technical: introductory training and hardware maintenance. However, discipline-specific or course-specific guidance about the availability and acquisition of computer-based materials is essential. Also important is the availability of introductory and training materials and services to enable faculty members to understand how they can modify their course preparation and conduct. As faculty pursue these options more extensively, it also becomes necessary to involve other campus services, such as the library and bookstore, in helping to find, select, and make available – for convenient and effective student use – new combinations of traditional books and reprints and new media such as computer software, CD-ROMs, and Internet access.
The fourth category, changing content, represents the “final frontier.” In some disciplines, the use of information technology in research and field work already has changed how scholars think of their work and the focus of their activities (e.g., textual analysis, mathematical proof by exhaustion of all options). In other areas, scholars have discovered that information technology now permits them to represent and manipulate information and ideas in ways that were nearly impossible previously (e.g., geographical information systems, theatrical lighting designs, musical notation, and performance). In both cases, scholars believe they must teach the “new” material and that they must engage their students in using the relevant technology. In this context, information technology becomes both the reason for – and means of – changing curriculum content. Faculty support services needed in this category are even beyond those described for category 3. Faculty need to be working with publishers, disciplinary associations, and peer groups to revise the curriculum and make sure teaching and learning materials reflecting the changes in content and use of technology are appropriately available.
What about productivity? There is only one category in this framework – the first – where productivity gains (reduced costs, enhanced quality, broader access to learning) are readily conceivable. And these gains generally accrue to the individual, rather than the organization. The other categories offer improvements and increased value, but usually in conjunction with substantially increased investment and support costs.
Is there a current analogy for the case study of the calculator? No, although electronic mail comes very close. E-mail has the potential to enhance communication among faculty and between faculty and students. It supports a wide variety of instructional approaches. Moreover, in most instances and for most users, e-mail rides on technology costs that are already sunk: the marginal costs of adding electronic mail to a student’s or faculty member’s computer are minor, even minuscule.
There are already numerous examples of the creative use of e-mail that involve students and faculty who do not believe that their activities necessarily reflect a significant technological achievement. Rather, they believe that e-mail represents a modest but important gain, one that requires very little institutional service or support.
But this one example aside, there is as yet no more recent instructional equivalent of the calculator replacing the slide rule.
This review suggests that content, curriculum, and communications – rather than productivity – are the appropriate focus of – and rationale for – campus investments in information technology. But even if this argument is compelling, we must still be careful not to foster inappropriate expectations. Information technology enthusiasts must avoid irritating too many people by making too many promises that cannot be kept. Technology advocates must sustain the good will and realistic expectations of a wide audience of information consumers and providers so that all participants in the educational enterprise will be, at minimum, cautiously receptive when the next course-specific application of great potential comes along and when the first great across-the-board instructional tool appears.
There is also still much to learn about the costs and benefits associated with bringing a group of people into the same place (classroom) at the same time versus having them interact using computers, video, and telecommunications. Each institution, department, and faculty member must find the right balance in forming combinations of traditional practices and materials with new ones.
These are important issues for higher education. Advocates and evangelists must make promises carefully, managing both expectations and limited financial resources with great care. We must be honest with ourselves, our sponsors, and our clientele about the applications and limits of information technology. The academic enterprise can do great things with – and will experience significant benefits from – information technology. But it won’t be cheap, and it will not save money soon. The information technology investment, however, will make a qualitative difference in the way we teach, the materials we teach with, the structure of the college curriculum, the learning experience for students, and how we exchange information – both with colleagues and as faculty interacting with students. Goals and aspirations for quality and academic productivity will be achieved only over many years and through developing and providing the right combinations of
* individualized, interactive access to information for students and faculty;
* guidance and support for and from faculty; and
* social context within an educational institution or a “learning community.”
These are not small challenges. But they are also important challenges that higher education must address with realistic expectations and objectives, mindful about real costs, attentive to the capacity to deliver, and focused on the needs of an increasingly heterogeneous clientele in a rapidly changing world.
RELATED ARTICLE: WHAT DOES IT COST?
What does it cost to support instruction in a classroom? Much depends on the kinds of “technology” in use.
White board (4' x 8')                                              $150
Overhead projector                                                 $350
20-inch TV with VCR                                                $600
Multimedia computer with projection via overhead projector      $2,700
Multimedia computer with projection via overhead RGB/TV device  $4,700
20-unit networked computer lab with projection device          $60,000
Of course these costs reflect just the initial purchase price, not sustained support and operating costs, which vary tremendously. For example, the three-year costs for a 4' x 8' white board might be $100 or $150, mostly for markers, erasers, and cleaning fluid. In contrast, the three-year costs for a single-unit, multimedia computer/projection system could easily surpass purchase costs, given software upgrades and some technical assistance. In all instances, the three-year costs approach – and can easily exceed – the initial purchase price.
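A minimal sketch of the comparison follows, using the purchase prices from the table above and assumed annual operating figures.

```python
# Three-year total cost sketch for two classroom "technologies."
# Purchase prices come from the table above; the annual operating figures are assumptions.

def three_year_total(purchase, annual_operating, years=3):
    """Initial purchase price plus operating and support costs over the period."""
    return purchase + annual_operating * years

whiteboard = three_year_total(purchase=150, annual_operating=40)       # markers, erasers, cleaner
multimedia = three_year_total(purchase=2_700, annual_operating=1_000)  # upgrades, technical help

print(f"White board, three-year total:         ${whiteboard:,}")   # $270
print(f"Multimedia computer, three-year total: ${multimedia:,}")   # $5,700
```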
One other important point: the half-life of the white board is much longer than the half-life of the multimedia computer system or a 20-unit instructional computer lab. This, too, affects the overall, multi-year costs.
RELATED ARTICLE: A.D.A.M. – THE COST OF CURRICULUM TOOLS
A.D.A.M. provides a relevant example of a relatively successful commercial instructional software package. It is a multimedia, CD-ROM-based set of programs in human anatomy and physiology that is widely used to supplement instruction in biology, anatomy, and related courses, even in medical schools. The software permits students to “explore” and “see” a human body in ways that are not otherwise possible – not even through working with a cadaver.
Faculty who have adopted A.D.A.M. report that their students are learning more, better, easier, faster – at a cost per student that is competitive with most textbooks. It is obvious that the “student productivity,” or “learner productivity,” has increased for those who use A.D.A.M.
But A.D.A.M. is not cheap: it is priced for institutional purchase rather than for student resale. According to the co-producers, A.D.A.M. Software, Inc., of Atlanta and Benjamin/Cummings Publishing of Redwood City, the development effort required four years, 15 medical illustrators, 10 computer programmers, and over $6 million.
Was this a cost-effective investment for the developers? Probably so, since A.D.A.M. is broadly viewed as a quality product and, despite the price ($800-$1,300), is widely used in undergraduate and medical education. The size of the market, the acceptable price, and the development cost seem to be in balance, slightly in favor of the publishers. There are not yet many other examples of instructional software for which this is true.
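A rough break-even calculation suggests why that balance is delicate. The development cost and price range below come from this account; the assumption that half of each sale is available to recoup development is purely illustrative.

```python
# Rough break-even sketch for the A.D.A.M. development investment.
# Development cost and prices come from the text; the per-copy margin is an assumption.

development_cost = 6_000_000
price_low, price_high = 800, 1_300
margin = 0.5  # assume half of each sale covers development; the rest, production and distribution

units_at_high_price = development_cost / (price_high * margin)
units_at_low_price = development_cost / (price_low * margin)

print("Institutional sales needed to recover development costs:")
print(f"  at ${price_high:,} per copy: about {units_at_high_price:,.0f}")
print(f"  at ${price_low:,} per copy: about {units_at_low_price:,.0f}")
```

On the order of ten thousand sales would recover the investment – conceivable for a product used across undergraduate and medical education, but far beyond the reach of most single-course software.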
But it is also hard to argue that any single college or university has achieved more than a negligible gain in productivity by using A.D.A.M.: the number of students taught in these courses has not increased; tuition has not increased; nor have the number, kinds, and combination of faculty teaching these courses using A.D.A.M. changed. There may, however, be some marginal gains in the reduced number (and expense) of certain laboratory requirements that have been replaced by student use of A.D.A.M.
The point is that while A.D.A.M. represents a worthwhile investment, a commercial success, and an innovative application, it has not been responsible for instructional productivity.
RELATED ARTICLE: INFORMATION TECHNOLOGY AND THE CORPORATE QUEST FOR PRODUCTIVITY
In case you missed it, the Industrial Age passed into the Information Age in 1991: that’s the year, according to a recent report in Fortune, that corporate spending on information technology surpassed corporate investment in manufacturing technology.
But has the corporate investment spurred great gains in productivity? Alas, no, much to the surprise (and disappointment) of many analysts. Several recent studies suggest that the contribution of information technology to corporate productivity has been marginal at best. One example: Harvard’s Gary Loveman, studying manufacturing companies in the United States and Western Europe, reports that “information technology capital had little, if any marginal impact on output or labor productivity, whereas all other inputs into production – including non-information technology capital – had significant positive impacts.”
Why no big productivity bang for the bucks? As reported recently in Business Week (January 16, 1995), research by two economists – Daniel E. Sichel at the Brookings Institution and Stephen D. Oliner at the Federal Reserve – reveals that computers and peripheral equipment contributed at best only one-tenth of the growth in business output between 1987 and 1993. Sichel and Oliner speculate that the reason computers contribute so little to growth, despite what Business Week calls the “digital explosion” of the past 15 years, is that the installed base of computers represents a tiny share – just 2 percent – of the nation’s total capital stock.
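The logic behind that finding can be sketched with a simple growth-accounting identity. The 2 percent capital-stock share comes from the figure cited above; the remaining inputs are illustrative assumptions, not Sichel and Oliner’s estimates.

```python
# Growth-accounting sketch: why a small capital share caps computers' contribution to growth.
# The 2 percent share of the capital stock comes from the text; the other inputs are assumptions.

capital_income_share = 0.30       # assumed share of output paid to all capital
computer_share_of_capital = 0.02  # computers as a share of the total capital stock (from the text)
computer_income_share = capital_income_share * computer_share_of_capital  # about 0.6% of output

computer_capital_growth = 0.20    # assume computer capital services grow 20 percent per year
output_growth = 0.02              # assume business output grows about 2 percent per year

contribution = computer_income_share * computer_capital_growth  # in percentage points of growth
share_of_growth = contribution / output_growth

print(f"Computers' contribution: about {contribution * 100:.2f} percentage points per year")
print(f"That is roughly {share_of_growth:.0%} of a {output_growth:.0%} annual growth rate")
```

Even with rapid growth in computing power, a 2 percent slice of the capital stock can move total output only modestly.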
The two economists also note that the short half-life of information technology resources, coupled with corporate amortization policies, also reduce economic growth linked to technology investments. Says Sichel: “I’m not suggesting computers haven’t brought about efficiency gains for individual corporations. It’s just not the story for the economy as a whole.” Sound familiar?
Kenneth C. Green is professor-in-residence of higher education at the University of Southern California. Steven W. Gilbert is director of technology projects for the American Association for Higher Education. Their 1986 Change article, “The New Computing in Higher Education,” was recently republished in The Best of Change (July/August, 1994). This work was supported, in part, by grants from the Spencer Foundation, the Lilly Endowment, the AT&T Foundation, Apple Computer, Inc., and the National Association of College Stores. The authors retain the copyright for this article, which is partially based on their article in the January/February issue of Academe. They wish to thank Ted Marchese for his thoughtful and clarifying comments on an earlier draft of this article.