The Relationship between Department Rank and College Rank in Engineering Graduate Program Rankings Conducted by U.S. News and World Report

Vojak, Bruce A

ABSTRACT

College rankings conducted by various popular magazines have generated both considerable interest and controversy. In this work, we present statistical analyses of thirteen years of U.S. News and World Report graduate program “reputation” rankings for engineering colleges and their constitutive departments, using them to reveal the relationship between department rank and college rank. Two important trends are substantiated in this study. First, we confirm statistically that some colleges with a relatively small number of top-five ranked departments place higher in the college rankings than some colleges with a significant number of top-five ranked departments. Second, we observe that college rank is much more closely related to department rank for some disciplines than others, providing additional resolution beyond our earlier work.

I. INTRODUCTION

College rankings conducted by various popular magazines have generated both considerable interest and controversy over the past decade. However, while certain groups of prospective students increasingly rely on such rankings [8], a number of academics openly question their validity. Many express concern with the magazines’ choices of measures and quantitative formulas used to obtain the “overall” rank for a college [1,3,5-7,12], as well as with their peers’ choices of data supplied to the magazine conducting the ranking [4,11,13,14].

In the specific case of the U.S. News and World Report rankings, programs at both the college and department levels are ranked. For the purposes of this work, we use the following terminology: “departments” are the specialty units in various “disciplines” (e.g., the disciplines of electrical engineering and mechanical engineering), “colleges” are the broader units (focusing on engineering in this case) that are comprised of multiple departments, and “schools” are the broadest unit of activity comprising other colleges in addition to a college of engineering. Note that when we identify the institution from which a college or department comes, we refer to this larger institution as the “school.”

For the rankings of engineering colleges by U.S. News and World Report, the “overall” rank for a college currently is calculated by using a quantitative formula that incorporates the following measures and weighting [15].

Reputation (40%)-Measured by separate surveys of both academics and corporate recruiters.

Student Selectivity (10%)-Measured by GRE quantitative and analytic scores, as well as by the proportion of applicants accepted.

Faculty Resources (25%)-Measured by student-to-faculty ratios, proportion of faculty in the National Academy of Engineering and with doctorates, and the number of Ph.D. degrees granted in the prior year.

Research Activity (25%)-Measured by total research expenditures and research dollars per faculty member.

In recent years, U.S. News and World Report assessed 185 engineering colleges in its annual survey [15].
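For concreteness, the weighted combination described above can be sketched in Python. The measure values and the zero-to-100 normalization here are hypothetical illustrations, since the magazine does not publish its raw inputs or scaling; only the weights come from the text.

```python
# Weights from the published methodology [15].
WEIGHTS = {
    "reputation": 0.40,
    "student_selectivity": 0.10,
    "faculty_resources": 0.25,
    "research_activity": 0.25,
}

def overall_score(measures: dict) -> float:
    """Weighted sum of measure scores, assuming each measure has
    already been normalized to a common 0-100 scale (an assumption;
    the magazine's actual normalization is not published)."""
    return sum(WEIGHTS[k] * measures[k] for k in WEIGHTS)

# Hypothetical college with pre-normalized measure scores:
example = {
    "reputation": 90.0,
    "student_selectivity": 80.0,
    "faculty_resources": 70.0,
    "research_activity": 85.0,
}
score = overall_score(example)  # 0.4*90 + 0.1*80 + 0.25*70 + 0.25*85
```

The point of the sketch is simply that the “overall” rank is formula-driven: changing either the weights or the normalization reorders colleges without any change in the underlying data.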

Note that this “overall” rank differs from the “reputation” rank by academics. The “reputation” rank by academics is based on a survey of deans, program directors, and senior faculty who are asked to judge the overall academic quality of engineering colleges on a scale of one (“marginal”) to five (“distinguished”) [15].

Further, individual departments within several disciplines are also ranked. These department rankings are conducted in a manner similar to the reputation survey conducted at the college level. Deans, program directors, and senior faculty are asked to nominate up to ten top schools in each discipline, with the magazine publishing the list of schools with the highest number of nominations in each discipline [15].

It is important to note that in all cases of “reputation” ranking, the academic leaders responding to the survey use their own methods, which may amount to heuristic formulas, to select top-ranked colleges and departments based on personal perceptions of the relevant criteria.

The present work seeks to circumvent the confounding nature of the “overall” rank by limiting the study of college rankings to the “reputation” rankings by academics, rather than using the “overall,” formula-driven rankings. By doing so, we can eliminate data value and formula bias [10] and reflect back to academics their perceptions regarding the quality of various programs, not those of the formulas chosen by a non-academic journalist.

In this work, we present statistical analyses of thirteen years of U.S. News and World Report graduate program “reputation” rankings for engineering colleges and departments, using them to reveal the relationship between department rank and college rank. In Section II, we explore the relationship between the number of top-five department rankings and college ranking, and in Section III, we address the question of whether college rank is more closely related to department rankings for certain disciplines than for others.

II. TOP-FIVE ANALYSIS

It has been noted that either the ordinary or weighted sum of departmental “reputation” rankings may not necessarily be consistent with the “overall,” formula-driven ranking of the colleges they comprise [2]. That is, some colleges with a smaller number of top-five reputation departments rank higher in the “overall” ranking formula than other colleges with a larger number of top-five reputation departments. This is shown graphically in Figure 1 for the number of appearances in the top-five department lists for each of the top ten colleges of engineering (based on “overall” rank) [16]. Rankings for eleven disciplines were provided in the published data, shown here for 2001.

These data would indicate that there are inconsistencies between the number of departments a college has appearing in the top-five lists and the college’s “overall” rank, since a monotonically decreasing number of top-five department appearances is not observed as “overall” rank degrades. In order to determine the significance of this result, we can turn to a statistical analysis for insight.

Modeling the number of top-five appearances as a binomial random variable, we can display (Figure 2) the 95 percent confidence intervals for the number of times that each of the top-ten colleges of engineering has departments appearing in the top-five lists. The data of Figure 2 provide a measure of the uncertainty attached to conclusions that might be drawn from Figure 1, where the college with “overall” rank 5 is apparently inconsistent with colleges of ranks 1, 2, 3, and even 6. The confidence intervals based on one year’s data tend to overlap in Figure 2, making the apparent inconsistency less striking. Also, note that the confidence-interval approximation loses accuracy unless the number of top-five appearances exceeds five, so it is appropriate only for the four colleges with six or more top-five departments.
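As a sketch of the statistical machinery, the normal-approximation (Wald) confidence interval for a binomial count can be computed as follows. The counts used here are illustrative, not taken from the published rankings.

```python
import math

def topfive_ci(successes: int, trials: int, z: float = 1.96):
    """95% Wald confidence interval for a binomial count of top-five
    appearances. The normal approximation is adequate only when the
    number of successes is not too small (the text uses > 5)."""
    p = successes / trials
    half = z * math.sqrt(p * (1.0 - p) / trials)
    # Scale the proportion interval back to a count, clipped to [0, n].
    return (trials * max(p - half, 0.0), trials * min(p + half, 1.0))

# Illustrative college: 8 of its 11 ranked departments made a top-five list.
lo, hi = topfive_ci(8, 11)
```

With only eleven trials the interval spans roughly five to eleven appearances, which is why single-year intervals in Figure 2 overlap so heavily; the same computation with the thirteen-year sample (n = 155) narrows considerably.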

By increasing the sample size sufficiently, it is possible to obviate the difficulties encountered with drawing conclusions from the data of Figures 1 and 2. Therefore, we expand our analysis to all thirteen years of U.S. News and World Report graduate program ranking data currently available for engineering colleges and departments. Taking thirteen years of data increases the number of trials (sample size) to n = 155 from n = 11. Note that twelve disciplines were ranked in twelve of the years, and only eleven disciplines in the remaining year. Also, department rankings for ten disciplines appear in all thirteen years of data, and rankings for two of the three other disciplines appear in each of twelve years.

The data of Table 1 are based on the compilation of all thirteen years of U.S. News and World Report rankings of engineering colleges and departments. In this table, colleges are presented in the order of average “reputation” rank calculated over all thirteen years. In Table 1 and Figure 3 the 95 percent confidence intervals for relative frequency of appearance in the top-five lists are presented.

Several interesting trends can be observed in the data of Table 1 and Figure 3.

First, there are a number of colleges that follow a general trend whereby a larger number of top-five department appearances corresponds to a higher average college ranking. We observe that the colleges ranked #1 (MIT), #2 (Berkeley), #3 (Stanford), #5 (Illinois), #6 (Michigan), #10 (Purdue), #11 (Texas), and #13 (Wisconsin) all have departments appearing in the top-five lists with this type of relationship. An elongated ellipse labeled “A” is drawn in Figure 3 to enclose these colleges. The #9-ranked college (Georgia Tech) sits statistically on the edge of this group.

The colleges ranked #4 (Cal Tech), #7 (Cornell), and #8 (Carnegie Mellon) are enclosed within the ellipse labeled “B” and are positioned below the group A colleges in Figure 3. The #9-ranked college (Georgia Tech) seems to be on the borderline between groups A and B, with the limits of its confidence bounds residing near each group. Each of these colleges in group B has an average “reputation” rank in the top ten but with relatively few departmental appearances in the top-five lists.

Finally, there are a number of colleges that have some top-five departmental appearances, but whose average “reputation” ranking is not in the top ten. These colleges are identified within the ellipse labeled “C” in Figure 3 (including Johns Hopkins, Northwestern, Minnesota, Penn State, Washington, Texas A&M and California-San Diego).

The identity of the colleges in groups A and B does not reveal a reason why the college ranking would be inconsistent with the frequency of top-five departmental ranking. One might consider a hypothesis that the colleges in group B have extraordinary departments that appear in the top-five lists, and thus compensate for the relatively low number of top-five department appearances. This is not supported by the facts: the three colleges of group B have yielded only three first-ranked departments over the thirteen-year period for which the rankings have been conducted, not an indication of extreme strength. Also, the statistical difference between the frequencies of appearances in the top-five departmental lists of the group B colleges and group A colleges is striking; even the high ends of the confidence intervals for the three group B colleges fall at or below the 20 percent frequency of top-five departmental rankings.

A second hypothesis for the differences between the two groups could involve a special accounting for colleges that do not have departments in all disciplines. If, for example, a college was lacking an agricultural engineering department or had an industrial engineering department embedded in a larger unit (e.g., in mechanical engineering), it might be concluded that the reduced opportunity of making the top-five department list could account for the discrepancy between college and department ranking. The difficulties encountered by comparing this hypothesis with the data are four-fold.

First, the frequency of appearance of the group B colleges in the top-five department lists is typically a factor of four or more lower than that of the comparably ranked group A colleges. However, the three colleges of group B have numbers of programs available for ranking comparable to top-ranked MIT (based on a review of these schools’ web sites, of the thirteen possible ranked programs, MIT has 10, Cal Tech has 9, Cornell has 11, and Carnegie Mellon has 8). Second, since this study of rankings is based on perceptions, the survey participants can accommodate variations in the exact organizational relationship between units in making their judgments. Third, as noted earlier, the three colleges of group B have yielded only three first-ranked departments over the thirteen-year period for which the rankings have been conducted, not an indication of extreme strength. As such, for colleges with some missing departments, one would expect their other, overwhelmingly dominant departments to appear as first-ranked on a regular basis if they were to contribute so significantly to the college rank. Finally, in one sense, it could prove an advantage to have fewer departments, since resources can be more concentrated, creating increased potential for top ranking.

Based on this analysis, it is concluded that the “reputation” rank for the colleges of group B exceeds the reputations of their constituent departments. While it has not been quantitatively analyzed, the extent of the discrepancy makes it unlikely that discipline-based variations (i.e., that departments shown to have a high impact on college ranking are sufficiently highly ranked to result in a high college ranking in spite of the fact that relatively few departments are highly ranked) [16] could account for the majority of this effect. Thus, the originally stated observation [2] (that some colleges with a relatively small number of top-five departments are found to rank higher than other colleges with a relatively large number of top-five departments) is substantiated statistically when the full range of thirteen years of ranking data is reviewed.

Taking this analysis a step further, it is interesting to re-order the list of colleges based on their frequency of appearance in the top-five departmental lists; the data appear in Table 2 and the confidence intervals are provided in Figure 4. In this form it is apparent that two statistically distinguishable groups of engineering colleges can be seen, based on top-five department rankings. There are five colleges whose frequency of top-five department rankings is highest and whose lower confidence bounds do not overlap those of the rest of the colleges with lower frequencies of top-five department appearances.

III. TRANSITION PROBABILITY ANALYSIS

Having substantiated in the previous section the premise that the number of top-five departmental rankings does not correspond statistically with ordinal rank of the colleges, we now hypothesize that at least some of the variation is due to differential contribution of the various disciplines to the college ranking, i.e., that departments from certain disciplines contribute preferentially to the college rank.

One means of identifying the perceived contributions of departmental rankings to college rank is to observe “transitions” from department rankings in a given discipline to their respective college rankings [16]. Shown schematically in Figure 5, the department “reputation” rankings for a given discipline and the college academic “reputation” rankings can be depicted by a series of buckets labeled according to the relative rank they represent. The relationship between these two sets of rankings can be represented as a series of transitions from one set of buckets to the other.

Consider the situation where the #1, #2, #3, #4, and #5 ranked departments in the electrical engineering discipline come from the #2, #4, #3, #8, and #6 ranked engineering colleges. The various buckets and transitions for this example are shown schematically in Figure 5.

A discrepancy value, Delta, can be defined for each transition. Delta is simply the difference between the college rank and the department rank for a given school in a given discipline. Thus, for the previous example, where the #1, #2, #3, #4, and #5 ranked departments in the electrical engineering discipline come from the #2, #4, #3, #8, and #6 ranked engineering colleges, Delta values of +1, +2, 0, +4, and +1 are observed for the electrical engineering discipline in a given year. Note that Delta can range in value from -4 (the 5th-ranked department in a discipline corresponding to the 1st-ranked college) to something greater than +25 (the 1st-ranked department in a discipline corresponding to an extremely low-ranked college). Since in some early years U.S. News and World Report identified roughly the top 25 colleges ranked by reputation, any college not on this list can be viewed as transitioning to the 25-and-over ranked college bucket.
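The Delta computation for one discipline in one year follows directly from the definition; the ranks below are the electrical engineering example from the text.

```python
def discrepancies(college_ranks):
    """Delta values for the top-five departments in one discipline:
    college rank minus department rank, with the departments listed
    in rank order 1 through 5."""
    return [college - dept
            for dept, college in enumerate(college_ranks, start=1)]

# Departments ranked #1..#5 come from colleges ranked #2, #4, #3, #8, #6:
deltas = discrepancies([2, 4, 3, 8, 6])  # [1, 2, 0, 4, 1]
```

A positive Delta means the college ranks below the department (a loss of position); Delta = 0 means the two rankings agree exactly.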

We can use this approach to observe all transitions from all top-five departments in a given discipline to the ranking of their respective college. For example, this is shown graphically in Figures 6 and 7 for the electrical engineering and civil engineering disciplines, respectively. In each figure, the frequency of each top-five department-to-college transition is plotted as a function of transition (Delta) for all thirteen years of data. Here n = 65 since there are five top-five departments listed for each discipline for each of thirteen years.

It is interesting to note the general differences in distribution for these two disciplines. The electrical engineering discipline distribution is relatively narrow, uni-modal, and symmetric in nature, nearly centered on the Delta = 0 transition. Conversely, the civil engineering discipline distribution is relatively wide and somewhat irregular, as well as centered around a Delta > 0 transition, which is consistent with a general loss of position between the department ranking and the college ranking for that discipline. It is reasonable to conclude, then, that departmental rankings from the electrical engineering discipline are more closely indicative of the college rankings than those of the civil engineering discipline.

We can now explore the statistical evidence to determine to what extent there is a difference in how departmental rankings of various disciplines relate to the college rankings.

To proceed with this analysis, we focus our attention on the set of transitions for which Delta = -1, 0, +1. These are the transitions for which department and college rankings are most alike. Thus, to the extent that Delta = -1, 0, +1 transitions occur frequently, the departmental ranking is a reasonably good indicator of the college ranking.
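A sketch of this computation, treating each transition with Delta in {-1, 0, +1} as a binomial “hit”; the list of Delta values below is synthetic, not the published data.

```python
import math

def transition_ci(deltas, window=(-1, 0, 1), z=1.96):
    """95% confidence interval for the probability that a top-five
    department's Delta falls in the given window, modeling the hit
    count as a binomial random variable."""
    n = len(deltas)
    hits = sum(1 for d in deltas if d in window)
    p = hits / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return (max(p - half, 0.0), min(p + half, 1.0))

# Synthetic record for a hypothetical discipline (the paper uses n = 65):
deltas = [0, 1, -1, 0, 2, 0, 1, 6, -1, 0]
lo, hi = transition_ci(deltas)
```

The same function applied with a different window, e.g. all Delta > 5, gives the complementary “very weak agreement” probability used later in the analysis.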

Using the properties of a binomial random variable, the 95 percent confidence intervals for the probability of these transitions (Delta = -1, 0, +1) for each of the eleven disciplines that appear in nearly every year of U.S. News and World Report’s published data between 1990 and 2002 are shown graphically in Figure 8. These disciplines can be broadly categorized into five discipline clusters with successively decreasing relationship between departmental ranking and college ranking:

cluster A’-mechanical and electrical

cluster B’-aeronautical and computer

cluster C’-materials, chemical, nuclear, and environmental

cluster D’-civil

cluster E’-industrial and bioengineering

The departmental rankings of cluster A’ disciplines are statistically more closely related to college rankings than those of any other discipline. Further, the departmental rankings of cluster E’ are statistically the least closely related to college rankings. The differences between the departmental rankings and college rankings of cluster B’ and cluster C’ disciplines are indistinguishable, as are those of cluster C’ and cluster D’ disciplines. However, the difference between the departmental rankings and college rankings of cluster B’ and cluster D’ disciplines can be distinguished statistically.

We can expand this analysis to include other types of transitions. For example, we can analyze the Delta > 5 set of transitions, where the relationship between college and departmental rankings is very weak. Delta > 5 implies that none of the departments with a top-five ranking appear in the top five colleges of engineering, since for Delta > 5 even the first-ranked department in a discipline represents a seventh-ranked, or lower, college of engineering. Ninety-five percent confidence intervals for the Delta > 5 set of transitions for thirteen years of ranking data are shown in Figure 9. As with the Delta = -1, 0, +1 transitions, the mechanical and electrical engineering disciplines cluster at one end of the spectrum of results, while industrial engineering and bioengineering cluster at the other end. Some ordinal changes in disciplines relative to the Delta = -1, 0, +1 transitions are observed. For example, the civil engineering discipline yields better agreement (between department and college rankings) than chemical engineering for the Delta > 5 set of transitions, while it yields statistically worse agreement than chemical engineering for the Delta = -1, 0, +1 transitions.

One way to display the additional insight gathered by observing multiple sets of transitions is to map two sets of transitions at once. For example, we can simultaneously observe the Delta = -1, 0, +1 set of transitions and the Delta > 5 set of transitions in the map shown graphically in Figure 10. The Delta = -1, 0, +1 transitions are plotted along the x-axis, while the Delta > 5 transitions are plotted along the y-axis. Since the sum of the frequency of these two transitions cannot exceed 100 percent, the upper right-hand region of this map is labeled as “forbidden”.
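The map coordinates for one discipline can be sketched as follows. Because the two windows count disjoint events, a point can never land in the forbidden region; the list of Delta values is synthetic.

```python
def map_point(deltas):
    """Map one discipline's transitions to the coordinates of the
    two-axis map: x = frequency of Delta in {-1, 0, +1} (close
    agreement), y = frequency of Delta > 5 (very weak agreement)."""
    n = len(deltas)
    close = sum(1 for d in deltas if d in (-1, 0, 1)) / n
    far = sum(1 for d in deltas if d > 5) / n
    # Disjoint windows: the point stays out of the "forbidden"
    # region where x + y would exceed 1.
    assert close + far <= 1.0
    return close, far

# Synthetic 10-transition record for a hypothetical discipline:
x, y = map_point([0, 1, -1, 0, 2, 0, 1, 6, -1, 0])
```

Disciplines near the lower right of the map track their colleges closely; those near the upper left track them poorly; points near the lower left indicate a distribution concentrated at intermediate Delta values.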

The data from seven disciplines of Figures 8 and 9 are mapped graphically in Figure 10 in order to reveal distinctions in the relationship between college ranking and department ranking.

First, the mechanical and electrical engineering disciplines clearly occupy the lower-right-hand corner of the map, where close relationships between department and college rankings are found. Next, the apparent reversal of order between the chemical engineering and civil engineering disciplines (when different transitions are analyzed) can be more clearly resolved and interpreted with this map. The chemical engineering discipline is found to reside more in the center of the map (representing a bi-modal relationship between department and college rankings: some departments very close to their college rankings and some very far), while the civil engineering discipline is found to reside closer to the lower-left-hand corner of the map (representing a distribution in which some departments are slightly better ranked than their colleges and some slightly worse). As such, the change of order between these two disciplines reflects the differing shapes of their transition distributions. Finally, additional resolution is observed in the upper-left-hand corner of the map (representing the poorest relationship between department and college rankings) between the industrial engineering and bioengineering disciplines, with industrial engineering statistically exhibiting a closer relationship between college and department rankings than bioengineering.

IV. OBSERVATIONS

From the top-five analysis, it is clear that there are statistically significant discrepancies between college rankings and department rankings for engineering graduate program rankings conducted by U.S. News and World Report.

It was found that a set of colleges exists for which an increasing frequency of top-five department rankings corresponds to a high college ranking; however, there are several colleges for which this relationship does not hold. Due to the relatively small number of top-five department appearances of this latter group of colleges, it is apparent that discipline-related differences cannot account for this discrepancy. Thus, the rankings for these colleges seem statistically disproportionate to the rankings of the departments that comprise them.

Once the existence of such inconsistencies is demonstrated quantitatively, members of our profession should be motivated to understand them and their implications. Possible sources of rational explanation, not accounted for in departmental rankings, include college-wide programs and programs outside of the colleges for these universities that might enhance the overall reputation of these highly ranked colleges.

It is very likely that the inconsistencies are a result of many years of effective reputation building by these colleges. At some level, this result of the top-five analysis is already internalized within our profession, as evidenced by the flow of promotional literature distributed by colleges of engineering shortly before the U.S. News and World Report annual call for rankings.

Turning to the transition probability analysis, clearly there are some very strong relationships observed between department rankings and college rankings, with the mechanical and electrical engineering disciplines having the strongest relationships to college ranking.

It is interesting to note that these two disciplines often have the largest numbers of students and faculty and have correspondingly large research budgets, both currently and integrated over time. The only other discipline that rivals these two for current size and dollars would be computer engineering, which is often operated jointly with electrical engineering or computer science. Computer science resides within some colleges of engineering but certainly not all of them.

Thus, it is plausible that the disciplines that generate the most alumni and research funding are also those that produce the largest number of engineering academic leaders who participate in these surveys. While we have not studied this quantitatively, one could argue that some of the observed results could be accounted for in this manner.

Further, it could also be argued that, given their size, these two disciplines represent the largest number of technical contributions to society in recent history. As such, the visibility of departments in these disciplines might easily draw attention to the engineering colleges that they comprise.

Inversely, disciplines representing typically two of the smallest departments in an engineering college, industrial engineering and bioengineering, exhibit the weakest relationships between departmental ranking and college ranking. This adds further credibility to the argument that discipline size, both current and integrated over time, is a key factor in determining which disciplines have the greatest impact on college ranking.

Note, however, that there are exceptions to this trend. Aeronautical engineering departments, for example, typically are relatively small. However, it is a discipline for which department rank is relatively closely related to college rank. Also, civil engineering, a discipline typically represented by larger departments, exhibits department rankings that are relatively less related to college rankings.

Obviously other factors contribute to the perceived reputation of an engineering college. One of the factors not explored in this research is the impact of the reputation of the overall university on the reputation of the college. Just as there is a relationship between college and department rankings, one would suspect that there is a relationship between college and university rankings. This question was not explored in the current paper but would be an interesting future study.

V. CONCLUSION

This study provides statistical insight into the collective perceptions of the engineering academic community nationally regarding the relationships between college and departmental rankings. The top-five analysis demonstrated that there are statistically significant discrepancies between college rankings and department rankings for engineering graduate program rankings conducted by U.S. News and World Report. The transition probability analysis demonstrated that college rank is more closely related to department rank for some disciplines than others, providing additional resolution beyond our earlier work. Moving beyond the findings, the important issue to understand is the manner in which decisions are made based on these perceptions. Hopefully, these analyses will challenge us to continue asking what makes a great engineering college.

ACKNOWLEDGMENT

The authors wish to acknowledge the assistance of Fernando Diaz in the collection and analysis of data for this study.

REFERENCES

[1] Casper, G. 1996. Letter to James Fallows. Accessed October 16, 2002.

[2] Director, S. 2001. University rankings and the resulting implications on engineering schools in the areas of student and faculty recruitment, research dollars, and tuition costs. Conference on Industry and Education Collaboration. American Society for Engineering Education.

[3] Duffy, B., and P. Cary. 1999. Dissension in the rankings: US News responds to Slate’s ‘Best colleges’ story. Slate. Accessed October 16, 2002.

[4] Golden, D. 2001. Opening arguments: As law school begins, it’s Columbia vs. NYU. The Wall Street Journal, p. A1 (August 8, 2001).

[5] Gottlieb, B. 1999. Cooking the school books: How US News cheats in picking its best American colleges. Slate. Accessed October 16, 2002.

[6] Holder, G. 2001. Rankings we could live with. Prism. American Society for Engineering Education. 11(1): 76.

[7] Klein, S., and L. Hamilton. 1998. The validity of the U.S. News and World Report ranking of ABA law schools. Accessed October 16, 2002.

[8] McDonough, P., et al. 1997. College rankings: Who uses them and with what impact. Annual Meeting. American Educational Research Association.

[9] Montgomery, D., and G. Runger. 1999. Applied Statistics and Probability for Engineers. 2nd ed. New York, New York: John Wiley & Sons.

[10] Rogers, E., and S. Rogers. 1997. ‘High science’ vs. ‘Just selling magazines’? Bulletin. American Association for Higher Education.

[11] Sanoff, A. 1995. Letter to the editor. The Wall Street Journal, p. A15 (April 27, 1995).

[12] Smith, C. 2001. News you can abuse. University of Chicago Magazine. 94(1):18-25.

[13] Stecklow, S. 1995. Cheat sheets: Colleges inflate SATs and graduation rates in popular guidebooks. The Wall Street Journal, p. A1 (April 5, 1995).

[14] Stecklow, S. 1995. Education: Universities face trouble for enhancing guide data. The Wall Street Journal, p. B1 (October 12, 1995).

[15] U.S. News and World Report’s Best Graduate Schools, 2003 Edition. Engineering rankings methodology.

[16] Vojak, B., J. Carnahan, and R. Price. 2002. The relative contribution of department ranking to college ranking in engineering graduate program rankings conducted by U.S. News and World Report. Annual Conference and Exposition. American Society for Engineering Education.

BRUCE A. VOJAK

Department of General Engineering

University of Illinois at Urbana-Champaign

RAYMOND L. PRICE

Department of General Engineering

University of Illinois at Urbana-Champaign

JAMES V. CARNAHAN

Department of General Engineering

University of Illinois at Urbana-Champaign

AUTHOR BIOGRAPHIES

Bruce A. Vojak is Associate Dean for External Affairs in the College of Engineering and Adjunct Professor in the Department of General Engineering at the University of Illinois at Urbana-Champaign. After receiving a Ph.D. from that institution in 1981 he held positions at MIT Lincoln Laboratory, Amoco, and Motorola. Prior to joining the University in 1999 he was Director of Advanced Technology for Motorola’s frequency generation products business. He also holds an MBA from the University of Chicago.

Address: College of Engineering, University of Illinois at Urbana-Champaign, 306B Engineering Hall, 1308 West Green Street, Urbana, IL 61801; telephone: 217-333-6057; e-mail: bvojak@uiuc.edu.

Raymond L. Price is a Professor and Severns Chair in Human Behavior in Engineering in the Department of General Engineering at the University of Illinois at Urbana-Champaign. He also serves as the Director of the Technology Entrepreneur Center. After receiving a Ph.D. in Organizational Behavior from Stanford, he held positions at Hewlett-Packard, Boeing, and Allergan, Inc. Before joining the University in 1998 he was Vice President of Human Resources at Allergan.

Address: Department of General Engineering, College of Engineering, University of Illinois at Urbana-Champaign, 104C Transportation Building, 104 South Mathews, Urbana, IL, 61801; telephone: 217-333-4309; e-mail: price1@uiuc.edu.

James V. Carnahan is an Adjunct Professor in the Department of General Engineering at the University of Illinois at Urbana-Champaign. Since 1983 he has taught courses in statistics, simulation and control, and also chaired the industrially funded senior project course for 10 years. After receiving his Ph.D. from Purdue University in 1973, he was on staff at General Motors Research Laboratories and also has served as a consultant for many years.

Address: Department of General Engineering, College of Engineering, University of Illinois at Urbana-Champaign, 20 Transportation Building, 104 South Mathews, Urbana, IL, 61801; telephone: 217-333-9623; e-mail: carnahan@uiuc.edu.

Copyright American Society for Engineering Education Jan 2003