Course assessment plan: A tool for integrated curriculum management

Margaret Bailey, R. Bruce Floersheim, and Stephen J. Ressler

An Educational Brief

ABSTRACT

As we enter the 21st Century in engineering education, a common desire exists to improve curriculum structure, integration and assessment. Much has been written and discussed concerning the process for assessing and/or revising a program curriculum. Studies are beginning to show the positive effects of well-integrated curricula where assessment methods are applied consistently. There has also been much written to support individual course assessment and revision. What is missing in many instances is a credible link between program-level curriculum management and course assessment. At the United States Military Academy (USMA) at West Point, an integrating tool within the academy’s assessment model, called a Course Assessment Plan, has been developed and refined. The course assessment process and the resulting written documentation provide an essential link between a program curriculum and its constituent courses. The plan’s process, content, and an example outcome are the major focus of this paper.

I. INTRODUCTION

Over the last few years, and in no small part thanks to EC2000, engineering programs have become united in their efforts to improve curriculum structure, integration and assessment as engineering education moves forward into the 21st Century. The Accreditation Board for Engineering and Technology (ABET) Engineering Criteria for curricular objectives and content states the following [1]:

I.C.2 “(Curricular) objectives are normally met by a curriculum in which there is a progression in the course work and in which fundamental scientific and other training of the earlier years is applied in later engineering courses.”

I.C.3 “The program must not only meet the specified minimum content but must also show evidence of being an integrated experience aimed at preparing the graduate to function as an engineer.”

Much has been written and discussed in workshops and professional journals concerning a department- or program-level process for assessing and/or revising a program curriculum. This process is typically conducted by a committee of professors from the program, with the goal of assessing the entire program by reviewing individual courses and surveying stakeholders (industry, graduates, and faculty). Often this process is completed as the first step in a major restructuring initiative. Recent examples include the engineering programs at Louisiana Tech University [2] and the University of Michigan [3]. The thread for accomplishing this goal of providing a coherent and relevant engineering education is integration. Commonly, “curricula require students to learn in unconnected pieces, separate courses whose relationship to each other and the engineering process are not explained until late in a baccalaureate education, if ever” [4].

Many programs have traditionally viewed the senior capstone design course as the opportunity to tie all of the disparate engineering science courses together at the end of a four-year program. However, institutions are now realizing they cannot afford to rely solely upon the senior capstone design experience to be the integrator of all previous engineering education. According to Otto and Wood [5], students typically make comments such as the capstone experience is “an isolated experience” or the “tie to analysis courses did not materialize.” As a result, many programs are moving toward the teaching of design throughout the four-year curriculum (see examples at References [6, 7]). Design is now being referred to as both the “cornerstone and capstone” of undergraduate engineering education [8].

Studies are beginning to show the positive effects of well-integrated curricula. Everett, Imbrie, and Morgan [9] describe in detail their efforts to integrate engineering and non-engineering courses to improve engineering curricula. Their longitudinal study, which follows freshman groups entering the College of Engineering at Texas A&M from 1994 through 1997, suggests that not only do student retention rates improve, but knowledge retention improves as well. This is evident in higher post-course exam averages and fewer failures among students participating in the integrated program. Refer to Everett et al. [9] for a detailed explanation of a process for creating an integrated curriculum.

Current professional literature is also replete with articles regarding the classroom assessment process for individual courses (see examples at References [10, 11]), but an overemphasis on classroom-level assessment alone may create other problems. Ernst points out that the “emphasis on course content and on the curriculum as a collection of courses has led to compartmentalization of the learning experience, and away from the integration of learning” [12]. Indeed, while assessment is often touted at the course level, many institutions do not take full advantage of the data provided by the program-level assessment process. Ewell describes the difficulty of integrating assessment at different levels and points to the fact that program designers must make sure the curriculum is in fact being fulfilled by its constituent courses. “Piecemeal changes in individual courses over time … may render the curriculum as a whole unworkable from a student point of view” [13].

Ideally, course-level data would be shared within the department and across departmental boundaries [14]. This would enable all stakeholders in a course to help improve individual courses and the integrated program as a result of the assessment process. Stakeholders in a course could include the department head, professors associated with prerequisite and subsequent courses, laboratory technicians, etc. For example, in a thermodynamics course, the stakeholders could include professors associated with engineering mathematics, physics, heat transfer, and power trains, as well as the technician responsible for the steam and gas turbine laboratories. So while “design” is a useful thread to help tie together some of the courses across an engineering curriculum, the assessment process itself and the content of the constituent courses must serve to maintain an integrated program.

Assessment is most effective when it is ongoing rather than episodic. What is often missing is a credible link between program-level curriculum management and course-level or classroom assessment. There must be a solid mechanism in place to facilitate a department’s assessment and improvement efforts at both the program and course levels. Therefore, assessment methods must be applied consistently and should be part of an integrated program of assessment and feedback to effect positive change or maintain superior performance [14, 15]. This enables programs to review what is happening in constituent courses, so that changes are not driven only by ABET cycles. It also allows programs to view the assessment and improvement cycle as “evolutionary” instead of “revolutionary”.

Over the course of the last five years, the Civil and Mechanical Engineering Department at USMA has developed and refined an integrating tool within the academy’s assessment model called a Course Assessment Plan. This plan includes both a written assessment package and an open, fully collaborative forum for stakeholders at the program and course levels to discuss the course assessment package. The department has found that this Course Assessment Plan provides the crucial link between the program curriculum and the individual courses.

II. PROGRAM PLANNING AND ASSESSMENT PROCESS

The USMA Program Planning and Assessment Model is shown in Figure 1 [16]. Because of the Academy’s link to the military, the model clearly indicates that current and future Army needs motivate the design of the academic program goals. Phase I (referring to Figure 1) occurs at the academy level and involves the articulation of the academy’s learning model. This learning model is dynamically connected in the assessment process through the implementation of periodic academy program reviews.

Phases II and III (referring to Figure 1) occur at the departmental level, where programs are designed and assessed annually based on academy program requirements. The Course Assessment Plan fully supports the USMA vision for curriculum planning and assessment by providing the link between design and assessment in Phases II and III. In addition, the plan captures the results each year from Phases III and IV. The plan includes two components and is used both to improve courses and to maintain course continuity. The final phase included in Figure 1, Phase IV, occurs at the instructor level. Instructors are encouraged to maintain a teaching portfolio that includes frequent self-assessments based on academy and departmental program objectives.

The USMA model shown in Figure 1 maps well to the new ABET assessment model shown in Figure 2, which illustrates an integrated slow-loop and fast-loop cycle of assessment and improvement [17]. The Academic Program Goals and their inherent assessment process correspond to ABET’s slow-loop process. Phases I through IV of the USMA model correspond to ABET’s fast loop, although not all of the phases are addressed annually or on a “fast” track. The Course Assessment Plan captures assessment results from Phases III and IV and provides these results to the program assessors. Each of the 34 courses taught in the department annually produces an assessment package created using a consistent format and data structure. Therefore, program directors and the department head have at their disposal 34 written assessment packages, allowing them to detect trends and identify areas for improvement more clearly.
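Since every package follows the same format and data structure, cross-course comparison can be largely mechanical. As a purely illustrative sketch (the paper does not specify the department’s actual data layout, so the schema, field names, and the flag_declines helper below are all hypothetical), a Python rendering of the idea might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class CoursePackage:
    """One year's assessment package for one course (hypothetical schema)."""
    course_id: str            # e.g., "CE491"
    year: int
    avg_grade: float          # criterion-referenced course average
    avg_prep_minutes: float   # mean student preparation time per lesson
    objective_scores: dict = field(default_factory=dict)  # objective -> mean 0-5 self-assessment

def flag_declines(packages, threshold=0.25):
    """Yield (course, objective, old, new) where a mean self-assessment
    dropped by more than `threshold` between consecutive years."""
    latest = {}
    for pkg in sorted(packages, key=lambda p: (p.course_id, p.year)):
        prev = latest.get(pkg.course_id)
        if prev is not None:
            for obj, score in pkg.objective_scores.items():
                old = prev.objective_scores.get(obj)
                if old is not None and old - score > threshold:
                    yield (pkg.course_id, obj, old, score)
        latest[pkg.course_id] = pkg

# Example using the CE491 objective discussed in Section IV.
pkgs = [
    CoursePackage("CE491", 2000, 2.9, 95.0, {"autocad_plans": 3.89}),
    CoursePackage("CE491", 2001, 3.0, 90.0, {"autocad_plans": 4.25}),
]
print(list(flag_declines(pkgs)))  # [] -- the score improved, so nothing is flagged
```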

III. THE COURSE ASSESSMENT PLAN

The Course Assessment Plan serves many purposes. It offers an opportunity to collect all of the previous year’s assessment data into one package, with narrative that attempts to quantify what the data represents. It also provides narratives, justifications, and impact statements for proposed course changes, as well as assessments of past changes. From a departmental perspective, the plan provides the link between the program curriculum and its constituent courses. The Course Assessment Plan open discussion forums are an opportunity to review each course annually to ensure that it is integrated within the department, supporting the current vision and maintaining its links fore and aft to sequential courses. The department has also found the process to be a means of reducing redundant material and allowing courses to truly build upon each other. The resulting documentation package provides a running history of all courses within the department and assists in creating a foundation of narratives and statistics upon which to base both internal and external program reviews.

Over the past several years, the faculty members of the Department of Civil and Mechanical Engineering have generally found that the benefits associated with creating a Course Assessment Plan surpass the overhead and effort required to produce the report and organize the discussion. McKeachie [18], Csikszentmihalyi [19], and Deci and Ryan [20] all agree that faculty are intrinsically motivated and have limited extrinsic motivation. This is demonstrated clearly within the department with regard to the assessment process. Faculty members discover intrinsic rewards in working with others to collectively improve the curriculum, much as many researchers do in a positive team research environment. As discussed in this paper, the Course Assessment Plan frequently leads to course modifications, which ideally have a positive impact on topical continuity, student performance, and student learning. Faculty members recognize these effects, and the result is an intrinsic reward and general acceptance of the assessment process.

Although developed at a military academy, the Course Assessment Plan is by no means a process or document suited solely to that environment. In civilian universities, many departments may already utilize various components of this process; for example, the Chemical Engineering Department at North Carolina A&T State University has established course committees to improve student learning [21]. The Synthesis Coalition and Columbia University have also presented to the community ideas for capturing assessment data at the program level (see References [22, 23]). However, few have adopted the methodology presented here. The following sections describe the Course Assessment Plan in more detail. As previously mentioned, the assessment plan consists of two components, a written assessment package and an open forum. Although the two components complement each other and are closely related, separate sub-sections are dedicated to each for clarity.

A. Course Assessment Plan Written Package

In order to prepare for the assessment open discussion, each course director creates a draft report that includes a collection of narratives, assessment data, analysis of the data, and proposed course revisions. The course director’s role throughout the academic year is to teach the course as well as administer course-related requirements. For example, a fluid mechanics course director may teach several sections of a twenty-section fluids course along with other instructors. However, the course director also prepares the Course Assessment Plan and coordinates the creation of common course-wide examinations, design projects, quizzes, etc. While preparing the assessment package, the course director gains a better understanding of the course’s overall condition by conducting both qualitative and quantitative assessments.

Included in Figure 3 is a sample outline of a typical Course Assessment Plan package. The first section within the report includes general course information such as the course description (as listed in the Academy’s official course listings) and the most current course syllabus including course objectives, lesson topics, assignments, standard policies, textbook, etc.

The second section of the document is dedicated to qualitative and quantitative assessment. To perform quantitative assessment, the course director obtains numerical statistics from several sources. Data from course-end student feedback is collected each semester using a web-based survey system. This customizable system allows, for example, students to self-assess their mastery of course objectives. A summary of the data available from the surveys is included as Figure 4. This information allows the course director to compare the course feedback to that of other department and academy courses. Course average grades awarded to students over the past five semesters (using a criterion-referenced grading system) are also included in the report, along with the corresponding average incoming grade point averages. This assists the course director in exploring possible trends in student performance. The final source of numerical data available to the course director is the average amount of time spent by each student in preparation for each lesson. Students are asked to complete a time survey each lesson, recording the amount of time they spent working on the course since the last lesson period. This data is tracked for the five previous semesters, and an example is included in Figure 5. Refer to Ressler and Lenox [24] for more details on how time data is used as an assessment tool.
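The time-survey reduction described above lends itself to simple scripting. The sketch below assumes raw responses arrive as (semester, lesson, minutes) records; the survey system’s actual export format is not described in the paper, so the names here are hypothetical:

```python
from collections import defaultdict
from statistics import mean

# Illustrative raw time-survey responses: (semester, lesson_number, minutes_spent).
responses = [
    ("2001-2", 1, 60), ("2001-2", 1, 75), ("2001-2", 2, 45),
    ("2002-1", 1, 50), ("2002-1", 2, 55), ("2002-1", 2, 65),
]

# Average the reported minutes for each (semester, lesson) pair.
buckets = defaultdict(list)
for semester, lesson, minutes in responses:
    buckets[(semester, lesson)].append(minutes)
per_lesson_avg = {key: mean(vals) for key, vals in buckets.items()}

# Roll up to a semester-level average, comparable across the five tracked semesters.
for sem in sorted({s for s, _ in per_lesson_avg}):
    lesson_means = [v for (s, _), v in per_lesson_avg.items() if s == sem]
    print(sem, round(mean(lesson_means), 1), "avg min/lesson")
```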

Referencing both quantitative and qualitative statistics, the course director assesses how effectively the course supports course, department, and Academy objectives, as well as how well students accomplish the course objectives. Special emphasis is placed on assessing recent course changes and, where possible, evaluating their impact. An example of this assessment is shown in matrix form in Figure 6.
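Although Figure 6 is not reproduced here, a support matrix of this kind can be sketched as a simple lookup keyed by course and program objectives. The objectives, ratings, and gap check below are illustrative assumptions, not the department’s actual matrix:

```python
# Hypothetical support matrix in the spirit of Figure 6: rows are course
# objectives, columns are program objectives, and each cell holds the
# course director's judgment of how strongly the course objective
# supports the program objective.
course_objs = ["Analyze vapor power cycles", "Use modern design software"]
program_objs = ["Engineering thought process", "Modern engineering tools"]

support = {
    ("Analyze vapor power cycles", "Engineering thought process"): "strong",
    ("Use modern design software", "Modern engineering tools"): "strong",
    ("Use modern design software", "Engineering thought process"): "partial",
}

# Flag program objectives with no strong support from any course objective --
# the kind of gap the open-forum discussion is meant to surface.
for p_obj in program_objs:
    if not any(support.get((c_obj, p_obj)) == "strong" for c_obj in course_objs):
        print(f"Gap: no course objective strongly supports '{p_obj}'")
```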

The final section of the written document is dedicated to future course recommendations. This section includes all proposed course changes as well as the justifications and impact statements associated with the changes. Suggested course changes may be due to a change in curriculum; change in current technology, text or teaching techniques; or feedback via course assessment.

As discussed above, a great deal of information is included in this written package. The time required to develop the package for an existing course is approximately one and a half days per year. This is minimized by the existence of a well-maintained course continuity file and a streamlined conversion technique from raw data to useful graphs. For new or significantly revised courses, the time commitment may increase by a factor of two or more for the initial year and then decrease in subsequent years. In addition, improvements to the Course Assessment Plan are ongoing. For example, by assessing this assessment process, the faculty have recognized the need to establish a more consistent methodology for textbook selection.

B. Course Assessment Plan Open-forum Discussion

Course directors develop their written Course Assessment Plan packages each spring in preparation for an oral discussion in which the written assessment is discussed and peer-reviewed by the department head and other interested individuals at various levels of curriculum management. In addition to those who manage the curriculum, all interested stakeholders are invited to attend the forum to ensure that their interests are discussed. The department’s visiting professors, who teach at USMA for a year while on sabbatical from their home universities, are also invited to attend and participate. Their input can be of great value because it offers a fresh perspective on the assessment process, the detailed course assessment, and the overall program.

During the Course Assessment Plan discussion, the course director briefs the highlights of the report. The written package plays three critical roles at this point. First, the draft package is distributed openly before the discussion to serve as a read-ahead. Second, during the open-forum discussion, the package forms the basis of the discussion’s agenda; the ensuing open discussion assists the course director with any necessary final revisions. Figure 7 shows an outline for a typical discussion conducted during the spring of 2002. Third, after the discussion, the written package is revised as necessary to address any changes arising from the discussion. The final, revised written assessment is forwarded to the department head for signature and then maintained for a period of at least five years by the course director. It also becomes a major component of the ensuing overall program assessment.

IV. RESULTS

The Course Assessment Plan discussed in the previous sections typically results in three or four proposed changes to each course per year. These proposed changes vary greatly in scope. The following paragraphs present representative examples of course changes that have resulted from the assessment process.

A recent change in a civil engineering course illustrates the utility of the Course Assessment Plan and its impact at both the course and program levels. The Department of Civil and Mechanical Engineering comprises two divisions organized along discipline lines. Both divisions conduct course-end surveys for each course taught during a given semester, as well as detailed surveys of graduating seniors. A few years ago, the professor associated with the civil engineering capstone design course found that seniors were experiencing increased difficulty visualizing three-dimensional structures during the design and modeling phases of design. Upon graduation, on their final program survey, these same seniors were asked to agree or disagree with the following statement: “I can create simple floor plans and framing plans using AutoCAD.” The response choices were on a scale of 0 to 5, ranging from strongly disagree to strongly agree. The overall response to this question was 3.79 out of 5. In addition, 15 percent of the students surveyed registered either a disagree or strongly disagree response. A similar question was asked of the students completing the course that precedes the capstone course, CE491 Advanced Structural Design. On an end-of-semester survey, the average student response was 3.89/5.0 to the following statement: “I can create simple architectural floor plans and structural framing plans using AutoCAD.”
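The survey statistics quoted above (a mean of 3.79/5.0 and 15 percent disagreement) are straightforward reductions of the 0-to-5 response scale. For clarity, here is a minimal sketch of that computation on made-up responses; the actual response data is not published in the paper:

```python
from statistics import mean

# Illustrative 0-5 Likert responses (0 = strongly disagree ... 5 = strongly agree).
responses = [5, 4, 4, 3, 5, 1, 4, 5, 2, 4, 5, 3, 4, 0, 5, 4, 4, 3, 5, 4]

avg = mean(responses)
# Count responses of 0 or 1 as "strongly disagree" or "disagree".
disagree_frac = sum(1 for r in responses if r <= 1) / len(responses)

print(f"mean response: {avg:.2f}/5.00")                       # 3.70/5.00 for this sample
print(f"disagree or strongly disagree: {disagree_frac:.0%}")  # 10% for this sample
```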

Each of these observations agrees with increasing anecdotal evidence from across engineering disciplines that newly minted engineers have difficulty with two- and three-dimensional visualization and modeling. This situation had a direct negative impact on two of the outcome objectives for our civil engineering program:

* Develop graduates who can apply the engineering thought process to design components and systems.

* Develop graduates who can use modern engineering tools to solve problems.

Therefore, the deficiency in the visualization skills of graduating seniors was recognized as a potential shortcoming within the program. Armed with the capstone design professor’s assessment, the graduating seniors’ survey feedback, and the course-end feedback from CE491, the course director of CE491 and other stakeholders of the course discussed appropriate means for addressing the two- and three-dimensional visualization issue during the Course Assessment Plan presentation for CE491. The course director proposed, and all concurred, increasing the instruction of three-dimensional visualization and modeling in this course by expanding the topic’s coverage. In addition, a more rigorous AutoCAD problem was incorporated into CE491 and tied to a project that the students were working on in CE404 Design of Steel Structures. In this problem, the students modeled the first floor from provided plans and then developed the design and subsequent layout of the second-floor plan [25].

Having now completed the loop from assessment to analysis/discussion and then action, the CE491 course director incorporated the changes and re-evaluated their impact through targeted questions and a critical look at the performance of the following year’s graduates in two- and three-dimensional visualization and modeling. The preliminary results of these changes were presented during the course’s next Course Assessment Plan presentation. When the CE491 students were asked the same question regarding their ability to create a simple architectural floor plan and structural framing plan using AutoCAD, the average response increased to 4.25/5.00, compared with 3.89/5.00 the previous semester. In addition, the response to a question on the students’ ability to create and analyze two- and three-dimensional structural models of trusses, beams, and frames increased from 4.11/5.00 to 4.44/5.00. These results are preliminary, and the fact that the CE491 course instructors changed over the same period could contribute to the noted increases. The graduating-student feedback for these students is not yet available.

The Course Assessment Plan report and presentation are crucial in our interdisciplinary, team-taught courses. For example, ME471/EE471 Dynamic Modeling and Control is typically taught by a team of two professors, one each from the electrical and mechanical engineering programs. Each year, the course director creates the Course Assessment Plan report in close cooperation with the other instructor. The Course Assessment Plan presentation occurs at one meeting with representatives from both departments present. This allows interested stakeholders from each department to gain a clear understanding of course details such as prerequisites, objectives, laboratories, design projects, training aids, etc. The results of the course assessment process are also shared, and proposed course modifications are discussed.

Another department course, ME472 Energy Conversion Systems, has significantly evolved over the past five years and its progression is well documented in Course Assessment Plan reports. The course changes were made due to course assessment as well as technological advances. In 1996, the course began including lessons on exergetic efficiency, hydroelectric power generation, absorption refrigeration, and air conditioning while reducing lessons on coal combustion. These changes were presented and discussed during the Course Assessment Plan presentation. In 2000, again because of course assessment, exergetic efficiency was better integrated throughout the course. In 2001, due to technological developments, current energy policy, military needs, and assessment outcomes, the course again evolved into its current form with a series of fourteen new lessons on various forms of direct energy conversion [26].

Besides the benefits that the Course Assessment Plan provides for course continuity, information sharing, and interdisciplinary course coordination, the final document also serves as a foundation for ABET reaccreditation visit preparation. The periodic ABET review of our department is greatly facilitated by the existence of these documents for each course. Similar positive ABET-related results have been documented at civilian institutions that have successfully incorporated course-level committees to review assessment and propose recommendations for change [27, 28]. As discussed in this paper, the final Course Assessment Plan report clearly shows how a course has evolved since the last ABET visit. In addition, the original motivation for course modifications is discussed within the report and, when measurable, the resulting data associated with the changes is included.

V. SUMMARY AND CONCLUSIONS

The Course Assessment Plan can serve as an effective link between course assessment and program curriculum management. In this paper, the components of the Course Assessment Plan are presented as well as a few of the realized results of the process, which include:

* course continuity;

* information gathering and sharing;

* interdisciplinary course coordination; and

* foundation for ABET reaccreditation visit preparation.

Although developed at a military academy, the Course Assessment Plan is not a process or document suited solely to a military environment. In civilian universities, many departments already utilize various components of this process; however, few have adopted the detailed methodology presented here. Since both military and civilian institutions fall under the purview of the same engineering accreditation process, the Course Assessment Plan has the potential to benefit diverse engineering programs.

REFERENCES

[1] Accreditation Board for Engineering and Technology. Criteria for Accrediting Engineering Programs. Revised 18 March 2000. (27 July 2000).

[2] Benedict, B.A., S.A. Napper, and L.K. Guice. 2000. Restructuring for strategic outcomes. Journal of Engineering Education. 89(2): 237-246.

[3] Tryggvason, Gretar, et al. 2001. The new mechanical engineering curriculum at the University of Michigan. Journal of Engineering Education. 90(3): 437-444.

[4] Bordogna, Joseph, Eli Fromm, and Edward W. Ernst. 1993. Engineering education: Innovation through integration. Journal of Engineering Education. 82(1): 3-8.

[5] Otto, Kevin N., and Kristin L. Wood. 2000. Designing the design course sequence. Mechanical Engineering Design (supplement to the Mechanical Engineering Magazine).

[6] Carroll, Douglas R. 1997. Integrating design into the sophomore and junior level mechanics courses. Journal of Engineering Education. 86(3): 227-231.

[7] Wilczynski, V., and S.M. Douglas. 1995. Integrating design across the engineering curriculum: A report from the trenches. Journal of Engineering Education. 84(3): 179-183.

[8] Sheppard, Sheri D. 2000. Design as cornerstone and capstone. Mechanical Engineering Design, (supplement to the Mechanical Engineering Magazine).

[9] Everett, Louis J., P.K. Imbrie, and Jim Morgan. 2000. Integrated curricula: Purpose and design. Journal of Engineering Education. 89(2): 167-175.

[10] Shaeiwitz, Joseph A. 1998. Classroom assessment. Journal of Engineering Education. 87(2): 179-183.

[11] Sheppard, Sheri D., Michelle Johnson, and Larry Leifer. 1998. A model for peer and student involvement in formative course assessment. Journal of Engineering Education. 87(3): 349-354.

[12] Ernst, Edward. 1999. The editor’s page: The three I’s. Journal of Engineering Education. 88(1):1-2.

[13] Ewell, Peter T. 1998. National trends in assessing student learning. Journal of Engineering Education. 87(2): 111.

[14] McGourty, Jack, Catherine Sebastian, and William Stewart. 1998. Developing a comprehensive assessment program for engineering education. Journal of Engineering Education. 87(4): 355-361.

[15] Angelo, Thomas A., and Patricia K. Cross. 1993. Classroom Assessment Techniques: A Handbook for College Teachers. 2nd Ed. Jossey-Bass, San Francisco.

[16] Forsythe, George, and Bruce Keith. 1998. Curriculum design and academic assessment: The engineering thought process. Best Assessment Processes in Engineering Education: A Working Symposium. Rose-Hulman Institute of Technology, Indiana.

[17] ABET Two-Loop Process.

[18] McKeachie, W.J. 1982. The rewards of teaching. New Directions for Teaching and Learning: Motivating Professors to Teach Effectively. 10: 7-13. Jossey-Bass, San Francisco.

[19] Csikszentmihalyi, M. 1982. Intrinsic motivation and effective teaching: A flow analysis. New Directions for Teaching and Learning: Motivating Professors to Teach Effectively. 10: 15-26. Jossey-Bass, San Francisco.

[20] Deci, E.L., and R.M. Ryan. 1982. Intrinsic motivation to teach: Possibilities and obstacles in our colleges and universities. New Directions for Teaching and Learning: Motivating Professors to Teach Effectively. 10: 27-35. Jossey-Bass, San Francisco.

[21] King, F., and K. Schimmel. 2001. Using course committees to engage faculty in the assessment of student outcomes. Best Assessment Processes IV in Engineering Education: A Working Symposium. Rose-Hulman Institute of Technology, Indiana.

[22] McMartin, Flora, Eric Van Duzer, and Alice Agogino. 1998. Bridging diverse institutions, multiple engineering departments, and industry: A case-study in assessment planning. Journal of Engineering Education. 87(2): 157-165.

[23] The Fu Foundation School of Engineering and Applied Science Outcome Assessment Program.

[24] Ressler, S.J., and T.A. Lenox. 1996. The time survey: A course development tool that works! Proceedings, 1996 Annual Conference of the American Society for Engineering Education (Educational Research and Methods Division), Washington, D.C.

[25] Ressler, S. J. 1999. AY98-99 Civil Engineering Program Assessment.

[26] Bailey, M.B., and A.O. Arnas. 2002. The evolution of an energy conversion course at The United States Military Academy. Proceedings, 2002 Annual Conference of the American Society for Engineering Education (Energy Conversions Division), Montreal, Canada.

[27] Lush, G.B., and C.K. Della-Piana. 2001. Step-by-step guide to course assessment specifically to address ABET’s EC 2000. Best Assessment Processes IV in Engineering Education: A Working Symposium. Rose-Hulman Institute of Technology, Indiana.

[28] Borland, K., and R. Marley. 2001. A conceptual and strategic process for engineering program assessment: A case study, Montana State University. Best Assessment Processes IV in Engineering Education: A Working Symposium. Rose-Hulman Institute of Technology, Indiana.

MARGARET BAILEY

Department of Civil and Mechanical Engineering

United States Military Academy

R. BRUCE FLOERSHEIM

Department of Civil and Mechanical Engineering

United States Military Academy

STEPHEN J. RESSLER

Department of Civil and Mechanical Engineering

United States Military Academy

AUTHOR BIOGRAPHIES

Margaret B. Bailey is an Assistant Professor in the Department of Civil and Mechanical Engineering at the United States Military Academy, West Point, New York. Dr. Bailey is a registered Professional Mechanical Engineer in New Jersey and is actively involved in energy-related research with both industry and U.S. Army agencies. Dr. Bailey received a B.S. degree in Architectural Engineering from the Pennsylvania State University in 1988 and a Ph.D. from the Department of Civil, Environmental, and Architectural Engineering at the University of Colorado at Boulder in 1998. She currently teaches Thermodynamics, Controls, Energy Conversion Systems, and Mechanical Engineering Design at West Point.

Address: Department of Civil and Mechanical Engineering, United States Military Academy, West Point, NY, 10996; telephone: (845) 938-4105; fax: (845) 938-5522; e-mail: Margaret.Bailey@usma.edu.

Major Bruce Floersheim was an Assistant Professor in the Department of Civil and Mechanical Engineering at the United States Military Academy, West Point from 1999-2002. He is a graduate of West Point and has served as a commissioned officer in the U.S. Army since 1989. He received his B.S. degree in Mechanical Engineering from West Point in 1989 and a M.S. degree in Mechanical Engineering from Stanford University in 1999. He is a registered Professional Engineer in the Commonwealth of Virginia and is currently attending the US Army Command and General Staff College at FT Leavenworth, KS.

Address: 309 Williams Rd, FT Leavenworth, KS, 66027; telephone: (913) 682-7383; e-mail: bruce.floersheim@us.army.mil.

Colonel Stephen Ressler, P.E., is Professor and Deputy Head of the Department of Civil and Mechanical Engineering at the U.S. Military Academy, West Point. He received a Bachelor of Science degree from USMA and M.S. and Ph.D. degrees from Lehigh University. As an officer in the U.S. Army Corps of Engineers, he has served in a variety of military engineering assignments around the world. He has served for ten years on the USMA faculty, teaching courses in statics and dynamics, mechanics of materials, structural analysis, design of steel structures, reinforced concrete design, design of structural systems, and civil engineering professional practice. He currently serves as Chair of the ASEE Civil Engineering Division and as a member of the American Society of Civil Engineers Committee on Curricula and Accreditation. He is a recipient of the 1997 Dow Outstanding New Faculty Award and the 2000 Distinguished Educator Award from the Middle Atlantic Section of ASEE.

Address: Department of Civil and Mechanical Engineering, United States Military Academy, West Point, NY, 10996; telephone: (845) 938-2478; fax: (845) 938-5522; e-mail: Stephen.ressler@usma.edu.

Copyright American Society for Engineering Education Oct 2002
