Design and evaluation of problem solving courseware modules for mechanics of materials

Paul S. Steif

ABSTRACT

Problem solving courseware modules have been developed for students of mechanics of materials. The modules offer students: a better grasp of fundamental principles, an intuitive sense of the meaning of key quantities, and fluency in using relations to solve problems. Students use the modules independently and submit electronic log files to instructors, who can monitor their progress. Field-testing of the modules was conducted over a two-year period at three distinct educational institutions, with 318 students. Courseware users performed as well as or better than their non-using peers on nearly all objective measures of learning, with differences being statistically significant in some cases. Eighty-eight percent of users found the courseware helpful for learning. After completing one required module, 48 percent of students continued using the courseware voluntarily, as a reference guide to theory, as a coach for mastering and reviewing basic skills, and as a self-diagnostic tool to help with regular homework assignments.

I. INTRODUCTION

Computers are increasingly touted for their potential benefits to engineering education. Within statics, dynamics, and mechanics of materials, there have been many efforts to develop courseware [1-10]. Some of these efforts are relatively comprehensive, aspiring to provide complete teaching systems (e.g., for use in distance learning). Other efforts are more focused. For example, within mechanics of materials, shear force and bending moment diagrams, and Mohr’s circle are popular topics [4-8].

There is not yet a generally accepted approach to categorizing courseware. Laurillard [11] and Flori [12] have begun to classify types of educational technology. Laurillard singles out the tutorial simulation as coming closest to covering what she considers to be the range of essential learning activities. Tutorial simulations are adaptive and interactive. They adapt to the evolving needs of the student, roughly as an instructor would. Through interactions with the user, tutorial simulations offer students intrinsic feedback: by simulation, the computer can display the consequences of the students’ actions.

The courseware discussed in this paper falls into Laurillard’s category of tutorial simulation. It was designed to address sharply defined, limited pedagogical goals, motivated by a number of significant drawbacks we perceived in the traditional homework experience in mechanics.

II. DRAWBACKS OF TRADITIONAL LEARNING OF MECHANICS

In mechanics courses, students must learn to apply fundamental principles to help in understanding, problem solving, and design. They must learn to apply principles in new and unfamiliar situations. To do so, students must both comprehend the fundamentals, and perceive their applicability to new situations. Textbooks offer a variety of problem types within each topic area. Instructors typically take advantage of this and assign problems with a variety of configurations. In one sense this is good: students learn to perceive the applicability of fundamentals to various situations. However, with this variety comes lack of repetition. In contrast to experts, novices often misclassify problems based on superficial features, rather than based on underlying principles [13-14]. By solving a number of similar, but non-identical problems, students can more easily elucidate the underlying fundamentals, rather than memorize an independent method for solving each type of problem [15]. Focusing their effort on a narrow range of problems, leading to expertise and fluency with those problems, can also indicate to students the depth of understanding that is possible and desirable.

Another challenge faced by instructors is the varied ability of students. A student may need more practice with one idea, less with another. No two students are alike. While each student should ideally be given an assignment which reflects that student’s needs, in practice that is difficult. An instructor cannot know in advance which ideas will come easily to a student and which will require a struggle.

It is also generally true that the feedback that students receive on their homework is relatively ineffective. Feedback usually comes too late; solutions are often made available to students after the week’s homework is complete. By then, however, students are often focused on another course, or on the next week’s homework. Feedback is also usually nonspecific; whether it is the corrections on their own homework or the posted solutions, feedback often fails to zero in on the ideas which are causing the student trouble. Rarely is the student given feedback that pinpoints the difficulty and then offered the chance to solve similar problems with the new knowledge in hand.

III. ADDRESSING DRAWBACKS THROUGH COURSEWARE

In response to the above critique of the homework experience in mechanics courses, a series of problem-solving courseware modules was developed. Each module addresses a key topic in mechanics of materials and focuses on a very limited range of physical situations. However, each physical situation is treated in far greater depth than is usually possible in textbook homework assignments.

The three main goals of the courseware were to help students to:

1) better grasp the fundamental principles underlying various problem types;

2) gain an understanding of what the numbers and equations really mean; and

3) develop fluency in manipulating numbers and equations, as required for solving problems.

To achieve these goals, the courseware was designed to capitalize on the computer’s abilities to compute, to display graphics, to interact with the user, and to adapt based upon past activities. Furthermore, because the computer automates problem generation and response to student input, it is feasible to break problems down into a relatively large number of sub-problems or to offer sequences illustrating larger procedures, both of which can affect learning positively [16]. The modules, therefore:

* offer a large number of small problems, many focusing on individual concepts, prior to offering problems that necessitate multiple concepts;

* break procedures into discrete steps, leading the student through the elements of a procedure as a prelude to permitting the student to work independently,

* offer incremental feedback, in which students are initially offered very general hints in response to wrong answers, followed by more pointed feedback;

* pose reverse engineering problems, in which students must load the body to achieve a given result, with the computer displaying immediately the implications of various proposed answers;

* utilize graphics and animation to clarify hard-to-visualize phenomena; and

* perform tedious calculations, letting the student focus on substantive issues.
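To make the incremental-feedback idea concrete, the sketch below shows one way hint escalation and first-attempt credit could be implemented. It is a minimal illustration under our own assumptions; the function name, hint texts, and tolerance are hypothetical, not taken from the StressAlyzer source.

```python
# Minimal sketch of incremental feedback: each wrong answer elicits a
# progressively more pointed hint. All names and hint texts here are
# illustrative assumptions, not StressAlyzer code.

HINTS = [
    "Not quite -- check your work and try again.",           # very general
    "Check the sign convention for the applied forces.",     # more pointed
    "Relate the internal force to the external forces on one side of the cut.",
]

def pose_problem(target_answer, get_student_answer, tolerance=1e-6):
    """Pose one numeric problem, escalating hints after wrong answers.

    Returns True only if the first attempt was correct, which is what
    earns 'credit' in the log file."""
    attempts = 0
    while True:
        if abs(get_student_answer() - target_answer) <= tolerance:
            return attempts == 0
        print(HINTS[min(attempts, len(HINTS) - 1)])  # next, more specific hint
        attempts += 1
```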

To realize the potential of the computer, we believe that courseware should go beyond posing electronic versions of textbook problems; it should transform the problem solving experience. The configurations addressed by the courseware here are, admittedly, similar to those in textbooks. However, as seen below, the tasks that the courseware demands of the student are distinct from those demanded by textbooks and, in some instances, are infeasible, if not impossible, with pencil and paper.

Thus, there are some important differences between the approach taken here and that of current software in this field [9-10]. Existing software is intended to give students additional experience with problems that are typical of textbooks, and to offer a computer-based review of the theory. A contrasting approach is taken here. Most significantly, the courseware described here poses problems in which individual principles and quantities are singled out and procedures are treated in a step-by-step fashion. It also provides students more fine-grained feedback on problems, to facilitate discovery of the individual gaps in understanding.

IV. GENERAL DESCRIPTION OF COURSEWARE

The program entitled StressAlyzer [17] consists of six independent modules, each addressing a key topic in mechanics of materials: axial loading, torsional loading, shear force and bending moment diagrams, beam deflections using the method of superposition, stress transformations, and applying simple calculations of stress under axial, torsional, and flexural loading to three-dimensional situations. Each module contains from 20 to 40 generic “problems.” While some problems are actually brief reviews of theory, most are genuine problems that request a response from students. All such problems involve geometry and/or parameters that are varied randomly. This enables students to solve an unlimited number of problems of the same generic type.
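Random variation of parameters is what makes this practice model work: each generic problem is a template whose numbers (and, where applicable, geometry) are redrawn for every instance. The sketch below illustrates the idea for a single hypothetical problem type; the names, parameter ranges, and data layout are our assumptions, not StressAlyzer internals.

```python
import random

# Illustrative sketch of a randomized generic problem type. Each call
# yields a fresh, similar-but-not-identical instance of the same problem.

def generate_axial_problem(rng=random):
    """Return one instance of a generic 'stress in a circular rod' problem."""
    diameter_mm = rng.choice([20, 25, 30, 40])     # randomized geometry
    force_kN = rng.randint(2, 20)                  # randomized load
    area_mm2 = 3.14159 / 4.0 * diameter_mm ** 2
    stress_MPa = force_kN * 1e3 / area_mm2         # N / mm^2 = MPa
    return {"diameter_mm": diameter_mm,
            "force_kN": force_kN,
            "solution_MPa": stress_MPa}

problem = generate_axial_problem()
```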

Problems are numbered, with the sequence very carefully designed to build up concepts gradually. Thus, it is preferable to do problems in sequence. Nevertheless, as with other types of homework assignments, instructors may require the completion of any subset of the problems, and students can choose to do problems in any order or to skip problems. Initial problems in each module are intentionally very simple, to bring all students on board. However, the courseware allows students who can solve the problems immediately to get “credit” and move on quickly, while students who need more practice receive it through solving multiple problems of similar type.

When the user starts a module, he or she is prompted for his or her name, which is permanently attached to a log file. When a problem is solved correctly on the first attempt, this is recorded in the log file. At any point the user can see which problems have been recorded as solved. Should the user wish to stop working on the program, work can be resumed at a future time by connecting to the same log file.

Log files are text files, which can be emailed to the instructor, or handed in on diskette, to be graded electronically by running a decoding program. The decoding program outputs data in tabular form (for insertion into a spreadsheet), listing which problems have been completed for each student (the name associated with the log file). Given the encoding of the log files, manipulating a log file renders it unreadable; hence, it is essentially impossible for students to copy each other’s work.
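The encoding scheme itself is not detailed here, but the requirement it serves (any hand-edited log becomes unreadable to the grader) can be met in a standard way by appending a keyed digest to each record. The sketch below is one such scheme, offered purely as an assumption about how this could be done, not as the scheme StressAlyzer actually uses; the key, record format, and function names are hypothetical.

```python
import hashlib
import hmac

SECRET = b"instructor-held key"   # hypothetical key baked into the decoder

def encode_record(student, problem_id, solved_first_try):
    """Append a keyed digest so that any later edit is detectable."""
    line = f"{student}|{problem_id}|{int(solved_first_try)}"
    tag = hmac.new(SECRET, line.encode(), hashlib.sha256).hexdigest()
    return f"{line}|{tag}"

def decode_record(record):
    """Return the record's fields, or None if the line has been altered."""
    line, _, tag = record.rpartition("|")
    expected = hmac.new(SECRET, line.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return None                       # tampered or corrupted record
    student, problem_id, solved = line.split("|")
    return student, problem_id, bool(int(solved))
```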

The courseware has been designed to maximize accessibility. Generally, only a few minutes of time are required to learn the user interface, with little help from faculty required. Moreover, many elements of the interface are common across modules. StressAlyzer can be run on virtually any personal computer running a Windows operating system.

The courseware was developed by a single developer/programmer discontinuously over a period of approximately five years, requiring on average approximately eight hours per week of the developer’s time. Much of that time was spent developing early prototypes and gradually modifying them based on student input. This courseware is now commercially available, both as a stand-alone CD [18] and bundled with Mechanics of Materials textbooks offered by the same publisher.

V. DETAILED DESCRIPTION OF SELECTED PROBLEMS FROM COURSEWARE

It is impossible to give details on the full range of problems that can be found in StressAlyzer. To give the reader a flavor of the program, however, we offer an in-depth description of the tasks and user-computer interactions associated with three problems, each from a different module. In all cases, there have been simpler problems, or reviews of theory, which lead up to the problems described here. (All sign and unit conventions were defined earlier in the module.)

Figure 1 is a screen capture of a problem in which the student is shown a rod, having portions with distinct diameters and materials (with properties listed). The rod is loaded in an unknown way, but the gray line in the plot gives the variation in axial normal stress along the rod. The student must apply forces to the rod at discrete points to produce the given variation in axial stress. To apply a force, the student clicks on the rod, and identifies the point at which the force is applied and the magnitude and direction of the force. There is a simple calculator, which the student may use if so desired. (Students may also use their own calculator or do the math in their head.) When the student clicks on the Check button, the program will calculate the stress distribution associated with the applied loads and plot it on the same set of axes.

Figure 2 is a screen capture showing the same problem after the student has chosen a set of loads and clicked on the Check button. The resulting stresses are plotted as the black line. In this case, the student has made two types of mistakes: the signs of the forces should be reversed to produce tension rather than compression, and the intermediate force has not been related correctly to the internal force, and hence to the stress, in the intermediate segment. For such problems, the student receives intrinsic feedback; the results of the student’s actions are displayed, offering an opportunity to modify the loads and get better agreement. The student can change the loads as often as desired; clicking the Check button will replot the results. When the program determines that the student’s distribution and the desired distribution are sufficiently close, the student is notified and can do another problem. If the agreement is sufficiently close on the first try (the first click of the Check button), the student gets credit for completing the problem, which is reflected in the log file.
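The computation behind the Check button is elementary statics: at any cross-section, the internal axial force equals the signed sum of the applied forces on one side of a cut, and the normal stress is that force divided by the local cross-sectional area. A minimal sketch follows; the names, and the tension-positive, rightward-positive sign convention, are choices we make for illustration.

```python
def segment_stresses(forces, boundaries, areas):
    """Compute the axial normal stress in each segment of a stepped rod.

    forces: list of (position, magnitude) point loads (positive = +x).
    boundaries: segment endpoints, e.g. [0.0, 0.5, 1.0] for two segments.
    areas: cross-sectional area of each segment, in the same order.
    """
    stresses = []
    segments = zip(zip(boundaries, boundaries[1:]), areas)
    for (x_left, x_right), area in segments:
        x_mid = 0.5 * (x_left + x_right)
        # Equilibrium of the cut free body to the left of x_mid:
        # the internal force (tension positive) balances the applied forces.
        internal = -sum(f for x, f in forces if x <= x_mid)
        stresses.append(internal / area)
    return stresses

# Example: 10 kN pulling outward at both ends of a two-segment rod.
print(segment_stresses([(0.0, -10e3), (1.0, 10e3)],
                       [0.0, 0.5, 1.0], [4e-4, 2e-4]))
# prints [25000000.0, 50000000.0], i.e., 25 and 50 MPa of tension
```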

In the problem depicted in Figure 3, the student is shown a simply supported beam with a single force applied to it. The student is led step-by-step through the drawing of the shear force and bending moment diagrams. This is done through a series of questions, beginning, as shown, with the value of the shear force at the left end. The major relationships to be used are shown in brief form at the bottom of the screen. (These relationships are derived and reviewed earlier in the module.) The response to each question involves entering a number or choosing from several choices. For example, at another step the student is asked how the shear force varies in the portion of the beam from x = 0 to x = 7, with the choices of constant, linear, and parabolic.

In Figure 4, the student has progressed in drawing the shear force diagram. After each correct answer, the associated portion of the diagram is drawn. In addition, the student can review the list of decisions made thus far in the problem under “Steps so far.” In response to an incorrect answer, a series of hints is given that offer increasingly pointed feedback. One such hint is shown. In addition, the relationship that is at issue in the currently posed question is highlighted at the bottom (on the computer screen it is shown as red, whereas the other relations remain in black). When the student eventually gets the answer to this question correct, a problem of similar type is regenerated, and is solved correctly up to just before the current question. The student has an opportunity to answer a similar question correctly without hints, and can then move on to the next step in drawing the diagram.
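For reference, the relations the module walks the student through are the standard ones for this configuration. For a simply supported beam of span L carrying a single downward point load P at x = a, the reactions are R_left = P(L − a)/L and R_right = Pa/L, and the diagrams follow directly, as in this short sketch (the variable names are ours):

```python
def shear_and_moment(P, L, a, x):
    """Shear V and bending moment M at position x for a simply supported
    beam of span L with a downward point load P applied at x = a."""
    R_left = P * (L - a) / L          # left support reaction (upward)
    if x < a:
        V = R_left                    # constant shear left of the load
        M = R_left * x                # moment varies linearly from zero
    else:
        V = R_left - P                # shear drops by P at the load point
        M = R_left * x - P * (x - a)  # moment still linear, with new slope
    return V, M
```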

In the problem depicted in Figure 5, the student is shown an element with stresses acting upon it. The student is asked for the directions of the shear and normal stresses acting on an inclined surface of a particular orientation, which is also shown. To solve this problem, one must identify the values of the x-y stress components σ_x, σ_y, and τ_xy from the diagram, use the transformation formulas to determine σ(θ) and τ(θ), and then choose one of the four options based on an interpretation of the signs of σ(θ) and τ(θ). When ready, the student clicks the Check button. The program has eliminated the tedious part of evaluating the stress transformation formulas: once the student enters the correct values for σ_x, σ_y, τ_xy, and the angle θ, the program automatically computes σ(θ) and τ(θ) using the transformation formulas. The student then must interpret the signs of σ(θ) and τ(θ) correctly.
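The transformation formulas in question are the standard plane-stress relations, σ(θ) = (σ_x + σ_y)/2 + ((σ_x − σ_y)/2)cos 2θ + τ_xy sin 2θ and τ(θ) = −((σ_x − σ_y)/2)sin 2θ + τ_xy cos 2θ. A minimal sketch of the evaluation the program performs (the function name and argument order are illustrative):

```python
from math import cos, sin, radians

def transform(sx, sy, txy, theta_deg):
    """Evaluate the standard plane-stress transformation formulas for the
    normal and shear stress on a face inclined at angle theta."""
    t2 = 2.0 * radians(theta_deg)
    sigma = (sx + sy) / 2 + (sx - sy) / 2 * cos(t2) + txy * sin(t2)
    tau = -(sx - sy) / 2 * sin(t2) + txy * cos(t2)
    return sigma, tau

# The student then interprets the signs: e.g., sigma > 0 means the normal
# stress on the inclined face is tensile.
```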

In Figure 6, the student has entered values for σ_x, σ_y, τ_xy, and the angle θ and, based on the computed values of σ(θ) and τ(θ), has chosen one of the four options (indicated in black). The answer is wrong; after making a general statement to that effect (not shown), the program gives a hint as to where the error lies. In this case, the student incorrectly entered a positive value for τ_xy.

VI. TESTING AND EVALUATION PROCEDURES

StressAlyzer courseware modules were field-tested during the 2000-01 and 2001-02 school years, by 318 students, attending seven different engineering courses at the University of Pittsburgh, Butler Community College, and Carnegie Mellon University. The evaluation was designed and executed by an outside evaluator (L.M. Naples) to ensure the propriety and objectivity of processes and findings.

Students in each test class were randomly assigned to one of two groups, so that both groups had similar demographic profiles. Two modules were tested in each class, giving all students one experience as a courseware “user” and one as a “non-user.” At the start of each academic “test-unit” in which a module was to be tested, all students were administered an in-class pre-test to assess prior knowledge. During the test-unit, non-users completed a regular “problem set” while users completed a courseware assignment on the same material. All students received the same classroom instruction. Immediately following the test-unit, all students were given a post-test assessment of learning and a brief post-unit survey. Following the official test period, and prior to the regular course exam covering that material, all students were given the opportunity to use the courseware voluntarily, although no one was required to do so. Finally, all students were given an end-of-semester survey, which they were required to submit to their instructor on the day of their final exam.

During the first year of testing, the surveys were used to identify remaining bugs and discern ways to change the courseware to better meet the learning needs of students. The survey data was supplemented by in-depth interviews with selected students. During the second year of testing, the surveys focused on obtaining summative data regarding the quality and utility of the courseware, its impact on students’ study habits, educational outcomes, and confidence levels.

Responses to open-ended survey questions were analyzed for content according to established qualitative methods [19]. Interview transcripts were also coded for content, and analyzed for emergent patterns. Each pattern was described anecdotally, supported by quotations that illustrate the pattern. The number of students who cited a particular idea was taken to indicate the strength of each pattern.

For each of the four modules that were tested, we identified several detailed learning objectives and devised problems that would allow us to separate out performance on each of these objectives. These problems formed the pre/post-tests associated with the module. The objectives, problems, and quantitative results of the pre/post-tests pertaining to the module on drawing shear force and bending moment diagrams are presented below.

The pre/post-tests and post-unit surveys were collated by student, and all student names were removed prior to grading and analysis. Test scores for each educational objective of each test, as well as all quantitative survey responses, were entered into a spreadsheet for analysis by the evaluator. Statistical tests for independence (t-tests or chi-squared tests, as appropriate) were used to determine whether any statistically significant differences in learning (test score delta = post-test score minus pre-test score) or confidence levels existed between user and non-user groups. P-values from these tests are reported for all cases in which statistically significant differences were evident.
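As a sketch of this analysis, with made-up numbers standing in for the actual scores, the core comparison reduces to a two-sample t-test on the score deltas:

```python
from scipy import stats

# Hypothetical score gains (post-test minus pre-test); the real data are
# summarized in Table 2 below.
user_deltas = [12, 8, 15, 10, 9, 14]
nonuser_deltas = [7, 5, 11, 6, 8, 4]

t_stat, p_value = stats.ttest_ind(user_deltas, nonuser_deltas)
print(f"t = {t_stat:.2f}, P = {p_value:.3f}")  # P < 0.05 => significant
```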

VII. SAMPLE OF EVALUATION FINDINGS

Extensive qualitative and quantitative data were obtained from this investigation; the full text of the evaluation report is available from the StressAlyzer Web site [17]. Here we offer a sample of some of the findings.

To reiterate, by the end of the semester, each student had completed a mandatory courseware assignment and had had an opportunity to use other courseware modules as desired. One measure of students’ view of the utility of the courseware was their answer to the question “How did the courseware module(s) you used this semester affect your learning?” The responses are shown in Figure 7, where it can be seen that a significant majority (88 percent) found the courseware to be slightly or very helpful to their learning. One can supplement this measure with the finding that 48 percent of the students voluntarily used one or more additional modules to aid their studies, beyond the time spent completing their mandatory courseware assignment.

When asked to describe the ways in which their study habits had been changed (for better or for worse) by the use of courseware, 103 students (32 percent of those in the study) indicated changes in either the physical procedures they used for study, or the ways in which they thought about mechanics of materials problems, or both. Analysis of student responses indicated that students used the courseware to:

* drill and review for exams (n = 18/103);

* serve as a reference for questions about material (n = 14/103);

* practice problem-solving, in preparation for doing regular homework assignments (n = 11/103); and

* self-diagnose trouble spots in regular problem-sets, using hints and solutions for similar problems in the courseware (n = 8/103).

In addition, students cited a variety of reasons why they found the courseware to be effective:

* forced mastery of basic concepts (n = 49/103);

* taught how to approach problems systematically (n = 19/103);

* boosted confidence via practice with instant feedback (n = 18/103);

* helped them visualize problems (n = 13/103);

* made learning an active rather than passive process (n = 11/103);

* gave them an intuitive sense of what the numbers meant (n = 10/103);

* offered an alternative perspective on the topic (n = 9/103);

* showed how to construct graphs and diagrams (n = 7/103); and

* encouraged them to think about the problem, rather than simply “crunching numbers” to get an answer (n = 3/103).

For example, regarding the forced mastery of basic concepts, we quote: “Courseware had me repeat a problem until I had the correct answer. Frustrating but very effective. Made ideas and procedures stick with you throughout the course,” and “It was good that we can’t skip over the topics, which means we understand the topic fully before advancing.”

Regarding the confidence boosted by practice with feedback, we quote: “Well, for the 1st 2 tests, I studied a total of about ten minutes. I got a 92 and a 90. I attribute my confidence and understanding largely to the Axial module (at least for the 2nd test). For this 3rd test, I’m not finished studying, but I’m planning on just doing the Twist & Bend modules, and the ‘Practice Question #3,’ and that should be enough,” and “…I felt more confident on the exam because I had done many problems and I had instant feedback (unlike homework or even typical studying).”

Based on the learning objectives established for the module on drawing shear force and bending moment diagrams, the problem shown in Figure 8 was chosen for pre/post-tests. In general, we sought to determine whether students could: (i) get the general form of each of the diagrams correct; (ii) deduce values for the M and V at the ends, given the support reactions; (iii) determine the quantitative changes in the values of M and V at key points.

The assessment rubric shown in Table 1 was used to evaluate the pre/post-tests, with user and non-user tests anonymized and intermixed before grading. The means and standard deviations for users and non-users on each learning objective are shown in Table 2, along with the P-value from a t-test. It can be seen that the difference in performance between users and non-users was statistically significant for three of the six objectives. Similar evaluations were conducted for three other modules. For the modules dealing with axial loading and stress transformations, we found no statistically significant differences between users and non-users for any learning objectives. For the module dealing with internal forces and moments in three dimensions, and the associated stresses at various points in the cross-section, we found that the difference in performance between users and non-users was statistically significant for all objectives (users performing better than non-users).

In the course of the extensive evaluations and informal discussions with students, we have come to believe that the use of StressAlyzer courseware alone, as with any teaching method, is insufficient to meet the learning needs of all students, or even to meet all learning needs for any one student. Indeed, we believe that optimal results are achieved when the courseware is used in combination with, rather than in lieu of, classroom instruction and regular, pencil-and-paper problem sets.

For many students, writing on paper is critical to the learning process. For example, students wanted practice separating a body at a section and actually drawing a free body diagram on paper. Also, students often benefit from doodling and figuring as they work. When using the courseware alone, some students found themselves encouraged to do too much problem solving in their heads, where they were more prone to err. When this habit, or some confusion about the interface, led students to make a series of repeated errors, frustration occasionally caused them to guess random answers rather than to diagnose the problem. The courseware also posed some difficulty for students who needed a written record of all the steps they had completed to keep track of what they were doing in a problem. Using the courseware in conjunction with traditional assignments, as explained at the outset, is also necessary if students are to work on problems that depict a variety of applications, something the courseware was never designed to do.

When surveyed about implementation, faculty (other than the developer) who used the courseware in their classes (during the development and test periods) reported that the time investment required of them to incorporate the courseware was relatively small. On average, faculty spent 5.6 hours (total) using the modules, determining how they worked and how they would use them in their courses. On average, faculty spent 1.8 hours (total) helping students to use the courseware. Finally, when asked to what extent they needed to modify existing syllabi, lesson plans, assignments, or grading policies, in order to include the courseware, most faculty reported that they made only slight changes, or none at all.

VIII. CONCLUSIONS

Problem solving courseware modules have been developed to enrich the homework experience in mechanics of materials courses. While textbook problems typically offer small doses of a wide range of configurations, the courseware takes a complementary approach of enabling students to work with a limited set of configurations in great depth. Problem solving activities in the courseware take advantage of the unique capabilities of the computer, in particular, the ability to generate an unlimited number of similar practice problems (or segments thereof), to provide instantaneous, personalized feedback, and to use graphics and animation to clarify hard-to-visualize phenomena.

Based on extensive assessment activities, we found that the overwhelming majority of student users were enthusiastic about using the courseware and viewed it as beneficial to their learning. Perhaps most telling, nearly half of the students chose to use the courseware beyond the required assignment. Analysis of survey responses revealed a number of reasons why students found the software beneficial, with many of those reasons common to a number of students. The most frequently cited reason was the forced repetition, with immediate feedback on answers, until students could solve problems successfully on their own. Quantitative data on educational outcomes derived from pre- and post-tests indicated greater learning gains for users compared to non-users in most cases. However, the differences were statistically significant for only some of the designated educational outcomes.

While we are skeptical of claims that the computer can effectively take over virtually all aspects of the educational experience to the exclusion of contact with instructors and colleagues, we do believe that the computer can be an extremely potent tool for very specific educational purposes. The potential of the computer to offer new kinds of problem solving/learning experiences is only just being uncovered.

ACKNOWLEDGMENTS

We are grateful to Marina Pantazidou for her valuable comments and suggestions on this paper. The development of the courseware StressAlyzer and its assessment has been supported by the National Science Foundation under grant DUE-9950938, by a Philip L. & Marsha Dowd Fellowship, by the Westinghouse Foundation, by the Center for Innovation in Learning, and by Carnegie Mellon University.

REFERENCES

[1] Flori, R.E., M.A. Koen, and D.B. Oglesby, “Basic Engineering Software for Teaching (BEST) Dynamics,” Journal of Engineering Education, Vol. 85, 1996, pp. 61-67.

[2] Kuznetsov, H., “Innovative Multimedia Instruction and Sophisticated Problem-Solving Exercise and Testing in Engineering Statics and Structural Planning,” Computer Applications in Engineering Education, Vol. 4, No. 1, 1996, pp. 61-66.

[3] Holzer, S.M., and R.H. Andruet, “Experiential Learning in Mechanics with Multimedia,” International Journal of Engineering Education, Vol. 16, No. 5, 2000, pp. 372-384.

[4] Gramoll, K.C., “Interactive Beam Analysis Program for Engineering Education,” Computer Applications in Engineering Education, Vol. 1, No. 6, 1993, pp. 469-476.

[5] Cooper, S.C., and G.R. Miller, “A Suite of Computer-Based Tools for Teaching Mechanics of Materials,” Computer Applications in Engineering Education, Vol. 4, No. 1, 1996, pp. 41-49.

[6] Rossow, M.P., “An Interactive Program for Teaching Stress Transformation with Mohr’s Circle,” Computers in Education Journal, Vol. 6, No. 4, 1996, pp. 42-46.

[7] Rossow, M.P., “The Mohr’s Circle Suite of Programs,” Computers in Education Journal, Vol. 7, No. 3, 1997, pp. 37-42.

[8] Rossow, M.P., “V and M-Computerized Exercises for Shear and Bending-Moment Diagrams,” Computers in Education Journal, Vol. 10, No. 3, 2000, pp. 46-52.

[9] Philpot, T.A., “MDSolids: Software to Bridge the Gap between Lectures and Homework in Mechanics of Materials,” International Journal of Engineering Education, Vol. 16, No. 5, 2000, pp. 401-407.

[10] Staab, G., and B. Harper, “Use of Computers in Mechanics Education at Ohio State University,” International Journal of Engineering Education, Vol. 16, No. 5, 2000, pp. 394-400.

[11] Laurillard, D., Rethinking University Teaching: A Framework for the Effective Use of Learning Technologies, Routledge, London, 1993.

[12] Flori, R.E. Jr., “Perspectives on the Role of Educational Technologies,” Journal of Engineering Education, Vol. 86, 1997, pp. 269-272.

[13] Reed, S.K., Cognition: Theory and Applications, 5th ed., Wadsworth Publishing Company, Belmont, California, 2000.

[14] Larkin, J.H., J. McDermott, D.P. Simon, and H.A. Simon, “Expert and Novice Performance in Solving Physics Problems,” Science, Vol. 208, 1980, pp. 1335-1342.

[15] Chi, M., P. Feltovich, and R. Glaser, “Categorization and Representation of Physics Problems by Experts and Novices,” Cognitive Science, Vol. 5, 1981, pp. 121-152.

[16] Anderson, J.R., and K.A. Gluck, “What Role Do Cognitive Architectures Play in Intelligent Tutoring Systems?” in Cognition and Instruction: Twenty-five Years of Progress, ed. Carver, S.M., and D. Klahr, Erlbaum, Mahwah, NJ, 2001.

[17] StressAlyzer Web site, accessed May 6, 2003.

[18] Steif, P.S., StressAlyzer, Brooks/Cole, Pacific Grove, California, 2003.

[19] Patton, M.Q., Qualitative Research & Evaluation Methods, 3rd ed., Sage Publications, Thousand Oaks, California, 2002.

PAUL S. STEIF

Department of Mechanical Engineering

Carnegie Mellon University

LARISA M. NAPLES

Educational Evaluation Consultant

AUTHORS’ BIOGRAPHIES

Paul S. Steif is Professor of Mechanical Engineering at Carnegie Mellon University. He holds degrees from Brown University (Sc.B., 1979) and Harvard University (M.S., 1980; Ph.D., 1982). Currently, his research is aimed predominantly at developing new educational materials and approaches for improving student learning in engineering. He served as the principal investigator on the project in which the courseware described here was developed.

Address: Department of Mechanical Engineering, Carnegie Mellon University, Schenley Park, Pittsburgh, PA, 15213; telephone: 412-268-3507; fax: 412-268-3348; e-mail: steif@cmu.edu.

Larisa M. Naples is an independent educational evaluation consultant with ten years’ experience designing, teaching, and evaluating educational programs. She specializes in math, science, and engineering education, with an emphasis on assessing educational outcomes and diagnosing the reasons behind educational successes and failures. Dr. Naples holds degrees in materials engineering, science education, and engineering and public policy. She is an active member of the American Evaluation Association.

Address: 2533 Lincoln Avenue, Belmont, CA, 94002; telephone: 650-637-1298; e-mail: evaluations@larisanaples.org

Copyright American Society for Engineering Education Jul 2003
