The Relationship between PDA Usage and Student Performance in an Introductory Engineering Course

Doolen, Toni L


This research focuses on the development of a methodology to evaluate student attitudes towards technology in the classroom and the impact of this technology on student learning. A survey was developed and tested to evaluate the impact of introducing Personal Digital Assistants (PDAs) in a traditional college classroom setting. PDAs were introduced in an introductory course in the College of Engineering at Oregon State University. A reliable attitude assessment tool was developed as a result of this research. Initial results of this study also provide empirical evidence that engineering students respond favorably to the introduction of PDAs in a traditional classroom setting. Preliminary results also provide limited evidence that student attitudes may vary based on gender, age, and/or ethnicity. Standard student performance metrics (course assignment and exam scores) and student self-evaluations were used to assess the impact on student learning and are discussed.


I. Introduction

Research examining the impact of hand-held computers or Personal Digital Assistants (PDAs) on both teaching and learning in the classroom is scarce. The absence of this type of research is due, in part, to the emergent nature of the technology. Not surprisingly, much of the literature is centered on issues associated with application development and implementation. Recent reports summarizing the use of PDAs in the classroom highlight the need for research assessing both the impact of the technology on student learning and its impact on teaching methods. Some educators see PDAs as a viable alternative to either supplement or augment the lab portion of a particular learning experience [1].

Shotsberger and Vetter discuss a cooperative effort between the University of North Carolina at Wilmington, Pearson Education (Prentice Hall), and Hypercube to facilitate learning of abstract scientific and mathematical concepts [2]. Their work describes the applications and the potential for using mobile wireless technology to overcome both facility and budget limitations. Kabara, Krishnamurthy, and Weiss report on the use of handheld wireless computers in the School of Information Sciences at the University of Pittsburgh [3]. In the classroom, students can access electronic libraries, Internet databases, personal files, real-time data, and laboratory equipment. While PDAs provide mobility for students, they also present educators with a significant challenge: determining how to effectively integrate these tools with other educational resources. Avanzato reports on using PDAs in classes at Penn State Abington College [4]. The college has used PDAs donated by 3COM in an introductory Information Systems course, in robotics course projects, in testing software for a course in French grammar, and in a digital design class. Early results support the use of PDAs to facilitate active and collaborative learning in the classroom. Ray and MacFadden suggest that PDAs can provide a vehicle for reducing the amount of paper and for centralizing assignment information, both of which can benefit students [1]. A recent report by Cook details the development of an application to enable Web-based testing using PDAs [5]. The author concludes that developing on-line testing is difficult. On-line testing with PDAs does, however, provide a vehicle by which educators at any level can integrate technology into the classroom.

Previous research can indicate when and where mobile wireless technology has the greatest opportunity to positively impact student learning. Pownell and Bailey, for example, suggest that PDAs are only effective when they support how instructors use information in their classrooms [6]. Soloway indicates that PDAs “support cycles of doing and reflecting” [7], finding that they encourage instructors and students to review their written work more effectively and frequently. Ray and MacFadden and Soloway found that PDAs provide more flexibility in managing classroom assignments and creating instructional plans [1, 7]. Soloway also reported an increase in the quality of written work when students were able to “beam” drafts of their work to the instructor and engage in reflective discourse [7].

The use of mobile wireless PDAs in the classroom provides educators with many potential ways to innovate in both curriculum development and class management. A critical question underlying these innovations is how to determine their impact on students. Further research is a necessary step in providing educators with the foundational knowledge needed to determine how best to integrate technology into their classrooms. In addition to understanding the overall impact of this technology on education, there is also a need to understand whether all students respond similarly to this type of classroom innovation. This research seeks to develop a reliable instrument to assess student attitudes towards mobile wireless PDAs in a traditional classroom. The study also seeks to determine whether various student populations respond similarly to the introduction of this technology in the classroom.


II. Methodology

A. Survey Development

Student attitudes towards classroom innovations are one important factor to consider in assessing the impact on student learning. A survey was developed to measure student attitudes toward both PDAs and the usage of PDAs in the classroom. The survey was designed to assess attitudes towards PDAs in six areas: confidence, liking, anxiety, enthusiasm, general usefulness, and usefulness in the classroom. A variety of research has been undertaken to study both teacher and student attitudes toward the integration of information technology in the classroom. As a result, items from previously published surveys evaluating similar constructs were used wherever possible.

Loyd and Gressard developed a Computer Attitude Scale (CAS) to measure four constructs related to the use of information technology, such as computers, in education [8]. Twenty-two items from the CAS were included in the survey developed for this research. The language of some of these items was modified to match the targeted population (pre-engineering students likely to have previous computer exposure). The items taken from the CAS were also modified to refer to PDAs and the use of PDAs rather than computers. The items from the CAS were supplemented with thirteen items developed by the authors. Six of these items were developed to assess student perceptions of the usefulness of the PDAs in the context of the course and learning materials directly related to the course. The final set of items included in the survey focused on assessing the level of enthusiasm that students had for using the PDAs. The enthusiasm scale was developed to determine the extent to which students responded positively towards having the opportunity to use PDAs. The items for the enthusiasm scale were based on the Teachers’ Attitudes Toward Computers (TAC) questionnaire developed by researchers at the Texas Center for Educational Technology [9]. TAC items were also modified to refer to PDAs rather than to computers. All items are summarized in Table 1.

A five-point Likert scale (strongly disagree, disagree, neutral, agree, strongly agree) was used for all survey items except for demographic information (gender, age, and ethnicity). Responses were coded so that a higher score indicated a lower level of anxiety and a higher level of confidence, liking, enthusiasm, and usefulness. In addition to the six dependent variables tested, independent variables including gender, age, ethnicity, and plans to continue in engineering were also collected using the same survey. The final survey included a total of 43 items. Survey items were assigned a random number to determine the ordering of items in the survey. No student identification information was included on the survey.
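The coding scheme just described can be sketched as follows. The item responses and the number of items are hypothetical, not drawn from the study's data; only the reverse-coding rule itself reflects the text above.

```python
# Sketch of the Likert coding described above (illustrative data only).
# On the 5-point scale (1 = strongly disagree, 5 = strongly agree),
# negatively worded items -- such as the anxiety items -- are reverse-coded
# so that a higher score always indicates a more favorable attitude
# (here, a lower level of anxiety).

def reverse_code(response: int, scale_max: int = 5) -> int:
    """Reverse-code a Likert response: 1 <-> 5, 2 <-> 4, 3 stays 3."""
    return scale_max + 1 - response

# One hypothetical respondent's answers to three anxiety items.
anxiety_raw = [2, 1, 3]                       # agreement with anxiety statements
anxiety_coded = [reverse_code(r) for r in anxiety_raw]

print(anxiety_coded)                          # [4, 5, 3] -- higher = less anxious
scale_score = sum(anxiety_coded) / len(anxiety_coded)
print(round(scale_score, 2))                  # 4.0
```

Positively worded items (confidence, liking, enthusiasm, usefulness) would be left as recorded; only the negatively worded items pass through `reverse_code`.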

B. Data Collection

A freshman-level engineering course at Oregon State University was used as a test vehicle to gain a better understanding of the impact of incorporating wireless PDAs into the classroom. The course is the second in a two-term series designed to introduce engineering students to basic engineering concepts. Students typically take the course during the second or third quarter of the freshman year. The overall emphasis of the course is engineering problem solving using computers. The course consists of two 50-minute lectures and one two-hour lab. Lectures are held in a traditional auditorium-style seating environment.

Students utilized PDAs during the lecture portion of the course. Students were given a PDA at the beginning of approximately eight class sessions (out of a total of 20 scheduled sessions) during the term. Students followed the instructor-led discussion of specific problems relevant to the particular topic being discussed. Students solved example problems on the PDAs. For example, during the class sessions covering engineering economics, students would be given spreadsheets containing cash flows for engineering projects and would use the PDAs to determine internal rates of return and net present values. The instructor developed all example problems used during the course. All example problems required Microsoft Excel and its associated functions.

The course enrollment at the time of the study was 72 students. All students enrolled in the course were given the opportunity to complete the attitude survey at the end of the ten-week term. The survey was given to students during regularly scheduled class hours.

C. Student Achievement Metrics

One method to study the effect of PDAs on instruction is to compare the performance of students using student work samples. Student performance in the targeted class was compared to student performance in a previous offering of the course when PDAs were not used. Two different data sets were utilized to provide comparative measures of student achievement. The first achievement metric was student performance on the midterm exam completed by students in the fifth week of the course. PDAs were used in the targeted course only during the first half of the term, making the midterm the best overall achievement metric to analyze for this research project. A second measure of student achievement was student self-assessment data for course learning objectives. Three specific learning objectives were related to solving problems using Microsoft Excel. Since the PDAs were used to formulate and solve problems in Excel, these learning objectives were most relevant to assessing student achievement related to the use of PDAs. Student achievement measures were tested against the same performance measures taken from students completing the course in the spring term of the previous year. Independent samples t-tests were performed to compare differences between the two sets of students for both achievement measures.
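The comparison described above can be sketched as a pooled-variance independent samples t-test. The score lists below are hypothetical stand-ins for the actual exam data, which are not reproduced here.

```python
import math

# Minimal sketch of the pooled-variance independent samples t-test used to
# compare the two course offerings. Scores are made up for illustration.

def independent_t(sample_a, sample_b):
    """Return (t, df) for a pooled-variance two-sample t-test."""
    n1, n2 = len(sample_a), len(sample_b)
    m1 = sum(sample_a) / n1
    m2 = sum(sample_b) / n2
    v1 = sum((x - m1) ** 2 for x in sample_a) / (n1 - 1)   # sample variances
    v2 = sum((x - m2) ** 2 for x in sample_b) / (n2 - 1)
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)  # pooled variance
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# Hypothetical midterm scores: experimental (PDA) term vs. control term.
pda_term = [88, 92, 85, 90, 87, 91]
control = [80, 84, 78, 83, 81, 79]

t, df = independent_t(pda_term, control)
# With moderate df, |t| much greater than about 2 suggests a difference at
# the 0.05 level; an exact p-value requires the t-distribution CDF.
print(round(t, 2), df)
```

In practice one would use a statistics package for the p-value; the sketch only shows where the test statistic comes from.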

D. Sample Description

The student demographics for the comparative course in the previous spring term were quite similar to the demographics of students enrolled in the targeted course, as shown in Table 2. While the exact same midterm exam was not used in both terms, the exams were structured similarly. In addition, students used the same computer laboratory facilities, a similar overall syllabus, and the same textbook for both terms being compared. The most significant uncontrolled variables were the instructor and the graduate teaching assistant. Although the instructor was not the same, both instructors used the same materials for lectures as well as the same lecture structure and format.


III. Results

A. Attitude Measures

Fifty students completed surveys, corresponding to a response rate of 69 percent. The number of surveys used for determining average scale values varied slightly due to incomplete surveys. If a student did not complete all items for a scale, the data for that scale were not used in any results evaluation or analyses. The means and standard deviations for the six scale scores, as well as correlations among scales, are summarized in Table 3. Reliability coefficients (Cronbach’s alpha) were calculated for each scale and are also summarized in Table 3. A Cronbach’s alpha greater than 0.7 is the value often used to accept a scale as reliable. Based on this criterion, reliabilities were acceptable for all scales.
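The reliability computation behind Table 3 can be sketched as follows, using a small hypothetical response matrix (rows are respondents, columns are the items of one scale); the study's raw responses are not reproduced here.

```python
# Sketch of Cronbach's alpha for one scale, with made-up Likert responses.

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(rows[0])                     # number of items in the scale

    def var(xs):                         # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in rows]) for i in range(k)]
    total_var = var([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Five hypothetical respondents answering a four-item scale.
responses = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
]

alpha = cronbach_alpha(responses)
print(round(alpha, 3))
# The 0.7 rule of thumb mentioned above: alpha >= 0.7 -> acceptable.
print(alpha >= 0.7)
```

Internally consistent items (respondents who score high on one item score high on the others) push the total-score variance up relative to the item variances, which is what drives alpha toward 1.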

The overall response by students was quite favorable. An average score over 3.0 indicates that students felt fairly confident about their ability to use PDAs, were enthusiastic, and enjoyed using the PDAs. Student anxiety received the highest average score, indicating that overall, the PDAs did not engender high levels of anxiety for most students (note: high ratings on the anxiety scale indicate a low level of anxiety). Responses to both sets of items focused on the usefulness of the PDAs were also indicative of a positive student attitude.

Having determined that the survey demonstrates acceptable levels of reliability, attitude data were further analyzed using independent samples t-tests. The focus of this analysis was to determine whether different student populations responded similarly to the use of wireless PDAs in the classroom. The independent variables were gender, age, and ethnicity. The dependent variables were confidence, liking, anxiety, enthusiasm, general usefulness, and usefulness in the course where the PDAs were introduced. Levene’s test for equality of variances was also completed for all subgroups in each analysis. There was no evidence of unequal variances except for the course-related usefulness scale when the sample was stratified by gender.
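The variance check described here can be sketched with the mean-centered form of Levene's test. The group data are hypothetical, and only the W statistic is computed, since a p-value requires the F-distribution CDF.

```python
# Sketch of Levene's test for equality of variances (mean-centered form):
# a one-way ANOVA F statistic computed on |x - group mean|. Data are
# illustrative, not the study's survey responses.

def levene_w(groups):
    """Return Levene's W statistic for a list of samples."""
    # Absolute deviations from each group's own mean.
    z = []
    for g in groups:
        m = sum(g) / len(g)
        z.append([abs(x - m) for x in g])
    k = len(z)
    n = sum(len(g) for g in z)
    grand = sum(sum(g) for g in z) / n
    z_means = [sum(g) / len(g) for g in z]
    between = sum(len(g) * (zm - grand) ** 2 for g, zm in zip(z, z_means))
    within = sum(sum((x - zm) ** 2 for x in g) for g, zm in zip(z, z_means))
    return ((n - k) / (k - 1)) * (between / within)

# Hypothetical scale scores for two subgroups; the second is more spread out.
group_a = [4.1, 3.8, 4.4, 3.9, 4.2]
group_b = [3.5, 4.3, 3.0, 4.6, 3.1]

w = levene_w([group_a, group_b])
print(round(w, 2))   # a large W is evidence of unequal variances
```

When Levene's test flags unequal variances, as it did for the course-related usefulness scale by gender, the t-test would typically switch to the unequal-variance (Welch) form.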

Subgroup means and t-test results by gender (male or female) are summarized for each scale (confidence, liking, anxiety, enthusiasm, general usefulness, and usefulness in the course) in Table 4. Although no significant differences in attitude were found for any of the scales, the average scores for women were lower than the average score for men for all six scales. Additional research with a larger sample size may be required to find significant differences, should they exist.

Subgroup means by age (23 years old or younger vs. older than 23) and t-test results are summarized in Table 5. Even though the number of non-traditional students (those who reported being over 23 years old) was small (n = 3), the older students were less enthusiastic about the PDAs than their younger classmates. The resulting t-value for a difference in means was significant at the 0.05 level. Additional research, including a larger sample size and focus groups, is needed to confirm this finding and to gain a better understanding of the reason for the observed difference in attitudes.

Subgroup means by ethnicity and t-test results are summarized in Table 6. The ethnicity categories specified by the university were used on the survey. The number of students who identified their ethnicity as white, European American, or non-Hispanic was 27. Only eight students self-identified in the other ethnicity categories. Independent samples t-tests provide evidence of a significant difference in means between the 27 white, European American, or non-Hispanic students and the eight remaining students on the confidence scale (p < 0.05).

B. Achievement Measures

Midterm and student self-assessment data for course learning objectives for the control course taught in Spring 2001 and the experimental course taught in Winter 2002 are summarized in Table 7. Independent samples t-tests were completed. Significant differences were found for both the midterm exam score and the learning objective ratings. No other statistically significant relationships were identified. Since the instructor was not the same for the control and experimental offerings of the course, it is not possible to rule out alternate explanations for the observed differences in achievement metrics. However, there is evidence to suggest that the use of PDAs in this particular course may be related to student achievement as measured by a midterm exam and by student self-assessments of their understanding of specific materials called out in course learning objectives. Additional research, utilizing a more controlled design, would be necessary to provide stronger evidence that the introduction of technology in the classroom can lead to improved student learning as measured by student performance on exams.


IV. Conclusions

A reliable tool for assessing student attitudes towards the integration of PDAs in a freshman-level engineering course was developed. The resulting tool is a first step in providing instructors an avenue to assess student attitudes toward the introduction of PDAs into traditional college-level classrooms. In general, the results of this initial study suggest that students in the course studied had positive attitudes towards PDAs when utilized in a traditional classroom setting. On all scales, the mean scores were between 3.64 and 4.08 on a five-point scale where a score of 3.0 would indicate a neutral attitude. Age was found to be significantly related to enthusiasm. This is a particularly notable result because such a small number of students self-identified as being older than 23 years. Ethnicity was found to have a statistically significant relationship with confidence. This finding cannot be distinguished from other potential confounding sources such as computer experience or the socio-economic background of the subgroup. Although no statistically significant relationships were identified between gender and attitudes, the means do trend lower for women on all six scales. Previous research on the role of gender in computer attitudes has been mixed. The results of this study suggest that additional research may be necessary to confirm or rule out a relationship between attitudes and gender.

Student performance metrics also supported a relationship between the introduction of PDAs in a traditional classroom and improved student performance, as measured by scores on a midterm exam and student self-evaluations of learning objectives directly tied to the portions of the course in which PDAs were used. Because this research was carried out in an actual classroom setting, uncontrolled variables (such as the instructor or the actual exam used) may provide alternate explanations for the differences seen. However, these findings do point to the need for further study to provide a deeper and broader understanding of the potential relationships between the introduction of technology in the classroom and student performance.


Acknowledgments

The authors wish to acknowledge the support of the Hewlett-Packard Company in providing the equipment resources and funding necessary for this research. This study was also funded in part by the Tuthill Scholarship Program at Oregon State University.


References

[1] Ray, B., and A. MacFadden, “PDAs in Higher Education: Tips for Instructors and Students,” Journal of Computing in Higher Education, vol. 13, no. 1, Fall 2001, pp. 110-118.

[2] Shotsberger, P., and R. Vetter, “Teaching and Learning in the Wireless Classroom,” Internet Watch, March 2001.

[3] Kabara, J., P. Krishnamurthy, and M. Weiss, “Use of Wireless Computers in the Undergraduate and Graduate Classroom,” ASEE/IEEE Proceedings-Frontiers in Education, 30th Annual Conference, 2000, vol. 2, Session F2D.

[4] Avanzato, R., “Student Use of Personal Digital Assistants in a Computer Engineering Course,” ASEE/IEEE Proceedings-Frontiers in Education, 31st Annual Conference, 2001, vol. 2, Session F1B.

[5] Cook, R.P., “The National Classroom Project-An Experience Report,” ASEE/IEEE Proceedings-Frontiers in Education, 30th Annual Conference, 2000, vol. 1, Session T1E.

[6] Pownell, D., and G.D. Bailey, “The Next Small Thing: Handheld Computing for Educational Leaders,” Learning and Leading with Technology, vol. 27, no. 8, 2000, pp. 46-49, 59.

[7] Soloway, E., “Supporting Science Inquiry in K-12 Using Palm Computers: A Palm Manifesto,” Center for Highly-Interactive Computing in Education, 2000.

[8] Loyd, B.H., and C.P. Gressard, “Gender and Amount of Computer Experience of Teachers in Staff Development Programs: Effects on Computer Attitudes and Perceptions of the Usefulness of Computers,” AEDS Journal, vol. 18, Summer 1986, pp. 302-311.

[9] Knezek, G., and R. Christensen, “The Teachers’ Attitudes Toward Computers Questionnaire,” Texas Center for Educational Technology (TCET), University of North Texas, 1998.

[10] Lawton, J., and V. Gershner, “A Review of the Literature on Attitudes towards Computers and Computerized Instruction,” Journal of Research and Development in Education, vol. 16, no.1, 1982, pp. 50-55.


Department of Industrial and Manufacturing Engineering

Oregon State University


Department of Industrial and Manufacturing Engineering

Oregon State University


Department of Science and Mathematics Education

Oregon State University


Dr. Toni L. Doolen is an Assistant Professor in the Industrial and Manufacturing Engineering Department at Oregon State University. She teaches courses in manufacturing, management systems engineering, human factors engineering, and industrial engineering. Dr. Doolen received a B.S. in Material Science and Engineering and a B.S. in Electrical Engineering from Cornell University in 1987. She received an M.S. in Manufacturing Systems Engineering from Stanford University in 1991. She received her Ph.D. in Industrial Engineering from Oregon State University in 2001. Her research is focused on manufacturing systems design, lean manufacturing, work group effectiveness, mobile technology in education, error management and reduction, and survey design and methodology.

Address: 118 Covell Hall, Oregon State University, Corvallis, Oregon, 97331-2407; telephone: 541-737-5641; e-mail:

Jim Hoag is a doctoral student in the Department of Science and Mathematics Education at Oregon State University. Mr. Hoag received his B.S. in Chemistry from Framingham State College. He received an M.S. in Mathematics from Northern Arizona University in 1986 and an M.S. in Computer Science from Oregon State University in 1997. Mr. Hoag taught in the Computer Science department at Western Oregon University from 1989 until 2001, serving as Director of Computer Services from 1997-2000. He is currently a visiting instructor of Computer Science at Colby College in Maine. His research interests are technology use in instruction and Computer Science education.

Address: 404 Mudd, Colby College, Waterville, Maine, 04901; e-mail:

Dr. J. David Porter is an Assistant Professor in the Industrial and Manufacturing Engineering Department at Oregon State University. He teaches courses in information systems engineering and industrial engineering. Dr. Porter received his B.S. in Mechanical Engineering from the Universidad Autonoma de Nuevo Leon (Mexico) in 1991 and his M.S. in Manufacturing Systems from the Monterrey Institute of Technology (Mexico) in 1994. He also received his M.S. and Ph.D. in Industrial Engineering from the University of Pittsburgh in 1999 and 2000, respectively. Dr. Porter’s research interests include automatic identification and data capture, intelligent transportation systems, supply chain engineering, wireless communications, and manufacturing systems engineering.

Address: 118 Covell Hall, Oregon State University, Corvallis, Oregon, 97331-2407; telephone: 541-737-2446; e-mail: david.

Copyright American Society for Engineering Education Jul 2003
