Teaching critical thinking online

Hermann Astleitner

Critical thinking is a higher-order thinking skill which mainly consists of evaluating arguments. It is a purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanations of the evidential, conceptual, methodological, or contextual considerations upon which the judgment is based. Since, for several reasons, critical thinking is not integrated within traditional classroom instruction, it is an interesting question whether critical thinking can be trained with computer-based instruction (CDROM- and web-based teaching). In its first part, this paper offers a narrative literature review on the effects of cognitive tools, collaborative computer-supported environments, computer simulations, and logic software on critical thinking. In its second part, two experimental studies are reported in which students were instructed in critical thinking by web lectures. In experiment 1, one group of students was confronted with audio recordings and another group with video recordings of a lecture dealing with non-formal errors in arguments. Within each group, about half of the students were presented with synchronous organizers (text, figures, etc. on MS PowerPoint slides), while the other half did not receive such organizers. Results showed that synchronous organizers influenced subjective evaluations of the learning process and outcome, and that the modality of the recordings influenced learning transfer, favoring the audio condition. In experiment 2, an audio web lecture with synchronous organizers was compared with traditional text-based instruction; results showed no differences in scientific analytic reasoning. The results are discussed in terms of cognitive and motivational principles of multimedia learning.


In the era of the Internet and of the information society, “critical thinking” represents a major qualification. In general, critical thinking is a mental activity of evaluating arguments or propositions and making judgments that can guide the development of beliefs and the taking of action. Gilster (1997, p. 87) regarded critical thinking as the most important skill when using the Internet, because the Internet is full of false, incomplete, or obsolete information. Reinmann-Rothmeier and Mandl (1998, p. 33) found in a Delphi study that experts from business and education nominated critical thinking as the most important skill in knowledge management. Kraak (2000, p. 51) saw critical thinking as “an important, perhaps the most important of all present time educational tasks”. Hidden within these superlatives is the (not new) appeal to schools to educate “critical students” (Lang, McBeath, & Hebert, 1995). For achieving this complex goal, schools and teachers have to be assisted by educational theory and research.

Educational research has shown that critical thinking is firmly anchored within curricula and related taxonomies of teaching goals, but that it is not supported and taught systematically in daily instruction (Patry, 1996, p. 63). The main reasons for this shortcoming are that teachers are not educated in critical thinking, that no textbooks on critical thinking are available (especially in most European countries), and that teachers lack the time and other instructional resources to integrate critical thinking into their daily instruction (Astleitner, 1998, 2000a; Petri, 2000). This shortcoming matters, because critical thinking is highly correlated with students’ achievements. Frisby (1992) reported correlation coefficients of about .40 with the US school achievement test (SAT). Yeh and Wu (1992) found similar correlation coefficients with other standardized school achievement tests and with grades. Very high correlation coefficients ranging from .45 to .47, or effect sizes larger than 1, were reported for mathematics and science instruction. These correlations have to be considered in educational research, even though they can be explained to some degree by the moderating effect of students’ intelligence. But, for example, in recent attempts to cope with the shock of the TIMSS (Third International Mathematics and Science Study), these correlations between critical thinking and school achievement were not given significant consideration (see, for example, the German Science Foundation project “Quality of Schools”).
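The connection between the correlation coefficients and the effect sizes reported above can be made concrete with the standard conversion between Pearson's r and Cohen's d. The following is a minimal illustrative sketch (the function name is ours, and the sample values merely echo the magnitudes cited, not raw data from the studies):

```python
import math

def r_to_d(r):
    """Convert a Pearson correlation coefficient r to Cohen's d
    using the standard conversion d = 2r / sqrt(1 - r^2)."""
    return 2 * r / math.sqrt(1 - r ** 2)

# A correlation of about .40 (cf. Frisby, 1992) already corresponds to
# a large effect; correlations of .45-.47 exceed d = 1, consistent with
# the "effect sizes larger than 1" mentioned in the text.
for r in (0.40, 0.45, 0.47):
    print(f"r = {r:.2f}  ->  d = {r_to_d(r):.2f}")
```

This shows why correlations in the .40s, which may look modest at first glance, translate into substantial standardized mean differences.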

The neglect of critical thinking as an important source of improving school achievement took place even though many theoretical approaches to critical thinking have been developed. Noteworthy approaches came from Ennis (1962) concerning 12 specific “aspects”; from Beyer (1990) with 6 general “elements”; from Paul, Binker, Jensen, and Kreklau (1990) including 35 “dimensions”; from Clark and Biddle (1993) containing 4 “processes”; and finally from Jonassen (1996) including 15 “subskills”. But, at the moment, there is no theoretical approach which can integrate, at least to some degree, the given theoretical models, or evaluate their relevance for daily instruction. There is also no approach available which can delimit critical thinking from other higher-order thinking skills such as “creative thinking”, “problem solving”, or “decision making”.

Even though an integration of the existing theoretical approaches to critical thinking is still missing, it is nevertheless possible to describe what students (and teachers) have to know to be successful in critical thinking. For that purpose, a review of educational research on critical thinking by Dick (1991) can be used. This review results in a taxonomy of critical thinking and summarizes relevant research from the last forty years (see Table 1).

According to this taxonomy, critical thinking consists of identifying and analyzing arguments, of considering external influences on arguing, of scientific analytic reasoning, and of logical reasoning. As relevant synonyms for this definition of critical thinking, “everyday reasoning”, “informal reasoning”, or “pragmatic reasoning” have also been used (e.g., Galloti, 1989; Shaw, 1996; Walton, 1989). There is also a close connection to research from cognitive psychology, philosophy, and linguistics referring to “inductive reasoning”, “deductive reasoning”, “causal reasoning”, “abductive reasoning/inference”, “Bayesian reasoning”, “probabilistic reasoning”, “syllogistic reasoning”, “nonmonotonic reasoning”, “adaptive thinking”, or “intuitive judgment” (e.g., Chater & Oaksford, 1999; Cheng & Holyoak, 1985; Gigerenzer, 2000; Gigerenzer & Hoffrage, 1995; Josephson & Josephson, 1994; Klahr & Dunbar, 1988; McKenzie, 1994).

In the field of education and instruction, this kind of research and related approaches were used to develop programs for promoting thinking skills in students (e.g., Hager, 1995; Klauer, 1993; Mandl & Friedrich, 1992). But only very few of these programs realized a comprehensive “critical thinking program” in the way actually suggested by educational researchers and instructional designers (Halpern, 1998; Maiorana, 1992). Programs for promoting critical thinking should have the following features: 1) they should consider a disposition or attitude toward critical thinking; 2) they should regard critical thinking as a general skill that must be deepened within different subject matters or contexts; 3) they should offer a segmented and instructionally fully developed training in specific skills; 4) they should focus on all (or many) relevant subskills of critical thinking and integrate them; 5) they should include parts for stimulating the transfer of knowledge; 6) they should support meta-cognitive skills for assisting self-regulation activities; 7) they should not rely on formal or mathematical algorithms, but on everyday language problems; 8) they should train students over a period of several weeks or months; and 9) they should consider the organizational context of classroom instruction.

Many of the past programs for promoting critical thinking did not adequately consider these features, as shown by comprehensive literature reviews by Nickerson, Perkins, and Smith (1985), McMillan (1987), Pascarella and Terenzini (1991), and recently van Gelder (2000a). It is therefore not surprising that, according to these reviews, promoting critical thinking in daily instruction was not successful in the past despite considerable activities in research and practice. This failure was the main reason why a Delphi study was conducted in which 46 experts in research on thinking had to define critical thinking (APA, 1990). Based on this definition, and in order to stimulate unified research activities, several instruments for measuring different aspects of critical thinking were developed and validated: the “California Critical Thinking Skills Test (CCTST)” for measuring general critical thinking skills (Facione & Facione, 1992); the “California Critical Thinking Dispositions Inventory (CCTDI)” (Facione, Facione, & Giancarlo, 1992) for measuring relevant dispositions and attitudes; and the “Holistic Critical Thinking Scoring Rubric (HCTSR)” (Facione & Facione, 1994) for considering the subject matter or context in which critical thinking skills are applied. Whether this extraordinary channeling of research will be accepted by the scientific community will become apparent in future developments. But up to now, this line of research has been ignored within many well-founded research activities on critical thinking (e.g., Sa, West, & Stanovich, 1999).

Overall, it seems very difficult to implement critical thinking successfully in traditional classroom instruction. When traditional classroom instruction does not work, it is natural to ask for alternative classroom scenarios. In such scenarios, the teacher should be assisted by some additional help, or the students should be able to work on their own and thereby release the teacher from some duties. Such assisting and releasing functions can be realized by computer-based instruction, especially CDROM- and Internet-based instruction. CDROM- and Internet-based instruction has been shown to be successful for learning in general and for lower-order thinking skills in several literature reviews (Astleitner, 2000b; Dillon & Gabbard, 1998; Saba, 2000). But such reviews have not yet been conducted for higher-order thinking skills like critical thinking. It is an open question whether CDROM- and Internet-based instruction can successfully promote critical thinking in daily instruction. Trying to answer this question in a narrative literature review is the central task of the first section of this paper.

State-of-the-Art: Can CDROM- and Internet-based Instruction Promote Critical Thinking?

In order to answer this question, different types of general instructional functions which can be delivered by new media (i.e., CDROM or Internet) have to be differentiated. There are three different types: a) “new media without any instructional functions”: the media itself represents the content or can act as a tool for solving given tasks, but the teacher remains the central instructor within the classroom; b) “new media with indirect instructional functions”: the media can deliver some instructional functions (e.g., presenting the content), but the teacher controls and evaluates the learning process; and c) “new media with direct instructional functions”: the media can deliver many instructional functions and can also control the learning process.

New Media Without Any Instructional Functions: As Content or as Tool of Critical Thinking

Jonassen (1996) postulated that new media can be used as content (for teaching about new media) and as a tool (for problem solving) in order to stimulate and support critical thinking. Duffelmeyer (2000) pointed out that “criticizing technology” or “society in general” during daily instruction makes it possible to teach critical thinking and to use new media especially as the content of critical thinking. This approach is anchored within the research paradigm of “critical theory” and/or of “qualitative social research”, which means that no empirical data are available to prove the underlying assumptions (see URL http://www.critical-thinking.org). Therefore, from a social scientist’s point of view, this line of research has little to offer, although there is a relation to evaluation research focusing on “Internet quality”. Within this area of evaluation research, several comprehensive instruments for criticizing contents of the Internet were developed (e.g., Wilkinson, Bennett, & Oliver, 1997). But these instruments and their application have not yet been shown to promote critical thinking.

Reimann and Bosnjak (1998), however, delivered some empirical data about the efficiency of computer tools for critical thinking. They used hypertexts as a tool to stimulate and guide critical thinking. In their study, students had to criticize and expand an argument structure and had free access to a content-rich hypertext. Unexpectedly, however, using the hypertext did not improve critical thinking. The authors of this study concluded that it is not sufficient to offer content information, but that critical thinking has to be supported by carefully designed instructional activities. This assumption is also confirmed by a study by Glebas (1997), in which another computer tool, a spreadsheet, was found to be ineffective for critical thinking when not integrated within an instructional context. Scarce (1997) found that the use of Email–as a communication tool without any further instructional function–did not improve critical thinking in comparison with traditional classroom instruction. Santos and DeOliveira (1999) found similar non-significant results when using the Internet as a tool for content presentation.

Overall, it is no surprise that new media without any instructional functions cannot successfully promote critical thinking. Being critical about something (e.g., the Internet) and having some tools available (e.g., Email) does not at all guarantee critical thinking. Rather, critical thinking is a higher-order thinking skill that emerges only when students are trained in specific subskills and related instructional activities.

New Media With Indirect Instructional Functions: Collaborative Learning and Critical Thinking

Within traditional learning environments, positive effects of collaborative learning on critical thinking have been reported, in contrast to many other findings (e.g., Gokhale, 1995). These results are mainly due to the fact that carefully designed collaborative learning generally delivers many different points of view, and therefore many different learning experiences and multi-faceted learning support.

Newman, Johnson, Cochrane, and Webb (1996) compared a traditional course with a course in which an Internet-based discussion forum was used to assist collaborative learning. They found that using the discussion forum resulted in better critical thinking, because students had more learning materials available and related their arguments to each other more often. Overall, students in the discussion forum condition experienced more learning opportunities than students in the traditional course. Despite this remarkable result, the study tells nothing about the design of learning environments for promoting critical thinking. Bullen (1998) delivered more background knowledge about the design of learning environments based on students’ surveys about using an Internet-based discussion forum. A content analysis of students’ messages showed, however, that students did not acquire critical thinking. The author gave several reasons for this finding, but without testing them in a controlled setting. In particular, students regretted that there was no possibility to communicate with each other synchronously, because some arguments or problems needed immediate reactions in order to be clarified and carried forward into further discussions. Students found it disturbing that their messages were not organized in a hierarchical or chronological way, which would have made it easier for them to find a learning-relevant structure in the messages. Also, students missed specific instructional activities related to a defined teaching goal. Sloffer, Dueber, and Duffy (1999) implemented a synchronous and an asynchronous conferencing tool for promoting critical thinking which considered the suggestions given by Bullen (1998). In addition, they stimulated critical thinking by visualizing elements of the critical thinking process. For example, students had to assign symbols to their messages indicating important elements of critical thinking, like “hypotheses” or “evidence”.
The authors also implemented a mechanism by which only those students who had accomplished their own duties could read the messages of others. Finally, a human tutor had to motivate students. Results showed that many students delivered contributions with high-quality critical thinking content and that almost all students read the messages of the other students. However, this positive result has not been confirmed by comparable research studies.

DeLoach and Greenlaw (1999) analyzed the process of critical thinking by focusing on contributions to discussion forums. They found that critical thinking improved with respect to correctness, novelty, complexity, etc. over the course of the discussion process. This result was accomplished within an open learning environment (without, for example, a detailed learning goal), but it also showed that many of the students’ contributions were not related to the central issues of the discussion process. The authors were not able to state the cognitive, motivational, or emotional conditions that led to this finding. McLoughlin and Luca (2000) tried to shed light on this problem. They analyzed at least the cognitive processes and interactions identifiable within the contributions. They found that within a WebCT-based learning environment, students only exchanged their contributions without critically analyzing the contributions of others through examinations, revisions, or negotiations of significance. The following reasons for the absence of critical thinking were given: a) there was no learning guidance (through complex learning tasks); b) students did not receive instruction telling them to control their learning systematically on their own; and c) students did not handle social-emotional problems in a sensitive and responsible way, because they were afraid of hurting others with critical statements or of being hurt by others.

To sum up, the effect of collaborative learning with new media on critical thinking cannot yet be evaluated properly. The given results point to some instructional elements that can help to improve the situation, but these elements have not yet been tested within controlled research. When using this type of new media for promoting critical thinking, one has to be aware that collaborative learning has to be enhanced by specific critical thinking tasks and that learning in such environments has to be managed comprehensively with respect to time, group meetings, etc. Overall, the state of the art of research on collaborative learning, new media, and critical thinking shows no consistent findings, but it does show that preparing and managing this form of learning requires significant additional time resources and advanced technical skills. A closer look at the present situation in everyday schooling suggests that it is not realistic to expect critical thinking to be promoted by collaborative learning and related media, because the necessary effort in time, preparation, etc. for teachers significantly exceeds the expected learning effects for students.

New Media With Direct Instructional Functions: Individual Learning and Critical Thinking

There is some evidence within the reported studies that critical thinking can only be promoted successfully by new media when the media deliver at least some teaching functions. This is also an important condition for releasing teachers from some of their duties in daily instruction, which underlines the practical relevance of new media for promoting critical thinking. New media with direct instructional functions can be divided into computer simulations and into drill- or tutorial-based logic software.

Computer Simulations

Although there are some learning and thinking tasks relevant for students within computer simulations (e.g., testing hypotheses) which are closely related to skills in critical thinking, only very few studies dealing with critical thinking and computer simulations can be found. De Jong and van Joolingen (1998) outlined in their comprehensive review on computer simulations and learning effects that some general personal and situational conditions must be realized in order to successfully promote higher-order thinking skills. According to these authors, students should possess skills in hypothesis testing and in finding relevant knowledge. Also, students should have access to knowledge bases relevant to the subject matter of the simulation. Finally, students should have system guidance when exploring relationships between important variables relevant for modeling the subject matter of the simulation. Such guidance can be given by certain learning tasks and by overviews of relevant variables and their relationships to each other within the computer simulation. Only Rivers and Vockell (1987), and in confirmation Faryniarz and Lockwood (1992), found that a computer simulation in the field of biology and ecology positively influenced critical thinking (measured with a standardized test), but only when “guided discovery” was realized. Within these two studies, critical thinking was not the main focus of research, but was seen as one component of several higher-order thinking skills. Gokhale (1996) likewise did not test the influence of a computer simulation on all relevant aspects of critical thinking, but found that an instructionally well-equipped computer simulation in the field of electronics increased some skills relevant for critical thinking, like analyzing, synthesizing, and evaluating, while it decreased lower-order thinking skills, like remembering, understanding, and applying.
Additionally, Yeh and Strang (1997) delivered only restricted evidence for the effectiveness of computer simulations in promoting critical thinking. They found that young teachers had better skills in teaching (not learning) critical thinking after using a computer simulation modeling daily classroom problems.

Although there is no research within this field testing the effects of computer simulations on all critical thinking skills in a theoretically founded and comprehensive way, there are some indications that learning within computer simulations is closely related to information exploration and evaluation skills which are also relevant for critical thinking. However, it must be made clear that these common skills are mainly related to inductive reasoning, which excludes many other critical thinking skills, like identifying and analyzing arguments, or considering external influences on knowledge evaluation.

Logic Software

Van Gelder (2000b) tested the influence of the software training program “Reason!” (URL http://www.goreason.com) on “informal reasoning”. “Reason!” offers an instructional environment for self-regulated learning, without any human tutor assistance. It can be used for long-term training in critical thinking and realizes a learning context embedded within everyday problems. The program graphically illustrates the subject matter, includes several control mechanisms (e.g., for building and evaluating argument chains), and delivers constructive feedback to the students. Two pre-posttest studies showed medium effect sizes (0.41 and 0.51) when “Reason!” was compared with traditional classroom instruction in promoting critical thinking. Van Gelder (2000a) also reported a (not yet published) study about the successful use of another logic software package called “Lemur” (URL http://www.wwnorton.com/lemur). This software offers many learning tasks with solutions, quizzes, and structured overviews of the subject matter. However, this drill-based program can only be used together with an accompanying book by LeBlanc (1998) and related teacher-centered instruction.
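The reported effect sizes (0.41 and 0.51) can be read as standardized mean differences between pre- and posttest scores. A minimal sketch of how such a pre-posttest effect size is typically computed follows; the function name and the score values are hypothetical illustrations, not data from van Gelder's studies:

```python
import math

def pre_post_effect_size(pre_scores, post_scores):
    """Cohen's d for a pre-posttest design: the gain in mean score
    divided by the pooled (sample) standard deviation of both occasions."""
    def mean(xs):
        return sum(xs) / len(xs)
    def sd(xs):
        m = mean(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    pooled = math.sqrt((sd(pre_scores) ** 2 + sd(post_scores) ** 2) / 2)
    return (mean(post_scores) - mean(pre_scores)) / pooled

# Hypothetical critical-thinking test scores for six students,
# before and after training.
pre = [48, 52, 50, 47, 53, 50]
post = [49, 53, 51, 48, 54, 51]
print(f"d = {pre_post_effect_size(pre, post):.2f}")  # -> d = 0.44
```

An effect size in this range (around 0.4 to 0.5) is conventionally classed as a medium effect, which is what makes the “Reason!” results noteworthy for a self-regulated software environment.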

Similar positive results were also reported by Stenning, Cox, and Oberlander (1995) with respect to the logic software “Hyperproof” and by van der Pal and Eysing (1999) with respect to “Tarski’s World”. With this kind of software, however, skills in formal logic are trained which are not closely related to critical thinking, because they do not address everyday language but mainly mathematically formalized symbols (see an overview of formal logic software for instructional purposes at URL http://tcw2.ppsw.rug.nl/~hans/logiccourseware.html). None of the formal logic software meets the criteria of the critical thinking teaching models by Maiorana (1992) and Halpern (1998), because the thinking skills are not embedded in everyday language and contexts. This matters even more when considering the result of a study by Cheng, Holyoak, Nisbett, and Oliver (1986) showing that training in formal logic does not have a positive influence on everyday reasoning, because in practical contexts skills in “pragmatic” or “informal” reasoning are more relevant.

In contrast to all other ways of using new media for promoting critical thinking, studies on new media with direct instructional functions, to some degree computer simulations but especially logic software, showed at least some effectiveness for promoting critical thinking. But this preliminary result can be nothing more than a first basis for further research activities.

A Framework for Further Research Activities on New Media and Critical Thinking

Of course, future research will require more experimentally and quasi-experimentally controlled studies in order to increase scientific quality in the field of new media and critical thinking. There are some indications that new media are effective for promoting critical thinking because they are more focused (in the selection and presentation of contents and skill trainings), they are more concrete (in using specific tasks for learning), and they can deliver learning-relevant feedback more often than traditional classroom instruction. These general assumptions have to be tested in the laboratory and in educational practice, but they must also be elaborated by relating them to major issues in recent instructional design research, and should finally result in the development of a virtual critical thinking school. The following aspects seem to be important for future research on new media and critical thinking.

Linear Combined With Open Learning Environments in Which Cognitive Load is Considered

One important conclusion from research on new media and critical thinking is that “linear” program elements, like drills or tutorials, should be combined with elements of “open learning environments”. “Linear” program elements can train specific basic skills in critical thinking with a step-by-step procedure enriched by tasks and feedback. In addition, advanced skills must be acquired: during critical thinking, content-relevant contexts have to be examined, static and dynamic information resources have to be analyzed, tools for information processing, gathering, creating, etc. have to be used, and learning support must be realized. Such tasks can be accomplished within “open learning environments (OLEs)” as conceptualized by Hannafin, Land, and Oliver (1999). Such OLEs offer students tasks in given contexts (e.g., case studies in ecology), additional resources (e.g., databases), tools for searching information networks or making notes, and learning support devices with conceptual, meta-cognitive, procedural, and strategic hints. The necessity of open learning elements for teaching critical thinking is confirmed by the above-mentioned findings on collaborative learning environments, because collaborative learning activities usually open up the learning situation. The call for using open learning environments for teaching critical thinking is not new, but it has not yet been realized comprehensively within computer-based learning environments for promoting critical thinking (Mancall, Aaron, & Walker, 1986).

There is research from Arburn and Bethel (1999) indicating that features of open learning environments can successfully influence critical thinking; however, that research was embedded within a traditional learning environment. The authors identified increased scores on the California Critical Thinking Test after applying a teaching method which stimulated students to ask questions about the subject matter. Such an effective teaching method can be part of the learning support component of an OLE. Muilenburg and Berge (2000) developed an approach which distinguishes between different types of questions relevant for critical thinking and which includes other elements of OLEs, like case studies, discussion rooms for controversial issues, Internet links combined with an evaluation device, and role plays.

OLEs increase, as a rule, the complexity of learning and therefore demand additional cognitive resources for navigating, etc., resulting in reduced cognitive capacity for subject matter learning and critical thinking. In order to cope with this important problem, the following theoretical approaches should be considered when designing learning environments for promoting critical thinking: a) the cognitive load theory of Sweller (1994) and related research (e.g., Cooper, 1998); b) the “minimalist instruction” approach of Carroll (1998); and c) the cognitive theory of learning with multimedia of Moreno and Mayer (2000). Important principles from these approaches need not be implemented systematically and in detail within the learning environment, but should act as general guidelines for instructional design. In this respect, the approach of Sweller (1994) specifies instructional elements of texts and the learner-system interaction; the approach of Carroll (1998) is helpful when handling learning mistakes; and the approach of Moreno and Mayer (2000) delivers guidelines for implementing multimedia elements.

Motivational-Adaptive Instruction

Shapley (2000) found dropout rates of about 30 percent within an online course on critical thinking. As main specific reason for this finding, it is assumed that critical thinking represents a higher order thinking skill with high task difficulty and a high probability of failure what reduces the motivation to learn. High drop out rates within web-based instruction represented a major general problem and were also found within other less complex subject areas resulting from shortcomings within the social-emotional contexts of learning (Astleitner, 2000b). In order to handle the specific and the general problem, the ARCS-model for motivational design of instruction from Keller (1983, 1999) can be used. Chyung, Winiecki, and Fenner (1998) presented a study in which the instructor of a Internet-based distance education course constantly evaluated and redesigned the existing instruction to be more attractive to new distance education students, to be more relevant to their professional interests, and to increase individual students’ confidence levels in learning as well as their satisfaction levels toward the instruction. The redesign of instruction was based on the ARCS-model and consisted, for example, of the following instructional strategies: to break down the instruction into small weekly modules and help students to master one module at a time, to provide students with clear criteria of expected performance levels such as regular participation in class discussions, or to deliver instruction via multiple media such as slide shows, the WWW, and electronic bulletin boards. A year after the interventions were implemented, the dropout rate was cut in half, from 44 to 22 percent. Visser (1998) used a simplified version of the ARCS-model to influence the motivation of distance learners by focusing on the student support system rather than on instructional processes. She implemented a program of “motivational messages” which were sent to students. 
These messages took the form of greeting cards, which included messages of encouragement, reminders, or advice. The design of the messages was based on predictions concerning students’ pre-course and midterm attitudes toward instruction, students’ reactions to the course content, and characteristics of students’ support during the course. These predictions and related messages differed according to the four A(ttention), R(elevance), C(onfidence), S(atisfaction) parameters. For example, for gaining attention, unexpected communication to students from time to time was used as a motivational tactic. For increasing relevance, messages had to link feedback to the learner’s work and daily circumstances. For enhancing confidence, messages were designed to make students feel part of a group in which everyone is struggling to get it done. For achieving satisfaction, turn-around time for assignments was kept short. Both improved retention rates and student comments offered clear support for the motivational messages.

Testing the motivation of students can be done by human tutors, but also by computer through the realization of “motivationally adaptive instruction”, which supports open and self-regulated learning to a higher degree (see an approach from Song & Keller, 2001). Other research based on the ARCS-model in computer-based, but not web-based, self-regulated learning environments can be found in Means, Jonassen, and Dwyer (1997) and Shellnut, Knowlton, and Savage (1999).

A Virtual Critical Thinking School

The combination of a linear and an open, cognitive-load-reflecting learning environment, together with motivationally adaptive instruction, forms the main pillars of a “virtual critical thinking school” to be developed in the near future (see a prototype in German at URL http://www.sbg.ac.at/erz/aaakurs/krit_home.htm).

There is already some research indicating that especially these components could be successful when included within a learning environment for promoting critical thinking. Stoney and Oliver (1999) found, within a comparable environment, significant indicators of critical thinking within students’ learning activities. However, their research was not based on fully developed open and motivational components, had many human-regulated parts, and did not realize a systematic test of the relevant components. Raghavan, Sartoris, and Glaser (1998) had similar problems in their MARS (Model Assisted Reasoning in Science) project, representing an OLE for stimulating critical thinking. In this project, many components of traditional instruction (e.g., group discussions) were integrated, which makes it impossible to identify the contribution of the web-based components. Finally, Angeli (1999) also implemented an open learning environment for promoting critical thinking together with linear content structures and learning feedback. However, this research is not conclusive for the virtual critical thinking school, because it was exclusively realized within a non-web-based environment.

The Influence of Web-Lectures on Critical Thinking

As many traditional programs for critical thinking are not effective in daily instruction, even at the college level (e.g., McMillan, 1987), it might help to use the web and related technologies to educate critical thinkers. However, research on critical thinking showed that using the web as content or as a tool did not affect critical thinking significantly (e.g., Scarce, 1997). Similar or inconclusive results were found when collaboration-supporting elements of the web (e.g., discussion forums) were used for generating critical thinking environments (e.g., Sloffer, Dueber, & Duffy, 1999). However, positive influences on critical thinking from off-line computer-based, but not on-line web-based, scenarios were found when “linear program elements”, like drills or tutorials, were combined with elements of “open learning environments”, like coaching or additional learning materials (e.g., Stenning, Cox, & Oberlander, 1995; Van der Pal & Eysing, 1999; Van Gelder, 2000a).

In web-based scenarios, “web-lectures” represent such learning environments: They combine linear sequences of speech recordings, overviews, and presentations with the possibility of navigating freely through the material. Web-lectures consist, as a rule, of audio or video recordings, which can be combined with synchronous (or asynchronous) (MS-Powerpoint) slides (see Figure 1). Such slides contain information based on the recordings in text or graphical form. The slides presented as learning guidance can be seen as “synchronous organizers”, which represent a combination of “advance” and “post organizers”. They are “advance organizers” when learners use them for preparing learning at the beginning of listening to the recordings, or they are “post organizers” when learners use them to integrate, compare, etc. elements after listening to the recordings. Web-lectures represent a promising way of teaching and learning in the near future, because they are (a) simple to design and use (designers and users do not need sophisticated skills or tools other than a suitable multimedia PC, a web connection, and the addition of streaming media to web pages), and (b) flexible (streamed lectures are available anytime and across the Internet, promoting flexible delivery and broad participation).


Although there are many research studies dealing with learning from audio-video recordings (e.g., Cennamo, 1993), there is little well-founded research empirically proving the effects of such web-lectures. For example, Latchman and Kim (1999), Murphy, Dooley, Wickersham, and Parlin (1999), and also Wheeler (2000) discussed technological, methodological, and psychological factors for the successful use of web-delivered streaming lectures in distance learning settings based on some qualitative assessments, but did not deliver any quantitative data about the effectiveness of different types of web-lectures.

Experiment 1

As web-lectures can offer different learning environments to students, it is an interesting question which of these environments best supports the learning of critical thinking. In order to answer this research question, four different types of web-lecture environments must be considered: an audio recording of a lecture with (1) and without (2) synchronous organizers, and a video recording of a lecture with (3) and without (4) synchronous organizers. The effects of these environments on critical thinking skills were tested with subjective evaluations, retention, and transfer as indicators of learning.

Subjective evaluations were defined as the learners’ assumptions about the quality of the learning process and the learning result. Retention was defined as the learner’s ability to remember the learning contents and to solve simple classification problems which were closely related to these contents. Transfer was defined as the learner’s ability to solve problems which made it necessary to apply the learning content within new and complex task contexts.

Subjective evaluations were considered because web-lectures realize self-regulated learning in which the learners themselves control the learning activities, and controlling the learning process is mainly based on evaluations of the learning results (e.g., Schunk & Ertmer, 1999). Retention and transfer are typical dependent measures in similar studies (e.g., Moreno & Mayer, 1999).

With respect to subjective evaluations of the learning process and the learning results, it was expected that learners prefer video over audio, because earlier multimedia studies found that learners like to learn via video more than via other media (Tang & Isaacs, 1993). It was further expected that learners like to have some learning guidance, i.e., the web-lectures with synchronous organizers. Learning guidance with synchronous organizers might reduce problems in selecting, summarizing, or structuring the content presented within the audio or video recordings. It might give learners a feeling of security, which should influence subjective evaluations.

Retention, too, should be influenced differently by the different types of web-lectures. First, it was expected that retention could be improved when video recordings rather than audio recordings were used. Duquette and Painchaud (1996) found that learners listening to a video-taped dialogue learned more unfamiliar words than learners listening to an audio-taped dialogue. Similar results were found by Al-Seghayer (2001). Among the suggested factors that explain such a result are that video better builds a mental image, better creates curiosity leading to increased concentration, and embodies an advantageous combination of modalities (vivid or dynamic image and sound). Especially the capability of creating curiosity and concentration is important for retention, which is based on the (possibly boring) re-learning of definitions, keywords, etc. Second, it was expected that the use of slides (as synchronous organizers) should improve retention because (together with the audio or video recording) a rich learning environment is established (with sound, text, tables, pictures, etc.), which should stimulate multiple coding of the learning content and therefore retention (e.g., Chun & Plass, 1997; Robinson, Katayama, & DuBois, 1998).

With respect to transfer, it was postulated that the audio group would outperform the video group. Critical thinking, and especially realizing transfer in critical thinking, represents a higher-order thinking skill which needs a great amount of cognitive resources. As video consists of audio information and a dynamic picture, it puts more cognitive load on the learner than an audio recording (without a dynamic picture). Audio information makes it possible to look at notes or slides without disturbance. However, if a learner looks at the video picture, then attention is subtracted from notes or slides and related learning activities. So, there is a “split-attention effect” within the video condition, which should increase cognitive load and reduce learning transfer (Moreno & Mayer, 2000). Also, the group with synchronous organizers was expected to show better learning transfer than the group without synchronous organizers. These slides have many attributes of transfer-stimulating “organizers” (e.g., they are composed of a short set of verbal or visual information which goes beyond or condenses the to-be-learned content), and they also contain a certain level of redundancy (e.g., content from the to-be-learned information), so that sufficient processing requirements for realizing learning transfer are available (Plass, Chun, Mayer, & Leutner, 1998).

“Action control” was measured as a co-variable and is defined as a “motivational process which protects current intentions from being disturbed and substituted by other intentions” (Rheinberg, 1995, p. 163). Action control as a personality characteristic was considered in this experiment because learning in web-lectures can result in cognitive overload or distraction, and action control is closely related to these phenomena. For example, action-orientated persons, in comparison with state-orientated persons, try to process information in a more parsimonious way, so that cognitive overload is reduced, which should improve learning. Also, “activity level” was measured as a co-variable, representing the amount of (external) encoding of information throughout the learning process. Encoding should influence learning, because during note taking, information is elaborated, integrated, compared, etc. It was expected that both variables influence each of the dependent variables because they are active over the entire learning phase.


Participants, Design, and Procedures

Seventy (70) undergraduate students from the University of Erfurt participated in this study. All students were paid for their participation. The sample, drawn from different study programs in the humanities and social sciences, included 54 women and 16 men with an average age of 22 years. Participants were assigned randomly to the experimental conditions.

The experiment was based on a covariance-analytic 2×2 factorial design. One experimental factor concerned the modality of the lecture recordings (audio vs. video) and the other the availability of synchronous organizers (with vs. without slides). Students were not familiar with the content of the learning materials. As co-variables, action control and activity level were measured.

Participants had to learn for 60 minutes with one of the four types of web-lectures. Before this learning phase, participants had to complete a test measuring the co-variables, which took about 10 minutes. They also received a short introduction, of about 3 minutes, to handling the software for playing the web-lectures (with RealPlayer). The students were instructed and able to start, stop, pause, and navigate the video/audio stream by using different control buttons and the table/overview of slides. Students were encouraged to take notes during the learning phase, but were also instructed that these notes could not be used during the test phase. After the learning phase, students took the different learning tests. Overall, the experiment lasted about 100 minutes. After completing all the tests and returning the notes, participants were paid.

Material and Apparatus

For each participant, the paper-and-pencil materials consisted of a questionnaire, a self-evaluation scale, a retention test, and a transfer test. The questionnaire solicited information concerning the participant’s education, study achievements, gender, and Internet knowledge. Internet knowledge was measured by using a 10-item self-rating. On the self-rating, students were asked to indicate their Internet activities (emailing, chatting, etc.) on a 4-point scale ranging from often (4) to never (1). Overall, the students in all groups had good study achievements and high Internet knowledge. There were no significant correlations of gender, age, average study achievements, or Internet knowledge with the dependent variables (-0.22 < r < 0.16, 0.06 < p < 0.98).

The questionnaire also contained a short version of the decision-focused subscale of the action control scale, consisting of six items (Kuhl, 1985, p. 125). Reliability analysis showed an acceptable consistency of this scale (Cronbach’s Alpha = 0.65). Also considered as a co-variable was the amount of notes taken by the learner, measured in written pages. The self-evaluation scale consisted of three items measuring the participant’s subjective evaluation of the learning process and results. The following items were used: (1) “I liked [I did not like] the way I learned”. (2) “In the future, I would like to learn more often in this way [yes, no]”. (3) “This experience showed to me that it is possible to learn successfully with the Internet [yes, no]”. Reliability analysis showed a high consistency of this scale (Cronbach’s Alpha = 0.89), with a high value indicating a (more) negative self-evaluation.

The retention test consisted of eight items concerning the content of the learning phase, i.e., “non-formal errors in arguments”. The low consistency coefficient (Cronbach’s Alpha = 0.49) showed the heterogeneity of the test items. Here are three examples of the test items: (1) “Alcohol causes liver damage. The explanation is that alcohol contains liver-damaging substances. Irrespective of whether this statement is right or wrong: Is this a vicious circle? [yes, no]”. (2) “A ‘domino error’ is given when [a] somebody assumes a questionable circumstance which leads through several steps to an inevitable conclusion; [b] an argument is criticized by referring to a questionable circumstance; [c] somebody estimates a statement as being true, because somebody else has proven that the negation of the statement is true; [d] somebody attacks a weak argument concerning a circumstance for which there are strong arguments available”. (3) “Only because our children are not reading the bible, they are addicted to drugs. Yes, parents are guilty, because they do not stimulate their children to study the bible, said a preacher. Which type of non-formal error is given? [error of authority, error of ignorance, error of loaded statement]”.

The transfer test consisted of 10 items from the California Critical Thinking Test (Facione & Facione, 1992). The items were randomly selected and showed an acceptable consistency (Cronbach’s Alpha = 0.59).
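The internal-consistency coefficients reported for these scales (Cronbach’s Alpha) can be computed directly from a respondents-by-items score matrix. The following Python sketch illustrates the standard formula; the ratings matrix is hypothetical example data, not the study’s actual responses:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's Alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the sum scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings of 5 respondents on 3 items (illustration only)
ratings = np.array([[4, 3, 4],
                    [2, 2, 3],
                    [3, 3, 3],
                    [1, 2, 2],
                    [4, 4, 4]])
print(round(cronbach_alpha(ratings), 2))
```

Perfectly correlated items yield an Alpha of 1; the more the item variances dominate the variance of the sum score, the lower the coefficient, which is why heterogeneous item sets such as the retention test produce low values.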

The web-lecture was produced using the RealPresenter software and included an audio or video stream of a spoken text together with synchronized MS-Powerpoint slides and a table of the slides. The resulting SMIL files were implemented on a RealServer and were played with RealPlayer Basic (see URL http://www.microsoft.com; http://www.realnetworks.com).
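To illustrate the delivery format, a SMIL presentation of this kind plays the audio stream and the slide stream in parallel. The following fragment is a simplified, hypothetical example following SMIL 1.0 conventions (file names and region dimensions are invented), not the file actually generated by RealPresenter:

```xml
<smil>
  <head>
    <layout>
      <root-layout width="640" height="480"/>
      <!-- region in which the synchronized slides are rendered -->
      <region id="slides" left="0" top="0" width="640" height="480"/>
    </layout>
  </head>
  <body>
    <!-- play the lecture audio and the slide stream in parallel -->
    <par>
      <audio src="lecture-audio.rm"/>
      <ref src="lecture-slides.rp" region="slides"/>
    </par>
  </body>
</smil>
```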


As subjective evaluations were not correlated significantly with the two other dependent variables (0.04 > r > -0.10), analyses and results were divided into two parts.

Effects on subjective evaluations. A univariate analysis of covariance (ANCOVA) was conducted for subjective evaluations with the factors modality (audio vs. video) and synchronous organizers (with vs. without), together with action control and activity level as co-variables. Table 2 shows means and standard deviations for each combination of the four experimental groups.

As can be seen, both groups without synchronous organizers had a less positive subjective evaluation of the learning experience than both groups with synchronous organizers. The ANCOVA revealed a significant main effect of synchronous organizers (F(1,61) = 7.025, p < .010, R² = 0.10). There were no statistically significant differences between the two modalities (F(1,61) = 0.469, ns) and no significant interaction effect (F(1,61) = 0.060, ns). Both co-variables were found to have a significant influence on subjective evaluations (action control: F(1,61) = 4.553, p < 0.05, R² = 0.07; activity level: F(1,61) = 5.055, p < 0.05, R² = 0.08). Action-orientated persons and persons with a high activity level had more positive evaluations than state-orientated and less active persons (r = 0.24, p < 0.05 for action control; r = 0.22, p < 0.10 for activity level).

Effects on retention and transfer. As no statistically significant influences of the co-variables on the dependent variables were found (0.18 > r > -0.06), and as retention was correlated with transfer (r = 0.28, p < 0.05), a multivariate ANOVA based on General Linear Modeling (GLM) was computed. As can be seen in Table 2, retention was statistically significantly influenced neither by modality (F(1,66) = 1.238, ns) nor by synchronous organizers (F(1,66) = 0.339, ns). Results also showed that synchronous organizers did not influence transfer (F(1,66) = 0.206, ns). However, the audio group outperformed the video group significantly (F(1,66) = 8.490, p < 0.001, R² = 0.11). For both dependent variables, there was no significant interaction effect (F(1,66) = 1.049, ns; F(1,66) = 0.006, ns).


The results support only some of the hypotheses. With respect to subjective evaluations, there is evidence that learners prefer, as hypothesized, synchronous organizers, but not, as expected, video. Students had more positive evaluations of learning when they were confronted with the audio recordings, at least as a tendency that was not statistically significant. An explanation of this result might be that the video picture produced in such web-lectures is very small and not sharp, and is sometimes displayed with delays due to bandwidth problems during Internet transmission.

This problem with the video picture might also be the reason why there was no modality effect on retention, although, as a tendency, the video group showed better results than the audio group. Bad video pictures are not motivating. Also, synchronous organizers did not influence retention significantly, but showed a tendency in the expected direction. This result can be explained by the design of the synchronous organizers. They included some redundancies in relation to the to-be-learned information, but they also included some information (comparisons, etc.) that should stimulate transfer. So, the synchronous organizers were designed to support both retention and transfer, which produced no significant effects on either dependent variable, because specifically designed features were lacking.

The missing effect of synchronous organizers on transfer is due to the fact that the organizers were designed as a compromise between supporting retention (with old information) and transfer (with new information). In future studies, it seems necessary to distinguish clearly between these two different ways of designing organizers. With respect to modality, the audio group outperformed the video group, as expected according to the split-attention hypothesis. This result is to some degree surprising, because it could be expected that learners within the video group did not look at the video very often because of its poor quality.

That action control only influenced the subjective evaluations, and not retention and transfer, is also surprising. The results concerning action control might be related to the availability of note taking during the learning process. Using notes may stop action control (e.g., a parsimonious processing of information) or make it unnecessary, so that action control does not affect learning. Another reason might be that action control only becomes relevant for learning when it is activated by instructional features (e.g., hints to control attention) of the learning environments. Such features were not included in the web-lectures used in this study.

In respect to missing effects of the activity level, it would help to analyze not only the quantity of notes, but also their quality (e.g., number of concepts or inferences). The quality of notes should have a closer relationship to learning than the quantity (Zwicker, 1989).

Overall, it seems that the audio condition with synchronous organizers should guarantee positive self-evaluations and improved learning transfer. However, there are no data available on how this favorable type of web-lecture influences critical thinking in comparison with traditional instruction. In order to improve the research situation, experiment 2 was conducted. Experiment 2 was not designed to test open questions resulting from experiment 1; rather, its purpose was to compare a certain type of web-lecture, which was found in experiment 1 to influence learning positively, with traditional instruction. As traditional instruction, the confrontation with learning texts was selected, because learning with texts is also a highly self-regulated activity, like learning with web-lectures. As action control and activity level had shown no effects on retention and transfer in experiment 1, they were excluded in experiment 2. As subjective evaluations were not correlated with retention and transfer in experiment 1, they were not considered within experiment 2. As no significant effects on retention were found in experiment 1, retention was seen as a critical variable. In order to examine possible missing or negative effects of web-lectures on retention, retention was selected as the dependent variable in experiment 2.

Experiment 2

Recent studies in media comparison research showed that no statistically significant differences in learning effects were found when comparing web-based with traditional instruction (e.g., Smeaton & Keogh, 1999). This result was found for different types of learning outcomes (e.g., declarative and procedural knowledge acquisition). With respect to web-lectures, Ingebritsen and Flickinger (1999) compared the effects of audio streams of lectures together with computer-mediated communication facilities (like email, chat, and a discussion forum) with traditional lectures and found no statistically significant effects on the learning results of several hundred high school and college students. Similar results were found by LaRose, Gregg, and Eastin (1998) with respect to total test scores and also students’ attitudes when comparing, within a controlled experiment, audio streams with live lectures. Shim (2000) found that audio web-lectures increased the understanding of the subject, but, in the view of graduate university students, should not replace traditional class lectures. Based on the given research, it was expected that the two conditions (audio lecture with synchronous organizers vs. traditional text-based learning) do not have different effects on retention of critical thinking.



Twenty-three (23) undergraduates at the University of Erfurt participated in this experiment for credit as part of a course on the quality of web-based instruction included within a general study program (Studium Fundamentale). The sample included 17 women and 6 men. The average age of students was 21 years. Participants were assigned randomly to conditions.


One group (n = 14) received an eight-page paper text about results and problems of research on web-based instruction. The other group (n = 9) received the same content, but included within a web-lecture (consisting of audio streams with synchronous organizers) with a duration of about 37 minutes.

In both groups, students were instructed to learn as much of the material as they could within a 60-minute period. All students were encouraged to take notes during learning, but were not allowed to use them during the examination. Students were not familiar with the content of the material, although they had been confronted with basic concepts in former course sessions. Within the examination, all students were presented 13 multiple-choice questions about the material. Twelve of the 13 questions ranged in their difficulty between 20 and 80 percent and were used for computing the overall test results. The average difficulty of the questions was 0.48. As the questions concerned different aspects of critical thinking, test homogeneity was low (Cronbach’s Alpha = 0.30).

The test contained, for example, the following two questions and answer alternatives: (1) “Learning in web-based courses is as effective as traditional courses. [a] This statement is not true, because the two learning environments cannot be compared adequately. [b] This statement is not true, because in some studies web-based courses outperform traditional courses. [c] This statement shows the average results of many studies. [d] This statement is not true, because there are some studies showing that traditional courses outperform web-based courses”. (2) “Personality characteristics contribute little in explaining the learning effects found in web-based courses, because [a] the shown learning behavior is more important; [b] recent studies are not founded theoretically; [c] emotional and motivational personality characteristics are overloaded; and [d] the web is a ‘cold technology’ without the possibility to adapt to personal needs”.

Overall, within this test, students needed to question research findings with respect to causality, statistical reasoning, or representativeness. This shows that mainly one aspect of critical thinking was tested, namely scientific analytic reasoning. The material was designed to stimulate this kind of critical thinking and also showed the learner how to criticize research, but the material was not a systematic and comprehensive instructional program concerning scientific analytic reasoning.


As the number of students in the groups was small and unequal, a non-parametric test (Mann-Whitney U-test) was computed for comparing test results. Results showed that the two groups did not differ significantly in their retention test achievements (Z = -0.74, p < 0.46). Descriptive statistics showed that the web-lecture group slightly, though not statistically significantly, outperformed the paper group (see Table 3).
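A group comparison of this form can be reproduced with SciPy’s implementation of the Mann-Whitney U-test. The scores below are hypothetical placeholders standing in for the two groups’ retention results, not the study’s data:

```python
from scipy.stats import mannwhitneyu

# Hypothetical retention test scores (illustration only):
# 14 learners in the paper-text group, 9 in the web-lecture group
paper_group = [5, 7, 6, 8, 5, 6, 7, 4, 6, 5, 7, 6, 8, 5]
weblecture_group = [6, 7, 5, 8, 6, 7, 6, 5, 7]

# Two-sided test: do the two samples come from the same distribution?
u_stat, p_value = mannwhitneyu(paper_group, weblecture_group,
                               alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.2f}")
```

Because the test ranks all observations jointly rather than comparing means, it is robust to the small and unequal group sizes that rule out a parametric t-test here.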

Also, an analysis of all single test questions showed that in no case did one group do better than the other (Z-values ranged from -1.42 to -0.04, p-values from 0.16 to 0.97). There were also no significant influences of gender (F(1,11) = 1.49, p < 0.25), age (F(1,11) = 0.18, p < 0.68), or average study achievements (F(1,11) = 0.12, p < 0.74) on test results.


As an explanation of the results found within experiment 2, it can be stated that learning with instructional web-lectures can produce learning effects equal to those of learning with paper texts. It might be the case that high experience in learning with text material can be compensated for by instructional elements (spoken text, MS-Powerpoint slides, table) found in the web-lecture. Therefore, it is more accurate to say that not two different media (paper, web-lecture) were compared, but two different instructional packages. One package consisted of learners who were confronted with a paper text; these learners were highly experienced in learning with such material in general, because they were university students and possessed effective strategies for learning with texts. The other package consisted of less experienced learners, because they were not familiar with Internet-based audio streams. However, the second instructional package included additional instructional elements not found within the paper text (e.g., tables of content, or organizers). In that sense, it can be seen as a success that the less experienced web-lecture group achieved results equal to those of the well-experienced text-learning group. However, it is an open question for future research whether well-experienced web-lecture learners can outperform well-experienced text learners.

General Discussions: Cognitive and Motivational Principles of Multimedia Learning

It has to be stated that experiment 1 and experiment 2 suffer from some problems (i.e., the small number of subjects, especially in experiment 2; only few standardized tests were used; the short learning time might produce novelty effects; and more women than men were included within the samples), which have to be solved in future studies. But the main problem is a theoretical one. Some of the hypotheses were founded on findings from other empirical studies, but not on a comprehensive theory of learning or critical thinking with web-lectures.

To some extent, web-lectures realize a multimedia environment, in which audio or video information is combined with text, figures, tables, etc. For such environments, Mayer (2001) presented a comprehensive theory about cognitive principles of multimedia learning based on the following assumptions: (a) working memory includes auditory and visual components; (b) each working memory has limited capacity resulting in the problem of cognitive overload; (c) humans have separate systems for verbal and non-verbal information (and they are dual-coders); and (d) learning occurs when information is selected and organized in each store and connections between stores are given. From these assumptions different principles are deduced, which have some general relevance for web-lectures: (a) split-attention principle (students learn better, when they do not split their attention between multiple sources of information); (b) modality principle (students learn better, when verbal information is presented auditorily rather than visually as on-screen text); (c) spatial contiguity principle (students learn better when on-screen text and visual materials are physically integrated rather than separated); or (d) temporal contiguity principle (students learn better, when verbal and visual materials are temporally synchronized rather than separated in time). 
However, this line of research and theory is not fully relevant for web-lectures, because there are some differences between web-lectures and multimedia learning environments: (a) there are, as a rule, no animations of content-related processes in web-lectures, but animations were the most important learning contexts for the research from Mayer (2001); (b) there are content tables permanently available and email links integrated in web-lectures, which are seldom features of multimedia learning environments; (c) within web-lectures, contents are presented in a linear manner, whereas contents within multimedia learning environments can be presented in a non-linear manner; (d) there are many other variables influencing effort and learning in addition to words and pictures (e.g., the learners’ prior knowledge, the importance of the information, the style of writing, embedded questions, text syntax, or the presence of objectives); and (e) some of Mayer’s (2001) assumptions are closely related to language learning, which is relatively irrelevant for critical thinking.

Another problem with Mayer's (2001) approach is that multimedia learning theory does not consider motivational and emotional aspects. Some elements of a multimedia or web-lecture environment can also have a non-cognitive quality. Although prominent researchers postulated in the early eighties that mental effort or cognitive load has motivational as well as cognitive components (e.g., Salomon, 1983), this perspective has not yet been considered sufficiently within multimedia research. For example, video information is evaluated as having a greater motivational value than audio information because it integrates appealing dynamic pictures, colors, etc. Synchronous organizers, too, have a motivational and emotional quality, because they give learning support and thereby help to reduce fear of failure. Some researchers go so far as to say that the amount of cognitive effort or cognitive load expended is an appropriate index of motivation, as it relies on the learner focusing on mastering the learning task and maintaining a high sense of personal efficacy (Stoney & Oliver, 1999).

Approaches from Keller (1997), Lee and Boling (1999), and Astleitner and Leutner (2000) deal with the motivational/emotional value of multimedia instructional elements. These elements are important because (a) motivation and emotion significantly influence learning; (b) motivational and emotional processes require memory resources and therefore increase or decrease cognitive load, and especially action control; and (c) there is a more or less direct connection between cognitive and motivational/emotional variables: attention, in particular, represents an important element both for cognitively and for motivationally driven models of learning and memory usage.

A possible solution to these problems might be to find and use a theory which integrates cognitive and motivational/emotional aspects of memory usage and learning. Possibly relevant approaches with a close relationship to Mayer's (2001) basic assumptions concerning multimedia learning are Kuhl's (1985) action-control theory, the self-regulation theory of Zimmerman (1998), and the self-management theory of Corno and Randi (1999). Based on such theories, web-based lectures have to be designed, tested, and optimized, especially with respect to different personality characteristics and individual needs.

Table 1

An Empirical Taxonomy of Critical Thinking (Dick, 1991, p. 84)

Identifying arguments             Themes, conclusions, reasons, organization
Analyzing arguments               Assumptions, vagueness, omissions
Considering external influences   Values, authority, emotional language
Scientific analytic reasoning     Causality, statistical reasoning, representativity
Reasoning and logic               Analogy, deduction, induction

Table 2

Effects of Modality and Synchronous Organizers on Subjective Evaluations, Retention, and Transfer of Critical Thinking in Experiment 1

                              Audio                           Video
Measures      Effects         A (n = 16)     B (n = 19)      C (n = 15)     D (n = 20)
                              Mean   Stddev  Mean   Stddev   Mean   Stddev  Mean   Stddev
Evaluations   A+C vs. B+D     3.41*  1.06    4.10*  1.16     3.53*  1.12    4.38*  1.46
Retention     ns              4.50   1.93    3.90   1.60     4.53   1.69    4.70   1.03
Transfer      A+B vs. C+D     4.38   2.13    4.21   1.87     3.13   1.06    2.90   1.94

Note. A = audio with organizers; B = audio without organizers; C = video with organizers; D = video without organizers. * Adjusted means for co-variables (action control and activity level).

Table 3

Effect of Media Type on Critical Thinking in Experiment 2

Group                 n     Mean   Stddev   Sum of Ranks
Paper                14     5.07   2.02     156.50
Audio-Web-Lecture     9     5.78   1.64     119.50
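The "Sum of Ranks" column in Table 3 corresponds to a rank-based comparison such as the Mann-Whitney U test: over both groups the rank sums must total N(N + 1)/2, here 23 x 24 / 2 = 276, which the reported values (156.50 + 119.50) satisfy. As an illustration only (the study's raw scores are not reported, so the sample values below are hypothetical, and the helper function is an assumption of this sketch), pooled midranks and the resulting U statistic can be computed as follows:

```python
def rank_sums(group_a, group_b):
    """Assign pooled midranks (ties share the average rank) and return per-group rank sums."""
    pooled = sorted(group_a + group_b)
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        # occurrences sit at 1-based positions i+1 .. j; midrank is their average
        rank[pooled[i]] = (i + 1 + j) / 2
        i = j
    return sum(rank[x] for x in group_a), sum(rank[x] for x in group_b)

# Hypothetical critical-thinking scores (NOT the study's data), n = 14 and n = 9 as in Table 3
paper = [3, 4, 4, 5, 5, 6, 7, 3, 5, 6, 8, 5, 7, 3]
audio = [5, 6, 7, 4, 6, 8, 5, 7, 4]

r1, r2 = rank_sums(paper, audio)
n = len(paper) + len(audio)
assert r1 + r2 == n * (n + 1) / 2          # rank sums always total N(N+1)/2
u1 = r1 - len(paper) * (len(paper) + 1) / 2  # Mann-Whitney U for the first group
```

Applied to the reported rank sums, U for the paper group would be 156.50 - 14 x 15 / 2 = 51.5, consistent with the non-significant group difference reported in Experiment 2.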


Al-Seghayer, K. (2001). The effects of multimedia annotation modes on L2 vocabulary acquisition. A comparative study. Language Learning & Technology, 5, 202-232.

Angeli, C. M. (1999). Examining the effects of context-free and context-situated instructional strategies on learners’ critical thinking (Unpublished doctoral dissertation, Indiana University, USA).

APA (American Philosophical Association). (1990). Critical thinking. A statement of expert consensus for purposes of educational assessment and instruction. Recommendations prepared for the committee on pre-college philosophy. ERIC Paper No. ED 315-423.

Arburn, T. M., & Bethel, L. J. (1999). Assisting at-risk community college students. Acquisition of critical thinking learning strategies. Paper presented at the Annual Conference of the National Association for Research in Science Teaching, Boston, 28.-31.3.1999.

Astleitner, H. (1998). Kritisches Denken. Basisqualifikation für Lehrer und Ausbilder. Innsbruck: Studienverlag.

Astleitner, H. (2000a). Kritisches Denken im Unterricht. Pädagogisches Handeln, 4, 39-50.

Astleitner, H. (2000b). Qualität von webbasierter Instruktion: Was wissen wir aus der experimentellen Forschung? In F. Scheuermann (Hrsg.), Campus 2000. Lernen in neuen Organisationsformen (S. 15-39). Münster: Waxmann.

Astleitner, H., & Leutner, D. (2000). Designing instructional technology from an emotional perspective. Journal of Research on Computing in Education, 32, 497-510.

Beyer, B. K. (1990). What philosophy offers to the teaching of thinking. Educational Leadership, 47, 55-60.

Bullen, M. (1998). Participation and critical thinking in online university distance education [WWW document]. URL http://cade.athabascau.ca/vol13.2/bullen.html

Carroll, J. M. (Ed.). (1998). Minimalism beyond the Nurnberg funnel. Cambridge, MA: MIT Press.

Cennamo, K. S. (1993). Learning from video. Factors influencing learners’ preconceptions and invested mental effort. Educational Technology, Research and Development, 41, 33-45.

Chater, N., & Oaksford, M. (1999). The probability heuristics model of syllogistic reasoning. Cognitive Psychology, 38, 191-258.

Cheng, P. W., & Holyoak, K. J. (1985). Pragmatic reasoning schemas. Cognitive Psychology, 17, 391-416.

Cheng, P. W., Holyoak, K. J., Nisbett, R. E., & Oliver, R. M. (1986). Pragmatic versus syntactic approaches to training deductive reasoning. Cognitive Psychology, 18, 293-328.

Chun, D. M., & Plass, J. L. (1997). Research on text comprehension in multimedia environments. Language Learning & Technology, 1, 60-81.

Chyung, Y., Winiecki, D. & Fenner, J. A. (1998). A case study. Increase enrollment by reducing dropout rates in adult education. Paper presented at the 14th Annual Conference on Distance Education & Learning. Madison, WI, 5.8-7.8.1998.

Clark, J. H., & Biddle, A.W. (1993). Teaching critical thinking. Englewood Cliffs, NJ: Prentice Hall.

Cooper, G. (1998). Research into cognitive load theory and instructional design [WWW document]. URL http://www.arts.unsw.edu.au/education/CLT_NET_Aug_97.HTML

Corno, L., & Randi, J. (1999). A design theory for classroom instruction in self-regulated learning? In C. M. Reigeluth (Ed.), Instructional-design theories and models. A new paradigm of instructional theory (pp. 293-318). Mahwah, NJ: Erlbaum.

De Jong, T., & Van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68, 179-201.

DeLoach, S. B., & Greenlaw, S. A. (1999). Critical thinking and electronic discussions in upper-level economic courses [WWW document]. URL paper-eea99a.html

Dick, R. D. (1991). An empirical taxonomy of critical thinking. Journal of Instructional Psychology, 18, 79-92.

Dillon, A., & Gabbard, R. (1998). Hypermedia as an educational technology. A review of the quantitative research literature on learner comprehension, control, and style. Review of Educational Research, 68, 322-349.

Duffelmeyer, B. B. (2000). Critical computer literacy. Computers in first-year composition as topic and environment [WWW document]. URL http://corax.cwrl.utexas.edu/cac/current_issue/duffelmeyer.html

Duquette, L., & Painchaud, G. (1996). A comparison of vocabulary acquisition in audio and video contexts. The Canadian Modern Language Review, 54, 143-172.

Ennis, R.H. (1962). A concept of critical thinking. Harvard Educational Review, 32, 81-111.

Facione, N. C., & Facione, P. A. (1992). The California Critical Skills Test: Form A and B. Test manual. Millbrae, CA: California Academic Press.

Facione, P. A., & Facione, N. C. (1994). Holistic critical thinking scoring rubric. Millbrae, CA: California Academic Press.

Facione, P. A., Facione, N. C., & Giancarlo, C. A. (1992). The California Critical Thinking Disposition Inventory: Test Manual. Millbrae, CA: California Academic Press.

Faryniarz, J. V., & Lockwood, L. G. (1992). Effectiveness of microcomputer simulations in stimulating environmental problem solving by community college students. Journal of Research in Science Teaching, 29, 453-470.

Frisby, C. L. (1992). Construct validity and psychometric properties of the Cornell Critical Thinking Test (Level Z). A contrasted groups analysis. Psychological Reports, 71, 291-303.

Galotti, K. M. (1989). Approaches to studying formal and everyday reasoning. Psychological Bulletin, 105, 331-351.

Gigerenzer, G. (2000). Adaptive thinking. Rationality in the real world. Oxford: University Press.

Gigerenzer, G., & Hoffrage, U. (1995). How to improve Bayesian reasoning without instruction. Frequency format. Psychological Review, 102, 684-704.

Gilster, P. (1997). Digital literacy. The thinking and survival skills new users need to make the internet personally and professionally meaningful. New York: Wiley.

Glebas, G. J. (1997). Evaluating the effectiveness of using the spreadsheet application as a cognitive tool to increase mathematics achievement [WWW document]. URL http://home.att.net/~sabelg/thesis.html

Gokhale, A. A. (1995). Collaborative learning enhances critical thinking. Journal of Technology Education, 7, 22-30.

Gokhale, A. A. (1996). Effectiveness of computer simulations for enhancing higher order thinking. Journal of Industrial Teacher Education, 33, 36-46.

Hager, W. (Hrsg.). (1995). Programme zur Förderung des Denkens bei Kindern. Göttingen: Hogrefe.

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains. American Psychologist, 53, 449-455.

Hannafin, M., Land, S., & Oliver, K. (1999). Open learning environments. Foundations, methods, and models. In C. M. Reigeluth (Ed.), Instructional-design theories and models. A new paradigm of instructional theory (pp. 115-140). Mahwah, NJ: Erlbaum.

Ingebritsen, T. S., & Flickinger, K. (1999). Development and assessment of web-courses that use streaming audio and video technologies [WWW document]. URL http://project.bio.iastate.edu/workshops/DL99/DL_98_Manuscript.htm

Jonassen, D. H. (1996). Computers in the Classroom. Mindtools for critical thinking. Englewood Cliffs: Prentice Hall.

Josephson, J. R., & Josephson, S. G. (1994). Abductive Inference. Computation, Philosophy, Technology. Cambridge, UK: Cambridge University Press.

Keller, J. M. (1983). Motivational design of instruction. In C. M. Reigeluth (Ed.), Instructional-design theories and models. An overview of their current status (pp. 383-434). Hillsdale, NJ: Erlbaum.

Keller, J. M. (1997). Motivational design and multimedia. Beyond the novelty effect. Strategic Human Resource Development Review, 1, 188-203.

Keller, J. M. (1999). Motivation in cyber learning environments. International Journal of Educational Technology, 1, 7-30.

Klahr, D., & Dunbar, K. (1988). Dual space search during scientific reasoning. Cognitive Science, 12, 1-48.

Klauer, K. J. (Hrsg.). (1993). Kognitives Training. Göttingen: Hogrefe.

Kraak, B. (2000). Erziehung zum kritischen Denken - Eine wichtige - vielleicht die wichtigste Bildungsaufgabe der Gegenwart. Pädagogisches Handeln, 4, 51-70.

Kuhl, J. (1985). Volitional mediators of cognition-behavior consistency. Self-regulatory processes and action versus state orientation. In J. Kuhl & J. Beckmann (Eds.), Action control. From cognition to behavior (pp. 101-128). Berlin: Springer.

Lang, H. R., McBeath, A., & Hebert, J. (1995). Teaching. Strategies and methods for student-centered instruction. Toronto: Harcourt Brace.

LaRose, R., Gregg, J., & Eastin, M. (1998). Audiographic telecourses for the web. An experiment. Journal of Computer-Mediated-Communication, 4 [WWW document]. URL http://www.ascusc.org/jcmc/vol4/issue2/larose.html

Latchman, H., & Kim, J. (1999). Streaming audio and video in ALNs [WWW document]. URL http://csc.list.ufl.edu/~kjm/aln.html

LeBlanc, J. (1998). Thinking clearly. New York: Norton.

Lee, S. H., & Boling, E. (1999). Screen design guidelines for motivation in interactive multimedia instruction. A survey and framework for designers. Educational Technology, 39, 19-26.

Maiorana, V. P. (1992). Critical thinking across the curriculum. Building the analytical classroom. Bloomington, IN: Eric.

Mancall, J. C., Aaron, S. L., & Walker, S. A. (1986). Educating students to think. The role of the school library media program. School Library Media Quarterly, 15, 18-27.

Mandl, H., & Friedrich, H. F. (Hrsg.). (1992). Lern- und Denkstrategien. Analyse und Intervention. Göttingen: Hogrefe.

Mayer, R. E. (2001). Multimedia learning. Cambridge, UK: University Press.

McKenzie, C. (1994). The accuracy of intuitive judgment strategies. Covariation assessment and Bayesian inference. Cognitive Psychology, 26, 209-239.

McLoughlin, C., & Luca, J. (2000). Cognitive engagement and higher order thinking through computer conferencing. We know why but do we know how? [WWW document]. URL http://cleo.murdoch.edu.au/confs/tlf/tlf2000/mcloughlin.html

McMillan, J. H. (1987). Enhancing college students’ critical thinking. A review of studies. Research in Higher Education, 29, 3-29.

Means, T. B., Jonassen, D. H., & Dwyer, F. M. (1997). Enhancing relevance. Embedded ARCS strategies vs. purpose. Educational Technology, Research & Development, 45, 5-17.

Moreno, R., & Mayer, R. E. (1999). Cognitive principles of multimedia learning. The role of modality and contiguity. Journal of Educational Psychology, 91, 358-368.

Moreno, R., & Mayer, R. E. (2000). A learner-centered approach to multimedia explanations. Deriving instructional design principles from cognitive theory. Interactive Multimedia Electronic Journal of Computer-Enhanced Learning, 2 [WWW document]. URL http://imej.wfu.edu

Muilenburg, L., & Berge, Z. L. (2000). A framework for designing questions for online learning [WWW document]. URL http://www.emoderators.com/moderators/muilenburg.html

Murphy, T. H., Dooley, K. E., Wickersham, L., & Parlin, J. (1999). Streaming media as an instructional delivery strategy [WWW document]. URL http://agnews.tamu.edu/saas/murph.htm

Newman, D. R., Johnson, C., Cochrane, C., & Webb, B. (1996). An experiment in group learning technology. Evaluating critical thinking in face-to-face and computer-supported seminars. Interpersonal Computing and Technology, 4, 57-74.

Nickerson, R. S., Perkins, D. N., & Smith, E. E. (1985). The teaching of thinking. Hillsdale, NJ: Erlbaum.

Pascarella, E. T., & Terenzini, P. T. (1991). How college affects students. Findings and insights from twenty years of research. San Francisco: Jossey-Bass.

Patry, J.-L. (1996). Qualität des Unterrichts als Komponente von Schulqualität. In W. Specht & J. Thonhauser (Hrsg.), Schulqualität (pp. 58-94). Innsbruck, Wien: Studienverlag.

Paul, R., Binker, A. J. A., Jensen, K., & Kreklau, H. (1990). Critical thinking handbook. 4th-6th grades. Sonoma State University, Rohnert Park, CA: Foundation for Critical Thinking.

Petri, G. (2000). Wie kann kritisches Denken wirksam geschult werden? Ein Modellprojekt praxisorientierter wissenschaftlicher Schulentwicklung. Innsbruck: Studienverlag.

Plass, J. L., Chun, D. M., Mayer, R.E., & Leutner, D. (1998). Supporting visual and verbal learning preferences in a second language multimedia learning environment. Journal of Educational Psychology, 90, 25-36.

Raghavan, K., Sartoris, M., & Glaser, R. (1998). The impact of the MARS curriculum. The mass unit. Science Education, 82, 53-91.

Reimann, P., & Bosnjak, M. (1998). Supporting hypertext-based argumentation skills [WWW document]. URL http://www.or.zuma.mannheim.de/bosnjak/publications/edmedia98/default.htm

Reinmann-Rothmeier, G., & Mandl, H. (1998). Wissensmanagement. Eine Delphi-Studie (Forschungsbericht Nr. 90). München: Ludwig-Maximilians-Universität, Lehrstuhl für Empirische Pädagogik und Pädagogische Psychologie.

Rheinberg, F. (1995). Motivation. Stuttgart: Kohlhammer.

Rivers, R. H., & Vockell, E. (1987). Computer simulations to stimulate scientific problem solving. Journal of Research in Science Teaching, 24, 403-415.

Robinson, D. H., Katayama, A. D., & DuBois, N. F. (1998). Interactive effects of graphic organizers and delayed review on concept application. The Journal of Experimental Education, 67, 17-31.

Sa, W. C., West, R. F., & Stanovich, K. E. (1999). The domain specificity and generality of belief bias. Searching for a generalizable critical thinking skill. Journal of Educational Psychology, 91, 497-510.

Saba, F. (2000). Research in distance education. A status report. International Review of Research in Open and Distance Learning, 1 [WWW document]. URL http://www.irrodl.org

Salomon, G. (1983). The differential investment of mental effort in learning from different sources. Educational Psychologist, 18, 42-50.

Santos, L. M., & De Oliveira, M. (1999). Internet as a freeway to foster critical thinking in lab-activities [WWW document]. URL http://www.narst.org/conference/santosdeoliveira/santosdeoliveira.htm

Scarce, R. (1997). Using electronic mail discussions groups to enhance students' critical thinking skills [WWW document]. URL http://horizon.unc.edu/TS/

Schunk, D. H., & Ertmer, P. A. (1999). Self-regulatory processes during computer skill acquisition. Goal and self-evaluative influences. Journal of Educational Psychology, 91, 251-260.

Shapley, P. (2000). On-line education to develop complex reasoning skills in organic chemistry [WWW document]. URL http://www.aln.org/alnweb/journal/vol4_issue2/le/shapley/le-shapley.htm

Shaw, V. (1996). The cognitive processes in informal reasoning. Thinking & Reasoning, 2, 1-104.

Shellnut, B., Knowlton, A., & Savage, T. (1999). Applying the ARCS model to the design and development of computer-based modules for manufacturing engineering courses. Educational Technology, Research and Development, 47, 100-110.

Shim, J. P. (2000). SMIL and videostreaming for teaching business telecommunications and e-commerce. Decision Line, 7, 6-8.

Sloffer, S. J., Dueber, B., & Duffy, T. M. (1999). Using asynchronous conferencing to promote critical thinking. Two implications in higher education (CRLT Technical Report No. 8-99). Bloomington, IN: Center for Research on Learning and Technology, Indiana University.

Smeaton, A. F., & Keogh, G. (1999). An analysis of the use of virtual delivery of undergraduate lectures. Computers & Education, 32, 83-94.

Song, S. H., & Keller, J. M. (2001). Effectiveness of motivationally-adaptive computer-assisted instruction on motivation and learning (submitted for publication).

Stenning, K., Cox, R., & Oberlander, J. (1995). Contrasting the cognitive effects of graphical and sentential logic teaching. Reasoning, representation, and individual differences. Language and Cognitive Processes, 10, 333-354.

Stoney, S., & Oliver, R. (1999). Can higher order thinking and cognitive engagement be enhanced with multimedia? Interactive Multimedia Electronic Journal of Computer-Enhanced Learning, 2 [WWW document]. URL http://imej.wfu.edu

Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learning and Instruction, 4, 295-312.

Tang, J. C., & Isaacs, E. (1993). Why do users like video? Studies of multimedia-supported collaboration. Computer-Supported Cooperative Work: An International Journal, 1, 163-196.

Van der Pal, J., & Eysink, T. (1999). Balancing situativity and formality. The importance of relating a formal language to interactive graphics in logic instruction. Learning and Instruction, 9, 327-341.

Van Gelder, T. (2000a). Learning to reason: A Reason!-Able approach [WWW Document]. URL http://www.philosophy.unimelb.edu.au/reason/

Van Gelder, T. (2000b). The efficacy of undergraduate critical thinking courses [WWW Document]. URL http://www.philosophy.unimelb.edu.au/reason/

Visser, L. (1998). The development of motivational communication in distance education support (Unpublished doctoral dissertation, University of Twente, The Netherlands).

Walton, D. N. (1989). Informal logic. A handbook for critical argumentation. Cambridge, UK: University Press.

Wheeler, S. (2000). Streaming media and webcasting. Evaluation of an interactive distance learning application [WWW document]. URL http://rilw.emp.paed.uni-muenchen.de/ 2000/papers/wheeler_paper.html

Wilkinson, G. L., Bennett, L. T., & Oliver, K. M. (1997). Evaluation criteria and indicators of quality for Internet resources. Educational Technology, 37, 52-59.

Yeh, Y.-C., & Strang, H. R. (1997). The impact of a computer simulation on critical-thinking instruction [WWW document]. URL http://www.coe.uh.edu/insite/elec_pub/HTML1997/si_yeh.htm

Yeh, Y.-C., & Wu, J.-J. (1992). The relationship between critical thinking and academic achievements among elementary and secondary school students. Journal of Education and Psychology, 15, 79-100.

Zimmerman, B. J. (1998). Developing self-fulfilling cycles of academic regulation. An analysis of exemplary instructional models. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning. From teaching to self-reflective practice (pp. 1-19). New York: Guilford.

Zwicker, T. (1989). Die Beeinflussung studentischer Verarbeitungsprozesse durch Verständlichkeitsmerkmale von Vorlesungsvorträgen (Unpublished master thesis, University of Salzburg, Austria).

Hermann Astleitner, Professor, Department of Educational Research, University of Salzburg.

Correspondence concerning this article should be addressed to Dr. Hermann Astleitner, Professor, Department of Educational Research, University of Salzburg, Akademiestrasse 26, A-5020 Salzburg, Austria, Europe. Email: Hermann.Astleitner@Sbg.Ac.At

COPYRIGHT 2002 George Uhlig Publisher
