Using metacognitive skills to improve 3rd graders’ math problem solving
Patricia D. Goldberg
For the past twenty years, problem solving has been touted as a primary focus for mathematics instruction at all grade levels (National Council of Supervisors of Mathematics, 1978; National Council of Teachers of Mathematics, 1980, 1989, 2000; Mathematical Sciences Education Board, 1989). Despite this persistent call for problem solving as an instructional approach, teachers across the United States struggle with helping children solve problems. National and international assessments consistently reveal that U.S. students do not perform well on problem-solving tasks that require more than one step (Dossey, 1993; Dossey, 1994; Kenney and Silver, 1997; Silver, 1998; Stigler and Hiebert, 1997). Furthermore, studies of mathematics teachers at all grade levels indicate that teachers have difficulty in planning and implementing lessons that build students' problem-solving skills (Cooney, 1985; Silver, 1985; Thompson, 1989). Clearly, teachers need instructional strategies that help students become better problem solvers.
The literature on the teaching of problem solving over the past twenty years promotes building students' metacognitive skills–planning, monitoring, and evaluating one's own thinking–as a means to improve their problem-solving skills (Costa, 1991; Perkins, 1992, 1995; Fogarty, 1994; Marzano et al., 1997; Swartz and Parks, 1994; Tishman, Perkins, and Jay, 1995). As adults, we see some evidence of children planning, monitoring, and evaluating. Parents listen as their children explain what candy they will select at the movie theatre. Teachers hear children stop what they are doing in a group and tell another child he is not doing the assignment correctly. A grandparent may hear a grandchild comment confidently about the quality of a drawing he insists be displayed on the refrigerator. When given a novel task in school, however, children are very likely to jump into the problem with one strategy, continue the strategy without "looking back," and finish without reexamining the solution. Often the result is a misunderstood problem, an ineffective strategy, or a solution that does not work. According to Perkins (1995), metacognition, or reflective intelligence as he calls it, "particularly supports coping with novelty" (p. 112). Perkins also suggests that reflective intelligence "supports thinking contrary to certain natural trends" (p. 113), thus contributing to breaking mental sets and exploring new ideas.
If students across the United States are to become proficient mathematical problem solvers, teachers at all grade levels must learn how to develop and assess metacognitive skills in their students. Teachers are searching for instructional strategies that will help students plan, monitor, and evaluate their own thinking during problem solving. This study seeks to determine the effectiveness of a set of year-long instructional strategies designed to improve metacognitive skills, thereby improving the problem-solving skills of eight- and nine-year-old students.
Metacognition emerged as an important mental activity for solving problems when researchers began to study children’s intelligence and problem solving. Early analyses of problem-solving performance revealed that good or expert problem solvers tended to plan, monitor, and evaluate their thinking during problem solving more often and more efficiently than did poor or novice problem solvers (Flavell, 1976). More recent studies have confirmed this finding for students at middle school, high school, and college levels (Bookman, 1993; Cai, 1994; Lucangeli, Coi, and Bosco, 1997). Flavell (1976) defined metacognition broadly as “one’s knowledge concerning one’s own cognitive processes and products … [and] the active monitoring and consequent regulation and orchestration of these processes” (p. 232).
Based on this definition, Garofalo and Lester (1985) identified three types of metacognitive knowledge related to mathematical problem solving: person knowledge; task knowledge; and strategy knowledge. Mathematical person knowledge includes “one’s assessment of one’s own capabilities and limitations” (p. 167). Mathematical task knowledge includes “one’s beliefs about the subject of mathematics as well as beliefs about the nature of mathematical tasks” (p. 167). “Mathematical strategy knowledge naturally includes knowledge of algorithms and heuristics, but it also includes a person’s awareness of strategies to aid comprehending problem statements, organizing information or data, planning solution attempts, executing plans, and checking results” (p. 168). The focus of this study is almost exclusively on the mathematical strategy knowledge component of metacognitive knowledge.
Research on students' mathematical strategy knowledge reveals that students at all levels–from elementary school through college–have difficulty planning, organizing, and evaluating their cognitive processes (Horak, 1990; Fortunato et al., 1991; Lester and Garofalo, 1982; Schoenfeld, 1981; Stillman and Galbraith, 1998). As a result of these findings, instruction in problem solving has tended to move away from teaching explicit strategies and heuristics alone and toward helping students develop metacognitive skills (Fernandez et al., 1994; Garofalo, 1986; Garofalo, 1987; Gray, 1991). Research on the effectiveness of teaching metacognitive skills in conjunction with mathematics problem solving, however, has tended to focus only on older students and special populations (Cardelle-Elawar, 1995; Charles and Lester, 1984; Montague, 1992; Maqsud, 1998).
The purposes of this study were to (1) explore the mathematical problem-solving and metacognitive skills that third-grade students had developed prior to instruction in metacognition, and (2) examine how year-long mathematics instruction centered explicitly on the development of metacognition affected students' growth in metacognitive and problem-solving skills.
The research methodology employed in this study can be classified as a year-long teaching experiment in which the primary researcher was the teacher of a class that focused on developing metacognitive skills. The teacher conducted the study in collaboration with a mathematics educator from a local university. This classroom-based research can help answer important instructional questions for teachers (Hiebert, 1999). It can focus on questions that are of concern to both teachers and researchers. The drawback to this methodology, however, is that results are often confounded by many extraneous factors found in classrooms. Because the research is situated in two classrooms in a single school, the degree to which teachers and researchers find the results useful is based on similarity of contexts. Every attempt will be made in this report to be clear about confounding influences that might have occurred during the study.
The subjects were the entire third-grade population, two classes of eight- and nine-year-olds, in a rural school in Kentucky. One class, designated as Metacognitive, was experimental; the other class, designated as Non-Metacognitive, represented a control group. Each class had 26 students at the beginning of the school year. Due to student withdrawals and incorrect taping during the data gathering, there were 21 students in the Metacognitive class and 23 students in the Non-Metacognitive class at the end of the school year. Teachers from the previous year used a stratified random strategy to create a balance of achievement levels, gender, and special needs in each class. Table 1 shows the characteristics of each class at the beginning of the year according to gender, age, achievement, and special needs.
The two third-grade teachers varied greatly in background and experiences. The Metacognitive teacher, one of the authors, received her doctorate in teacher education and reading/language arts from a regional university. During her 12 years of teaching, she participated in several mathematics professional development programs, as well as mathematics portfolio training through the state department of education. Her students piloted mathematics portfolios for state assessment. The teacher of the Non-Metacognitive class completed a master's degree in education. During her 19 years of teaching, she received some mathematics training from the school district on a particular program. Based on analysis of the teacher plan books, both teachers scheduled 60 minutes a day for mathematics, used similar materials, and addressed the NCTM Standards. The teacher of the Metacognitive class used more manipulatives and emphasized problem solving and reasoning more often.
The two teachers taught all core subjects in their respective classes. Itinerant teachers, however, taught art, music, physical education, and speech. The school also had a part-time gifted/talented teacher who worked often with selected students in grades four and five, and occasionally with all students in both classes.
Two geometry tasks of comparable difficulty were developed as pre- and post-tasks respectively. The tasks, visual in nature, required students to organize their thinking, develop a plan, and obtain a solution. These visual tasks were chosen to ensure that specific mathematics content knowledge (e.g., fractions, polygons, graphs) did not affect performance on the task. To the best of our knowledge, the tasks were novel problems that were not used in classes prior to the study nor in either class during the study. The pre-task was administered to all students in October; the post-task was administered to all students in May. Table 2 presents the tasks as they were provided to students:
Proctors were trained to present the tasks to each student individually. Prior to presenting the task, proctors asked students to “think aloud” while working on the task. The proctors presented tasks to students and listened to their comments. If students became quiet, they reminded the students to continue speaking with the use of specific verbal cues. At no time did proctors prompt students about problem solutions or respond to questions about the tasks. Each student was videotaped as he/she worked on the task.
As a reader, picture an adult sitting next to a nine-year-old student as she talks about the problem she is solving. The student is working on the post-task with five colored tiles and is explaining how the shape she has just made can be rotated on the desk. She says, "It can face this way or this way or this way." She continues to make different shapes and tell the adult how each can be rotated. She then turns to the paper and pencil to begin recording as she says, "I have found out 4 or 5 ways." After she has made each shape, she numbers them and counts out loud as she does. She pauses and says, "I found another one. That makes 6." She looks over her paper and says, "I don't have that one," and records the shape on her paper. She begins moving the tiles again. She says, "That makes seven," but she continues moving one tile along the sides of the others. She says, "It (the tile) can come this way and that makes a new one." This example is a section of a thirty-minute response from a single child in the Metacognitive class.
The teacher of the Metacognitive class sought to create a culture of thinking (Tishman, Perkins, and Jay, 1995) throughout the day by using several different instructional strategies. These strategies encompassed a two-pronged approach: (a) strategies that focused on raising student self-awareness, especially of their own thinking (Garofalo and Lester’s person knowledge); and (b) strategies that focused on planning, monitoring, and evaluating within problem-solving events (Garofalo and Lester’s strategy knowledge). The Non-Metacognitive teacher did not attempt to create such a culture and made no overt attempts to develop her students’ metacognitive skills.
It is important to note, however, that both Metacognitive and Non-Metacognitive classes received six weekly lessons provided by the teacher of the gifted and talented program during September and October. These lessons, a regular part of the gifted and talented program, presented Edward de Bono’s (1991) concept of thinking hats from Six Thinking Hats for Schools. Each of these lessons introduced a type of thinking mode with a picture of a colored hat, and students practiced that particular type of thinking during the half hour lesson. Metacognition was the subject of the last lesson. An analysis of lesson plans revealed that the thinking hat lessons were the only instruction on thinking skills in the Non-Metacognitive class.
In the Metacognitive class, these six lessons represented the beginning of the students' exposure to instruction on metacognition. The Metacognitive teacher initiated a series of lessons on self-awareness in November and maintained focus on them throughout the year. She used a program called Second Step (Committee for Children, 1992) to help students build better understanding of their emotional selves. During lessons, students learned the technique of mindmapping (Margulies, 1991), which encouraged them to explore ideas about themselves, their private hopes and dreams, and their understanding of mathematics. Two lessons on the anatomy of the brain explained the effect of the "fight or flight" response of the amygdala and how several parts of the brain were used during thinking, learning, and problem solving.
During the last week of November, the Metacognitive teacher introduced a language of thinking (Tishman, Perkins, and Jay, 1995). She told the students that many kinds of thinking exist and that words can help them talk about specific kinds of thinking. The students and teacher generated a chart of thinking words like create, hypothesize, question, guess, opinion, evidence, and investigate. Blanks were left on the chart, and other words (e.g., plan, monitor, and evaluate) were added throughout the year as the class learned about them.
In January, the teacher began formal instruction in metacognition strategies. During mathematics lessons, she used modeling, explanations, group interactions, feedback and practice to help students learn how to plan, monitor, and evaluate their thinking. During problem-solving episodes, she encouraged students to take risks, to share failures and successes, and to find more than one way to approach a problem. Over the next several months, the teacher focused on four specific problem-solving strategies: look for a pattern, draw a picture, make a table, and guess and check. The teacher also developed metacognitive tools that encouraged students to: (a) plan before tackling a problem; (b) monitor while working on a problem; and (c) evaluate the success of solutions. These tools included planning sheets and checklists developed by the teacher. The teacher routinely asked students to share their monitoring processes with her while working on problems.
Two pairs of raters were trained to analyze the videotapes of the pre- and post-tasks of each student. One pair scored the students’ use of problem-solving; the other pair scored students’ use of metacognitive strategies. This focus on both cognitive and metacognitive processes is consistent with Lester’s (1985) suggestion that research on problem-solving should focus on both cognitive and metacognitive variables.
Through viewing the videotapes, one set of raters scored students’ problem-solving performance using two scoring guides, each respectively designed for the pre- and post-tasks. These guides can be found in Tables 3 and 4.
The scoring guides were based on Polya's (1957) analysis of problem solving. Raters scored how well students demonstrated understanding of the problems, created strategies for solving the problem, and found correct solutions. Student scores could range from zero to eight, with three points maximum for understanding and correct solutions and two points maximum for using multiple strategies.
The other set of raters analyzed the videotape transcripts and scored the number of metacognitive strategies used by students in attempting to solve the pre- and post-tasks. They used the Metacognition Category System (METACATS) (Goldberg, 1996) to determine planning, monitoring, and evaluating strategies. (See Table 5) One point was given each time a student demonstrated use of one of these metacognitive skills in solving the problems.
Inter-rater reliability for the two pairs of raters was determined by applying the Pearson correlation coefficient to samples of videotapes scored by both members of a pair. The correlations ranged from .75 to .99, with only two correlations below .80. These reliability coefficients were deemed acceptable.
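As a sketch of this reliability check, Pearson's r between two raters' scores on a shared sample of tapes can be computed directly from its definition. The rater scores below are hypothetical illustrations, not data from the study.

```python
# Pearson correlation coefficient between two raters' scores on the
# same sample of videotapes (hypothetical scores, for illustration only).

def pearson_r(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

rater_1 = [2, 4, 3, 5, 1, 4, 2, 5]  # hypothetical problem-solving scores
rater_2 = [2, 5, 3, 4, 1, 4, 3, 5]
r = pearson_r(rater_1, rater_2)  # a value in the study's acceptable .80+ range
```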
To determine whether or not differences between classes existed prior to instruction, t-tests for independent means were applied to the problem solving and metacognition pre-task results of both classes. To determine the effects of metacognition instruction on problem-solving and metacognitive skills, t-tests for independent means were applied to the problem-solving and metacognition post-task results of both classes.
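The independent-means comparison described above can be sketched as a classical pooled-variance Student's t-test. This is a generic implementation under the standard equal-variance assumption, not the study's actual analysis code, and the class scores shown are hypothetical.

```python
from math import sqrt

def t_independent(x, y):
    """Student's t-test for two independent means (pooled variance)."""
    n1, n2 = len(x), len(y)
    m1, m2 = sum(x) / n1, sum(y) / n2
    v1 = sum((a - m1) ** 2 for a in x) / (n1 - 1)  # sample variances
    v2 = sum((b - m2) ** 2 for b in y) / (n2 - 1)
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    t = (m1 - m2) / sqrt(pooled * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2  # t statistic and degrees of freedom

# Hypothetical scores; with the study's 20 and 23 students, df = 41 as in Table 6.
metacognitive = [1, 0, 2, 1, 3, 0, 1, 2]
non_metacognitive = [0, 1, 1, 2, 0, 1, 2, 1]
t, df = t_independent(metacognitive, non_metacognitive)
```

The resulting t statistic is compared against the t distribution with n1 + n2 - 2 degrees of freedom to obtain the p values reported in the tables.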
How did student performance in problem-solving and metacognition compare across both classes prior to instruction in metacognitive skills? During the pre-task, student performance in problem-solving and use of metacognition statements were comparable for both classes. Neither Metacognitive nor Non-Metacognitive students made many statements that could be categorized as Planning, Monitoring, or Evaluating. Table 6 shows the comparison of problem-solving scores from each class during the pre-task:
The table reveals two important results. First, the mean scores for both classes were relatively low. Means for all three components of problem solving for both classes were less than one (out of a possible three points for Understanding and Solutions and two points for Strategies). Furthermore, the means for overall problem-solving performance of both classes were less than two (out of a possible total score of eight). Most importantly, the t-tests revealed no significant differences between the two classes at the .05 level on any of the comparisons. The Metacognitive class scored slightly higher in understanding (mean difference = .37). The Non-Metacognitive class scored slightly higher in strategies (mean difference = .07) and in solutions (mean difference = .32). The means of the total scores were virtually the same. These results support the premise that the two classes were relatively equal with regard to problem-solving performance prior to instruction in metacognition.
Table 7 provides comparison between the classes with regard to the number of metacognitive statements made by students during the pre-task.
This table reveals results similar to those for problem solving. The mean scores for the six types of metacognitive skills indicated that students used very few metacognitive skills during the pre-task, especially with respect to Planning/Clarifying, Planning/Strategies, and Evaluate/Action. The means for these skills were virtually zero for both classes. The mean scores for Monitor/Review and Evaluate/Self were slightly higher, but each was still less than one. Clearly, the students were not prone to use metacognitive skills during the pre-task: students in each class averaged slightly more than one metacognitive statement overall in completing the pre-task.
Most importantly again, the t-tests revealed no significant differences between the two classes at the .05 level on any comparisons of metacognitive skills. The Metacognitive class scored slightly higher in Monitor/Regulate (mean difference = .01), Evaluate/Self (mean difference = .3), and Evaluate/Action (mean difference = .05). The Non-Metacognitive class scored slightly higher in Planning/Clarify (mean difference = .04), Planning/Strategize (mean difference = .08), and Monitor/Review (mean difference = .1). These results clearly support the premise that the two classes were relatively equal with regard to use of metacognitive skills prior to instruction in metacognition.
How did instruction in metacognitive skills affect performance in problem solving? Given that students in both classes were comparable in the problem-solving performance, t-tests were conducted on the results of the post-tasks. Table 8 compares the two classes with respect to their problem-solving scores on the post-task.
The analysis revealed two significant differences between the classes. Students in the Metacognitive class scored significantly higher (p < .05) than students in the Non-Metacognitive class on understanding (mean difference = .69) and overall problem-solving performance (mean difference = 1.02). Furthermore, Metacognitive students scored higher, although not significantly, than their counterparts in the Non-Metacognitive class on strategies (mean difference = .22) and on solutions (mean difference = .08). It is important to note that, on the pre-task, the Non-Metacognitive class scored higher than the Metacognitive class on solutions (mean difference = .32).
How did instruction in metacognitive skills affect use of metacognitive skills in problem solving? The results of the analysis also revealed differences between the two classes in the use of metacognitive skills. Table 9 provides a summary of the use of metacognitive statements used by students in both classes during the post-task.
Students in the Metacognitive class used significantly (p < .05) more Monitor/Review statements (mean difference = 6.03) and total statements (mean difference = 5.79) than Non-Metacognitive students on the post-task. In addition, Metacognitive students used more Planning/Strategize statements (mean difference = .10), Monitor/Regulate statements (mean difference = .21), and Evaluate/Action statements (mean difference = .06) than Non-Metacognitive students, but these differences were not significant. Finally, Non-Metacognitive students used more Evaluate/Self statements (mean difference = .08) than the Metacognitive students. As revealed in Table 7, however, this mean difference represents a decrease from the corresponding difference on the pre-task (mean difference = .30).
On average, students in the Metacognitive class made almost three times as many metacognitive statements on the post-task as Non-Metacognitive students did. Metacognitive students increased their average number of statements from pre-task to post-task by 7.6, while Non-Metacognitive students increased theirs by about two. Clearly, instruction in metacognitive skills had an impact on the use of metacognition in problem solving, especially in the area of monitoring thinking.
Instruction in metacognitive skills, as described in this study, increased the metacognitive skills used by third-grade students and thereby improved their performance in mathematical problem solving. In particular, students learned to monitor their thinking more often during problem-solving episodes as a result of the instruction. Furthermore, students receiving instruction in metacognitive skills increased their planning and evaluation skills, but not substantially. Problem-solving performance improved throughout the process; however, the most significant improvement occurred in the area of understanding. As a result of instruction in metacognition, students improved significantly in their attempts to understand the problem and slightly in their use of strategies and formulation of solutions.
The Metacognitive class's increase in monitoring during problem solving was not surprising. The activities provided by the Metacognitive teacher focused heavily on monitoring thinking and progress. The Metacognitive teacher's use of cooperative groups would also seem to contribute to this finding: the process of monitoring thinking becomes public as students solve problems in groups, so students weak in this skill have the opportunity to see how other students use monitoring to solve problems. In any event, it is clear from the analysis that these students learned this skill well. Interestingly, students in the Non-Metacognitive class also increased their monitoring skills. Evidently, maturation or general experience with problems helps develop this skill naturally, although purposeful instruction yielded gains roughly four times as large.
The most surprising result of the study was the almost total absence of planning by any student at any time. There was little evidence of planning in either the pre-task or post-task, despite a significant amount of instruction in planning in the Metacognitive class. At least two reasons for this finding are possible. First, and what we consider the most probable, is that students at this age are not inclined to engage in planning when posed with challenging problems. The Metacognitive teacher noted that even her best problem solvers tended to want to get “into” the problem immediately rather than step back and think about a plan. The impulsive nature of this age group would minimize reflection and pause at the outset of a problem-solving episode. This conjecture is hard to verify elsewhere because little research in metacognition has focused on students at this age.
Second, the tasks selected for the study simply might not have invoked a need to plan. Both tasks, visual in nature, are rather straightforward. Students simply can begin the problem by using trial and error to search for possibilities. Mathematics tasks that are more complex or have several components might require students to do more planning at the outset.
The finding that evaluation skills were rarely used during problem solving might be attributed to similar reasons. Although students tended to evaluate themselves occasionally, they rarely evaluated their own actions, despite instruction that focused on these skills. Again, it may be that students at this age simply are not willing to do much thinking once they arrive at a solution. In the Metacognitive class, students were often satisfied with simply finding a solution, without concern for whether or not it was the solution. Students are often conditioned in mathematics lessons in earlier years to focus on getting the answer; once an answer is obtained, it is time to go on to the next problem. As with planning, these two tasks may also not have been challenging enough to stimulate a need for students to evaluate their actions.
As with any research, this study raises some important questions about further research on teaching metacognitive skills. First, further research on the effects of age and development on a student's ability to learn metacognitive skills is warranted. Many educators promote the teaching of metacognitive skills to students in elementary school; yet there is little research that focuses on the metacognitive skills of young children. This study raises the possibility that students at this age may simply not be ready or able to learn about planning and evaluating in problem solving. Perhaps third graders are in the early stages of developing the self-awareness required to be knowledgeable of their thinking. Since children at this age are just learning to be aware of their own thinking, they may not yet be able to apply this self-awareness successfully to mathematics problem solving. Identifying the age at which instruction in particular metacognitive strategies becomes developmentally appropriate is a worthy pursuit. According to Flavell (1994), children of seven to eight years of age show a better understanding of introspection than do younger children. Perhaps children of eight to nine years of age not only improve their understanding and use of introspection, but also respond more to outside influences, such as classroom instruction.
Further research should also focus on larger numbers of teachers and students. This study involved two teachers and 43 students in a small elementary school. Replication of the study with more teachers and students in a variety of schools would provide additional useful information. Using a variety of tasks, especially tasks outside mathematics, would provide further insight into the feasibility of teaching metacognitive skills in order to improve problem-solving performance in other areas.
Finally, the Metacognition Category System (METACATS) was a useful tool to describe students’ use of metacognitive skills during problem solving. It seemed to provide a structure to classify metacognitive skills in a clear and comprehensive way. The System is generic enough to be used in a variety of ways in research on metacognition.
Mathematical problem solving is a necessary skill for all students at all ages. Learning to become a problem solver is a life-long endeavor. This study centered on teaching metacognitive strategies to help young learners become good problem solvers. It revealed that instruction in metacognition may have a positive impact on student performance in problem solving. Some questions arise, however, with regard to the feasibility of spending time to develop metacognitive skills in young students. For this reason, further research in the metacognitive development of young students is certainly warranted.
TABLE 1 Characteristics of Students in the Metacognitive and Non-Metacognitive Classes

Class               Gender (M/F)   Age (8 yr/9 yr/10 yr)   CTBS Mean*   Special Needs
Metacognitive       13/13          21/5/0                  59.3         Three, in speech only
Non-Metacognitive   13/13          15/9/1                  61.          None

*California Test of Basic Skills
TABLE 2 Pre- and Post-Tasks as Administered to Students

Pre-task: On the table in front of you are unifix cubes of two different colors. Each cube represents a different floor of a unifix cube tower. Using any arrangement of four different cubes, find out how many possible towers can be made. You may use paper and pencil if you like.

Post-task: On the table in front of you are five colored tiles. Each tile represents a room in a house. Using any arrangement of tiles, create all possible room arrangements that you can make with the five tiles. You may use paper and pencil if you like.
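The pre-task's maximum of 16 towers (the top "Solutions" score in Table 3) follows from simple counting: each of the four floors takes one of two colors, giving 2^4 = 16 arrangements. A quick enumeration confirms this; the color labels 'R' and 'B' are arbitrary stand-ins for the two cube colors.

```python
from itertools import product

# Enumerate every 4-cube tower built from two colors. Order matters
# because each cube is a distinct floor of the tower.
towers = [''.join(t) for t in product('RB', repeat=4)]
print(len(towers))  # 16 distinct towers
```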
TABLE 3 Scoring Guide for Pre-Task Data Analysis

Understanding
0 — Little or no understanding; makes towers which have more or less than 4 cubes, OR makes 1-2 correct towers and stops
1 — Focuses on number; makes several correct towers but takes no notice of duplications
2 — Demonstrates understanding of task with both number and examples; notices less than/equal to half of duplications, AND makes several correct towers
3 — Demonstrates understanding of task; notices more than half of duplications

Strategies
0 — Does not use a discernible strategy; sequence of towers is random
1 — Uses a consistent single, clear strategy for generating correct towers
2 — Uses multiple strategies for generating correct towers

Solutions
0 — Creates fewer than 8 correct and different towers
1 — Creates 8-11 correct and different towers
2 — Creates 12-15 correct and different towers
3 — Creates 16 correct and different towers

______ OTHER (towers which are incorrect, such as towers built with more or less than 4 blocks)

Scores: Understanding Problem ______
TABLE 4 Scoring Guide for Post-Task

Understanding
0, 1, 2, 3 — Same as Table 3

Strategies
0, 1, 2 — Same as Table 3

Solutions
0 — Creates fewer than 9 correct and different houses
1 — Creates 9-13 correct and different houses
2 — Creates 14-17 correct and different houses
3 — Creates 18-19 correct and different houses

______ OTHER (variations of arrangement which use all tiles, but one or more tile edges are not lined up with other tile edges)

Scores: Understanding Problem ______
TABLE 5 Metacognition Category System (METACATS)
PLAN/CLARIFY — The statement demonstrates the child is preparing to do
something and is thinking about the task before tackling it. For this
category the child could be restating the problem or rereading the
problem. Example: I’m going to read this again.
PLAN/STRATEGIZE — For this category the child could be planning a
strategy, attempting to identify a strategy, or announcing the
application of a particular problem solving strategy. Example: I gonna
try this. I think I’ll make a pattern.
MONITOR/REVIEW — This category of responses would demonstrate that the
child is engaged in a task and notices the success of her idea, or lack
of it, towards the task. She might review her work in order to avoid a
future mistake or find a current mistake. She might check her work for
errors or simply look over the work again (action noted by transcriber).
This category includes statements that would be categorized as planning,
if made before beginning the task, or evaluating, if made after
completing the task. Example: Now I found out four or five ways how to
do it (begins writing on paper to record ways).
MONITOR/SELF-REGULATE — The child might regulate herself by adjusting
her activity. Generally, this category involves a change in behavior;
however, the child may say she is going to make a change after reviewing
her work. Example: I wasn’t really counting.–First, I got four reds.
EVALUATE/SELF — This category of responses would demonstrate that the
child is passing judgment on herself. Example: I can't think of anything.
EVALUATE/ACTION OR PRODUCT — This category of responses would
demonstrate that the child is passing judgment on actions or products.
Example: That's the very last one.
TABLE 6 Comparison of Pre-Task Problem Solving Scores between Metacognitive and Non-Metacognitive Classes

Component      Class              N    Mean    SD       t        df   p
Understanding  Metacognitive      20    .85     .813     1.65    41   .107
               Non-Metacognitive  23    .48     .665
Strategies     Metacognitive      20    .50     .688    -.335    41   .740
               Non-Metacognitive  23    .57     .590
Solutions      Metacognitive      20    .55     .759   -1.155    41   .255
               Non-Metacognitive  23    .87    1.013
Total          Metacognitive      20   1.90    1.586    -.024    41   .981
               Non-Metacognitive  23   1.91    1.905
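The comparisons in Tables 6-9 are independent-samples t tests with df = 20 + 23 - 2 = 41. As a check on the arithmetic (a sketch, not the authors' analysis code), the Understanding row of Table 6 can be reproduced from its summary statistics with the pooled-variance t formula:

```python
from math import sqrt

def pooled_t(n1, m1, s1, n2, m2, s2):
    """Independent-samples t statistic from summary statistics (pooled variance)."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df  # pooled variance
    t = (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, df

# Understanding row of Table 6: Metacognitive vs. Non-Metacognitive classes
t, df = pooled_t(20, .85, .813, 23, .48, .665)
print(round(t, 2), df)  # about 1.64 with df = 41; Table 6 reports 1.65, the
                        # small gap being rounding in the printed means
```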
TABLE 7 Comparison of Pre-Task Metacognition Scores between Metacognitive and Non-Metacognitive Classes

Category                    Class              N    Mean    SD       t       df   p
Planning/Clarify            Metacognitive      20    .00     .000    -.931   41   .357
                            Non-Metacognitive  23    .04     .209
Planning/Strategize         Metacognitive      20    .05     .224    -.893   41   .377
                            Non-Metacognitive  23    .13     .344
Monitor/Review              Metacognitive      20    .55     .999    -.308   41   .759
                            Non-Metacognitive  23    .65    1.152
Monitor/Self-Regulate       Metacognitive      20    .05     .224     .099   41   .922
                            Non-Metacognitive  23    .04     .209
Evaluate/Self               Metacognitive      20    .65     .671    1.42    41   .162
                            Non-Metacognitive  23    .35     .714
Evaluate/Action or Product  Metacognitive      20    .05     .224    1.074   41   .289
                            Non-Metacognitive  23    .00     .000
Total                       Metacognitive      20   1.45    1.638     .463   41   .646
                            Non-Metacognitive  23   1.22    1.650
TABLE 8 Comparison of Post-Task Problem Solving Scores between Metacognitive and Non-Metacognitive Classes

Component      Class              N    Mean    SD       t       df   p
Understanding  Metacognitive      20   1.30     .865    2.857   41   .007
               Non-Metacognitive  23    .61     .722
Strategies     Metacognitive      20    .65     .671    1.120   41   .269
               Non-Metacognitive  23    .43     .590
Solutions      Metacognitive      20    .60     .821     .287   41   .775
               Non-Metacognitive  23    .52     .947
Total          Metacognitive      20   2.55    1.761    2.017   41   .050
               Non-Metacognitive  23   1.57    1.441
TABLE 9 Comparison of Post-Task Metacognition Scores between Metacognitive and Non-Metacognitive Classes

Category                    Class              N    Mean    SD        t       df   p
Planning/Clarify            Metacognitive      20    .00     .000     —       —    —
                            Non-Metacognitive  23    .00     .000
Planning/Strategize         Metacognitive      20    .10     .308    1.561   41   .126
                            Non-Metacognitive  23    .00     .000
Monitor/Review              Metacognitive      20   8.50   10.506    2.580   41   .014
                            Non-Metacognitive  23   2.47
Monitor/Self-Regulate       Metacognitive      20    .25     .639    1.466   41   .150
                            Non-Metacognitive  23    .04     .209
Evaluate/Self               Metacognitive      20    .40     .503    -.505   41   .616
                            Non-Metacognitive  23    .48     .511
Evaluate/Action or Product  Metacognitive      20    .10     .308     .713   41   .480
                            Non-Metacognitive  23    .04     .209
Total                       Metacognitive      20   9.05    9.85     2.555   41   .014
                            Non-Metacognitive  23   3.26    4.29
Bookman, J. (1993). An expert novice study of metacognitive behavior in four types of mathematics problems. Primus, 3(3), 284-314.
Cai, J. (1994). A protocol-analytic study of metacognition in mathematical problem solving. Mathematics Education Research Journal, 6(2), 166-83.
Cardelle-Elawar, M. (1995). Effects of metacognitive instruction on low achievers in mathematics problems. Teaching and Teacher Education, 11(1), 81-95.
Charles, R.I., & Lester, F.K. (1984). An evaluation of a process-oriented mathematics problem-solving instructional program in grades five and seven. Journal for Research in Mathematics Education, 15(1), 15-34.
Committee for Children. (1992). Second step: A violence prevention curriculum. Seattle, WA.
Cooney, T.J. (1985). A beginning teacher’s view of problem solving. Journal for Research in Mathematics Education, 16(5), 324-36.
Costa, A.L. (Ed.). (1991). Developing minds. Alexandria, VA: Association for Supervision and Curriculum Development.
De Bono, E. (1991). Six thinking hats for schools. Logan, Iowa: Perfection Learning.
Dossey, J.A. (1993). Can students do mathematical problem solving? Results from constructed-response questions in NAEP’s 1992 Mathematics Assessment. Washington, D.C.: U.S. Government Printing Office.
Dossey, J.A. (1994). How school mathematics functions: Perspectives from the NAEP 1990 and 1992 assessments. Washington, D.C.: U.S. Government Printing Office.
Fernandez, M.L., Hadaway, N., & Wilson, J.W. (1994). Connecting research to teaching: Problem solving–Managing it all. Mathematics Teacher, 87(3), 195-99.
Flavell, J.H. (1976). Metacognitive aspects of problem solving. In L.B. Resnick (Ed.). The nature of intelligence. Hillsdale, NJ: Lawrence Erlbaum.
Flavell, J.H. (1994). Young children’s understanding of thinking and consciousness. In R. P. Honk (Ed.). Introductory readings for cognitive psychology. Guilford, CT: Dushkin.
Fogarty, R. (1994). How to teach for metacognitive reflection. Palatine, IL: IRI/Skylight.
Fortunato, I., Hecht, D., Tittle, C.K., & Alvarez, L. (1991). Metacognition and problem solving. Arithmetic Teacher, 39(4), 38-40.
Garofalo, J. (1986). Metacognitive knowledge and metacognitive processes: Important influences on mathematical performance. Research and Teaching in Developmental Education, 2(April 1986), 34-39.
Garofalo, J. (1987). Metacognition and school mathematics. Arithmetic Teacher, 34(9), 22-23.
Garofalo, J., & Lester, F.K., Jr. (1985). Metacognition, cognitive monitoring, and mathematical performance. Journal for Research in Mathematics Education, 16(3), 163-176.
Goldberg, P. (1996, Winter). Metacognition: Creating a usable definition. Kentucky Association of Assessment Coordinators Newsletter, p. 4.
Gray, S.S. (1991). Ideas in practice: Metacognition and mathematical problem solving. Journal of Developmental Education, 14(3), 24-26, 28.
Hiebert, J. (1999). Relationships between research and the NCTM standards. Journal for Research in Mathematics Education, 30, 3-19.
Horak, V.M. (1990, April). Students’ cognitive styles and their use of problem solving heuristics and metacognitive processes. Paper presented at the annual meeting of the National Council of Teachers of Mathematics.
Jones, B.F. & Idol, L. (Eds.). (1990). Dimensions of thinking and cognitive instruction. Hillsdale, NJ: Lawrence Erlbaum.
Kenney, P.A., & Silver, E.A. (Eds.) (1997). Results from the sixth mathematics assessment of the National Assessment of Educational Progress. Reston, VA: National Council of Teachers of Mathematics.
Lester, F.K. (1985). Methodological considerations in research on mathematical problem-solving instruction. In E.A. Silver (Ed.), Teaching and learning mathematical problem solving: Multiple research perspectives (pp. 41-70). Hillsdale, NJ: Lawrence Erlbaum Associates.
Lester, F.K., & Garofalo, J. (1982, March). Metacognitive aspects of elementary school students’ performance on arithmetic tasks. Paper presented at the annual meeting of the American Educational Research Association, New York.
Lucangeli, D., Coi, G., & Bosco, P. (1997). Metacognitive awareness in good and poor mathematics problem solvers. Learning Disabilities Research and Practice, 12(4), 209-12.
Maqsud, M. (1998). Effects of metacognitive instruction on mathematics achievement and attitudes toward mathematics of low mathematics achievers. Educational Research, 40(2), 237-43.
Margulies, M.A. (1991). Mapping inner space. Tucson, AZ: Zephyr.
Marzano, R., Pickering, D., et al. (1997). Dimensions of learning. Alexandria, VA: Association for Supervision and Curriculum Development.
Mathematical Sciences Education Board. (1989). Everybody counts: A report to the nation on the future of mathematics. Washington, D.C.: National Academy Press.
Montague, M. (1992). The effects of cognitive and metacognitive strategy instruction on the mathematical problem solving of middle school students with disabilities. Journal of Learning Disabilities, 25(4), 230-48.
National Council of Supervisors of Mathematics. (1978). Position statements on basic skills. Mathematics Teacher, 71(6), 147-52.
National Council of Teachers of Mathematics. (1980). Agenda for action: Recommendations for School Mathematics for the 80s. Reston, VA: Author.
National Council of Teachers of Mathematics. (1989). Curriculum and Evaluation Standards for School Mathematics. Reston, VA: Author.
National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: Author.
Perkins, D.N. (1992). Smart schools. New York: Free Press.
Perkins, D.N. (1995). Outsmarting IQ. New York: Free Press.
Polya, G. (1957). How to solve it. Princeton, NJ: Princeton University Press.
Resnick, L.B. & Klopfer, L. (Eds.). (1989). Toward the thinking curriculum: Current cognitive research. Alexandria, VA: Association for Supervision and Curriculum Development.
Schoenfeld, A.H. (1981, April). Episodes and executive decisions in mathematical problem solving. Paper presented at the annual meeting of the American Educational Research Association, Los Angeles.
Schoenfeld, A.H. (1989). Teaching mathematical thinking and problem solving. In L.B. Resnick & Klopfer, L. (Eds.). Toward the thinking curriculum: Current cognitive research. Alexandria, VA: Association for Supervision and Curriculum Development.
Silver, E.A. (1985). Research on teaching mathematical problem solving: Some underrepresented themes and needed directions. In E.A. Silver (Ed.), Teaching and learning mathematical problem solving: Multiple research perspectives (pp. 247-66). Hillsdale, NJ: Lawrence Erlbaum Associates.
Silver, E.A. (1998). Improving mathematics in middle school: Lessons from TIMSS and related research. Washington, D. C.: U. S. Government Printing Office.
Stigler, J.W., & Hiebert, J. (1997). Understanding and improving classroom mathematics instruction. Phi Delta Kappan, 79(1), 14-21.
Stillman, G.A., & Galbraith, P.L. (1998). Applying mathematics with real world connections: Metacognitive characteristics of secondary students. Educational Studies in Mathematics, 36(2), 157-95.
Swartz, R.J., & Parks, S. (1994). Infusing the teaching of critical and creative thinking into elementary instruction. Pacific Grove, CA: Critical Thinking Press & Software.
Thompson, A.G. (1989). Learning to teach mathematical problem solving: Changes in teachers’ conceptions and beliefs. In R.I. Charles & E.A. Silver (Eds.), The teaching and assessing of mathematical problem solving (pp. 323-343). Reston, VA: National Council of Teachers of Mathematics.
Tishman, S., Perkins, D., & Jay, E. (1995). The thinking classroom: Learning and teaching in a culture of thinking. Boston, MA: Allyn and Bacon.
Patricia D. Goldberg
William S. Bush
University of Louisville
COPYRIGHT 2003 Center for Teaching – Learning of Mathematics
COPYRIGHT 2008 Gale, Cengage Learning