Research on Individual Differences Within a Sociocultural Perspective: Co-regulation and Adaptive Learning


by Mary McCaslin & Heidi Legg Burross - 2011

Background/Context: Research is presented on teacher-centered instruction and individual differences among students within a sociocultural perspective: specifically, within a co-regulation model.

Purpose of Study: To determine the utility of a co-regulation model for understanding teacher and student adaptation to the press of cultural and social demands for student achievement.

Research Design: Multiple methods were used, and quantitative procedures were applied to data obtained in Grades 3-5 classrooms (N = 47) in schools (N = 5) that primarily served students living in poverty and were engaged in comprehensive school reform. Data sources include observation of classroom practices (N = 108; mean = 2 observations per classroom) to identify differences in instructional opportunity within teacher-centered instruction; students' (N = 439) self-monitoring reports of their classroom activity to ascertain individual differences among them in their adaptation to classroom demands; and student performance on classroom-like tasks (story writing; individual student unit of analysis) and standardized tests (SAT9 language, math, and reading subtests; grade-level unit of analysis) to illuminate the dynamics of opportunity, activity, and adaptation in student achievement.

Conclusions/Recommendations: Results support the potential of a co-regulation model to understand and enhance teacher-centered instruction of students who differ in adaptation to classroom demands and achievement expectations in nontrivial ways. The practicable instructional opportunity that most aligned with cultural demands for improving student performance on mandated tests was a basic form of direct instruction. Direct instruction appears to cast a wide safety net, including students who are and are not yet ready to profit from this mode of instruction as expressed by mandated test performance. Students not yet ready for culturally mandated performances are nonetheless acquiring desirable and personally meaningful adaptations to learning challenges that are co-regulated by direct instruction opportunities. Unfortunately, these students remain largely invisible to sociocultural policy makers who portray them as uninvested in, if not resistant to, school learning. It is reasonable to ask how long students will continue to participate in and adapt to classroom demands without cultural validation of that participation and recognition of the learners these students are and wish to become. It is time for deliberate examination of cultural beliefs and regulations that equate student performance on mandated tests with meaningful learning, a prepared future citizenry, and the effectiveness of the public school.

We are pleased to add our voices to this special issue on social approaches to self-regulation; our work is conducted within a sociocultural perspective. Here we pursue two major goals. The first is to broaden research foci and constructs within sociocultural perspectives to illustrate the inclusive power of this approach. For example, we hope to persuade readers that all classrooms are “learning communities,” not only those identified as “learner centered” or “small-group centered.” Teacher-centered classrooms are normative (i.e., typical) in the United States, especially in schools engaged in reform, and not all teacher-centered instruction looks or functions similarly (Pianta, Belsky, Houts, & Morrison, 2007). The sheer amount of time students spend in teacher-centered settings suggests it is critical that sociocultural perspectives bring their expertise to enhancing teacher-centered instruction.


We hope to persuade readers that lack of attention to individual differences among students within sociocultural perspectives unnecessarily limits the power of this approach to understand and support student learning and adaptation to/coping with classroom demands. Attention to students’ individual differences, however, does not obviate deliberate recognition of the cultural beliefs and expectations about who particular learners are and “should” become, and what that means for the kinds of schools and instructional opportunities they warrant. Indeed, failure to consider the larger cultural landscape of the purposes of schooling, and for whom, severely limits researchers’ influence on critical policy discussions and decisions that affect students in profoundly important ways. A sociocultural perspective is uniquely suited to attend to these multiple sources of influence in classrooms and to improve our understanding of the relationships and dynamics that unfold.


Our second goal is to make a case for multiple methods, including quantitative procedures, within a sociocultural approach to understanding classroom dynamics. The study we present includes standardized classroom observations, student self-reports, and achievement measures. We hope to present a compelling case for the suitability of diverse data sources and quantitative analysis procedures in a sociocultural perspective.


THEORETICAL FRAMEWORK


Our research is guided by a co-regulation model of identity that is based on an emergent interaction perspective (McCaslin & Murdock, 1991; Wertsch & Stone, 1985). The co-regulation model has been described in detail elsewhere; here we briefly highlight key points and refer the reader to McCaslin (2009) for a more complete explication. The co-regulation model proposed by McCaslin asserts that “identity is at the heart of multiple and simultaneous personal, cultural, and social influences that together reciprocally press and co-regulate—challenge, shape, and guide—emergent identity. The origins of identity are situated across the sources of influence and relationships and tensions among them” (McCaslin, 2009, p. 140). Each influence—personal, cultural, and social—is well represented in the social and behavioral science literatures (e.g., psychology, anthropology, sociology). In a co-regulation model, however, the primary importance of each source of influence is its relationship to the other sources. It is the reciprocal press among these sources of influence that yields the potential challenges, struggles, and compromises that co-regulate emergent identity.


For example, personal influences in a co-regulation model are about individual readiness and potential. Personal resources include biology and dispositions, as they “are” to the individual, are expected by the culture, and are socially validated (or not). Social influences include the opportunities and relationships that are practicable, that can and do influence how people cope with and adapt to everyday experiences. Finally, cultural influences set norms and challenges that define what is probable for persons and for social and cultural institutions. The probable is malleable nonetheless because personal and social influences can resist or work to change cultural norms and expectations. No source of influence—personal, social, or cultural—is equally distributed. One result, then, is differential opportunity for culturally valued, socially validated, personally desirable adaptive learning. In this article, we examine these dynamic relationships and tensions in classrooms and explore how they inform students’ adaptive learning.


Adaptive learning is about acting on your self and your situation to better meet your goals. It includes the socialization/internalization of goals, the motivation to commit to or challenge them, and the competence to realize and evaluate those commitments (McCaslin & Murdock, 1991; Rohrkemper & Corno, 1988). This perspective insists that humans, by their very nature, are social beings who are both biologically predisposed and socialized to participate in social settings (Geary, 2002; Olson & McCaslin, 2008). “Self,” then, is embedded in—adapted to—the “social,” which includes opportunities and relationships that in turn afford participation and interpersonal validation. We prefer the term adaptive learning to connote this essential self/social relationship that has implications for the co-regulated nature of what other approaches term individual or “self-regulation” (McCaslin, Bozack, et al., 2006; cf. Zimmerman & Schunk, 1989, 2001).


FROM THEORY TO RESEARCH


We focus on co-regulation dynamics of teacher-centered instructional practices and student activity in Grades 3–5 classrooms. First, we derive four types of teacher-centered instructional opportunities from observations of classroom practices. Second, we examine how students differentially adapt to the varied instructional demands and supports available in each type. Third, we consider the potential of students’ adaptive learning to become relatively stable “student characteristics” that inform achievement and co-regulate teacher-centered classroom learning communities.


We ground our study in two ways. First, it occurs within a cultural-social press (i.e., probable—practicable tensions) of federal demands for student achievement in the form of national law (No Child Left Behind [NCLB] Act of 2001, 2002). NCLB is manifested particularly in schools that serve students who live in poverty and that are engaged in school reform (see McCaslin & Good, 2008). Second, our study is grounded in a cultural-personal press (i.e., probable—potential tensions) that is represented by students who are coping with the dual demands of poverty and mobility and who are learning about the purpose of and their place in school (see McCaslin & Burross, 2008). Our specific focus, however, is on the co-regulation of the social-personal press of classrooms (i.e., practicable—potential tensions), which we represent as instructional opportunity and student activity.


METHODS


PARTICIPANTS


Schools. Five schools engaged in Comprehensive School Reform (CSR) that were part of a larger investigation of school reform (McCaslin & Good, 2008; McCaslin, Good, et al., 2006) participated. The progress of participating schools (N = 5) in increasing student performance on mandated tests was labeled by the state as improving (n = 2), maintaining (n = 1), or underperforming (n = 2) in the year prior to the study. Each school primarily served students who lived in poverty; however, the concentration of that poverty (defined by the percentage of students receiving free and reduced lunch) ranged from 65% to 94%, mean = 77%. Schools also served mobile populations. Student mobility (defined as the sum of student transfers in and student transfers out) ranged from 27% to 50%, mean = 35%. Thus, participating schools served many students who faced the challenges of both poverty and mobility at home, which also impacted the stability and resources at school (see McCaslin & Burross, 2008).


Grade levels. We were interested in the instructional opportunities available to students in Grades 3–5. These grades coincide with Erikson’s (1968) “industry versus inferiority” stage of psychosocial development that is “resolved” with a fundamental optimism about one’s competence. Thus, theoretically, from a developmental task perspective, this is an ideal window for the study of student motivation and learning. The developmental task for these students is to master the demands and expectations of schooling in ways that positively influence their emergent identity as the learners they are and wish to become. Grades 3–5 are represented in each school, with one exception: one participating school did not enroll Grade 3 students. Grade-level analyses, then, are based on N = 14 (three grade levels in each of four schools, two grades in one school).


Classrooms. A primary goal of the larger investigation was to describe classroom practices in Grades 3–5 in schools engaged in CSR. One CSR goal is that teachers within a given grade are engaged in shared curricula and instructional practices that inform and are informed by the curricula and instruction of other grade levels. Our concern was with teaching, not teachers; thus, our observations were not scheduled by individual teacher availability. Instead, we organized our efforts by subject matter instruction, primarily reading and math; project staff observed multiple teachers on a given visit. Teachers typically were unaware of the timing of our visits. Observation procedures, reliability training, and results are described in detail elsewhere (McCaslin, Good, et al., 2006). In brief, data were obtained in a series of 10-minute observation intervals; each interval was followed by completion of a standard coding system comprising dichotomous low-inference descriptive variables, which was implemented with an overall percent exact agreement of 80% among observers. Key findings can be summarized succinctly: Classroom practices primarily involved teacher-centered instruction in well-managed and friendly classrooms in which most students were on task most of the time.
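The "percent exact agreement" reliability index used in observer training can be illustrated with a minimal sketch. The function and data below are our own illustration, not the study's materials:

```python
# Hypothetical sketch: percent exact agreement between two observers
# coding the same 10-minute interval with dichotomous (0/1) variables.
# The codes and data are invented for illustration.

def percent_exact_agreement(observer_a, observer_b):
    """Share of codes on which two observers agree exactly."""
    if len(observer_a) != len(observer_b):
        raise ValueError("observers must code the same intervals")
    matches = sum(a == b for a, b in zip(observer_a, observer_b))
    return matches / len(observer_a)

# Two observers' codes for the same eight dichotomous variables (1 = present).
a = [1, 0, 1, 1, 0, 1, 0, 1]
b = [1, 0, 1, 0, 0, 1, 1, 1]
print(round(percent_exact_agreement(a, b), 2))  # 6 of 8 codes match -> 0.75
```

In practice such an index would be averaged over all coded intervals and variable pairs to yield the overall figure the authors report.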


In this study, we look within these general findings in an attempt to capture nuances in teacher-centered instruction in the five participating schools. We ask if there are differences within teacher-centered instruction that might play a role in—co-regulate—student activity and participation that meaningfully inform our observations of students' engagement and students' self-reports of their classroom behavior. We observed 47 different classrooms in the five participating schools for a total of 108 visits (mean = 2), which resulted in 710 ten-minute observation intervals (mean = 6.57). Our representation of “instructional opportunity” is based on factor analyses of teacher-centered instructional practices (described subsequently), which in turn are based on these observation intervals. Our co-regulation analyses (the press between instructional opportunity and student adaptation) are based on data obtained in 33 classrooms in which both the teacher was observed and students participated in the self-report instrumentation. Thus, analyses of classroom observations and student reports are based on N = 33.


Students. Students (N = 439) in these five schools who received parent or guardian consent and assented to participate completed self-monitoring instrumentation at midyear. They represent approximately one third of the total student population and are the unit of analysis in determining individual differences among students’ initial self-monitoring reports. In late spring, 353 students in four of the schools completed self-monitoring reports. Replication of the individual differences analyses is based on N = 353; longitudinal stability analyses are based on the subgroup of these students who participated both at midyear and in late spring (N = 309). Observations in classrooms (e.g., of student time on task) and standardized test (SAT9) results by grade are based on all students in attendance on a given day.


INSTRUMENTATION AND ANALYSIS PROCEDURES


Observation. The CSR Classroom Observation System (CSRCOS; Nichols, McCaslin, & Good, 2003) contains variables typically derived from process-product research conducted in the 1970s and 1980s that linked teacher behavior with student outcomes (see Brophy & Good, 1986). This includes variables such as proactive/reactive classroom management, instructional feedback, student time on task, and teacher expectations. CSRCOS also includes variables that reflect more recent concerns about classroom practices (e.g., opportunities for student choice and inquiry, and teacher-student cohesion, or “we-ness”). Observation variables were organized into three clusters derived from a co-regulation model of classroom dynamics: instructional opportunity, student activity, and teacher and student relationships (McCaslin, Good, et al., 2006). Our particular interest here is in the types of instructional opportunities available to students, and if and how they engaged them.


We define instructional opportunity as the cognitive demands of classroom lessons, teacher question-asking behavior, and additional teacher guidance of student learning. Cognitive demands codes were organized into four categories: can’t rate, basic facts and skills, basic facts and skills “mixed” with close elaboration and related thinking, and thinking/reasoning per se. We attempted to distinguish cognitive demand from apparent difficulty. For example, basic facts and skills acquisition typically is considered a lower level cognitive demand; however, it does not, by definition, represent a lower level of difficulty for (all) students. Making this distinction, however, proved difficult for coders, so a “can’t rate” option was made available to them. Teacher question-asking included five categories describing whether questions were asked, and if so, their apparent purpose: questions not evident; task managerial/procedural (“how to”) questions only; task managerial/procedural and correctness questions; task correctness only; and thinking/elaboration questions. Finally, we included the presence/absence of teacher guidance (e.g., “walking” students through tasks) to capture when whole-class instruction might be tailored to individual students.


Observation data on instructional opportunity were subjected to an exploratory factor analysis to see if distinct patterns of—opportunities within—teacher-centered instruction emerged. Instructional opportunity factors then were correlated with observed student activity, which was described with three categories of behavior: percentage of students “on task,” level of productivity, and student question-asking (defined to parallel teacher question-asking behavior). Instructional opportunity factors also were correlated with students’ self-reported classroom activity (also subjected to factor analysis procedures) and achievement, described subsequently. These analyses allowed us to examine potential differences in how students appear to engage (i.e., adapt to) instructional opportunities, differences in student self-awareness, and their relation to achievement.


Self-monitoring reports. Research consistently finds that academic achievement is linked to students’ effective regulation of and reflection on their learning and that students differ markedly in their ability or willingness to do so (see Arnold, 2005). Our early attempts to capture student awareness of, knowledge of, and beliefs about their potential to regulate their own learning led us to focus more on student capability than willingness. Strategic regulation and reflection constructs were not (yet) part of many students’ explicit “tool kit” (McCaslin & Good, 2005; see also Rohrkemper, Slavin, & McCauley, 1983). However, students were able to describe themselves in class in ways that suggested the viability of self-monitoring as a precursor to knowledge-monitoring (e.g., Arnold, 2005; Tobias & Everson, 1996) and “self”-regulated learning (e.g., Winne, 2001). We reasoned that student body and behavior awareness were useful benchmarks for the development of self-reflection and control that teachers could deliberately co-regulate through classroom relationships, management structures, and learning opportunities. Student self-monitoring reports also provided an additional perspective on classroom communities as contributed to and mediated by participants.


Student self-monitoring reports were captured with “How I am in my class,” a set of 20 sentences that represent a continuum of student classroom activity derived from previous observations of and interviews with students. The continuum varies the potential contribution of an individual’s behavior to own and others’ learning opportunities. For example, as reported in McCaslin et al. (1994), the enhancing scale (alpha = .93) items (n = 8) describe active positive contributions (“I was helping”; “I was doing my part”), the interfering scale (alpha = .94) items (n = 8) describe self-involvement that interferes with or limits contribution to own and others’ learning (“My head hurt”; “I kept getting stuck”), and the neutral scale (alpha = .83) items (n = 4) describe behavior that neither actively supports nor hinders own and others’ learning (“I was looking around”; “I was listening”).


Internal consistency (alpha) results in the present study (enhancing = .78; interfering = .77; neutral = .51) are not as strong as in the original, suggesting that differences between the studies should be considered. First, the original study participants were fourth-grade students described as mostly from two-parent, middle-class families of Christian/Judaic religious orientation; in comparison, the students in this study are in Grades 3–5 and from low-income homes, and many cope with the disruption of family mobility. Second, the original study required students to describe themselves in their small group; this study required students to describe themselves in class. These differences suggest at least two hypotheses about the differences in internal consistencies between studies. One focuses on the press of the probable; namely, the resources and opportunities of the middle class are more aligned with the sociocultural press for learning in school than are the resources and opportunities of the poor—and their children know the “student script” better than their less advantaged peers. A second hypothesis concerns the practicable differences between the studies; namely, students may be more aware of their behavior in small groups than in whole-class settings. It is reasonable to conclude that students’ self-monitoring reports in the present study are less coherently differentiated by the three scales than were those in the original.
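The internal consistency values reported above follow the standard Cronbach's alpha formula. A minimal sketch, with invented response data rather than the study's, might look like:

```python
# Illustrative computation of Cronbach's alpha for a scale of dichotomous
# self-monitoring items. The response matrix is invented for illustration.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = students, columns = scale items (0/1)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Six hypothetical students x four "neutral"-style items (1 = underlined).
responses = [[1, 1, 0, 1],
             [0, 0, 0, 0],
             [1, 1, 1, 1],
             [0, 1, 0, 0],
             [1, 1, 1, 0],
             [0, 0, 0, 1]]
print(round(cronbach_alpha(responses), 2))
```

Alpha rises as items covary more strongly; the lower neutral-scale alpha (.51) reflects weaker covariation among those four items in this sample.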


As in the original study, the 20 items were read aloud to students to control for reading readiness and inflection. In midyear and again in late spring, students were told to underline any sentences that described how they were in class prior to coming to the cafeteria (or library or wherever) to be in the study. Reports were intended to be a description of specific, recent classroom activity (“episodic” memory); however, student reports instead (or also) may have represented their prototypical classroom participation (“repisodic” memory; Neisser, 1982).


Student self-monitoring reports at Time 1 and Time 2 were each subjected to exploratory factor analysis to see if patterns emerged across items that might inform individual differences in student adaptation to classroom demands. Correlational analyses of obtained factors at Times 1 and 2 examined the extent to which reported adaptations were similar, suggesting the potential emergence of “characteristic” adaptations. Student adaptations were then related to performance on classroom-like tasks and annual standardized tests. Finally, student adaptations, based on student self-report, were correlated with instructional opportunity factors, based on classroom observation data. These analyses allow us to consider the co-regulation of instructional opportunity and student activity in the promotion of student achievement.


RESULTS AND DISCUSSION


CLASSROOM OBSERVATION


Instructional opportunity and student activity were each defined by standardized observation and coding of three types of variables. Instructional opportunity included the cognitive demand of lessons/tasks, teacher question-asking behavior, and teacher guidance. Student activity included time on task, level of productivity, and student question-asking behavior.


Variation in teacher-centered instructional opportunity. Variables were created as ratios: for each teacher, the number of observation intervals in which a code was observed divided by the total number of intervals observed. As a result, each teacher had one value ranging from 0 to 1 for each observation code (e.g., cognitive demands: basic facts and skills). These variables were reduced further through exploratory factor analyses using principal components extraction with orthogonal varimax rotation in SPSS (Aspelmeier & Pierce, 2009). The orthogonal solution provided uncorrelated, unique factors. Factor retention was determined by (a) a scree plot analysis of the variance accounted for by the factors, (b) the theoretical interpretative value of those factors that exceeded an eigenvalue of 1.0, and (c) an examination of the item loadings. Items that loaded higher than .30 (or lower than -.30) were examined for interpretation of the factor. The analyses yielded a standardized individual factor score for each teacher.
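The extraction and rotation steps just described (principal components of the correlation matrix, retention of components with eigenvalues above 1.0, varimax rotation) can be sketched as follows. The data are simulated and the helper functions are our own illustration, not the SPSS procedure itself:

```python
# Sketch of principal components extraction with varimax rotation.
# Simulated data: 47 "teachers" x 6 observation-ratio variables in [0, 1].
import numpy as np

rng = np.random.default_rng(0)

def principal_components(data):
    """Eigenvalues and unrotated loadings of the correlation matrix."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]            # largest first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    loadings = eigvecs * np.sqrt(eigvals)        # component loadings
    return eigvals, loadings

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p))
        rotation = u @ vt
        if s.sum() - var_old < tol:
            break
        var_old = s.sum()
    return loadings @ rotation

data = rng.random((47, 6))
eigvals, loadings = principal_components(data)
retained = loadings[:, eigvals > 1.0]            # Kaiser criterion
rotated = varimax(retained)
print(retained.shape, rotated.shape)
```

Because varimax is an orthogonal rotation, each variable's communality (its squared loadings summed across factors) is unchanged; only the distribution of loadings across factors is simplified.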


The first four factors were retained and together accounted for 72% of the variance in instructional, and hence, student learning opportunities. The factors, in rank order: Guided Elaboration, Direct Instruction, Review, and Structured Problem-Solving. As Table 1 illustrates, teachers appear to integrate the cognitive demands of the curriculum with their instructional strategies. Guided Elaboration involves tasks with cognitive demands that include and go beyond basic facts and skills, integrating close elaboration, related thinking, and more abstract reasoning. These cognitive demands are presented in an instructional context of teacher guidance and questions that encourage student thinking. This factor appears consistent with the instructional approach to “meaningful verbal learning” espoused by Ausubel (Ausubel & Fitzgerald, 1961). Direct Instruction consists of basic facts and skills curricula and an instructional press for correctness via teacher questions. This factor is consistent with the instructional approach to “intellectual skills” acquisition supported by Gagne (1985) and a basic version of direct instruction as described by Rosenshine and Stephens (1986). Review is just that, as teachers ask task managerial/procedural and correctness questions to assess student retention. Structured Problem-Solving aligns task demands focused on student thinking/reasoning with teacher question-asking that targets task managerial/procedural knowledge. In sum, the four instructional opportunity factors represent four independent orchestrations of the timing and function of teacher scaffolding within teacher-centered instruction.


Table 1. Instructional Opportunities: Factors, Sample Items, and Loadings

Instructional opportunity factor (% variance)                    Loadings

1. Guided Elaboration (26.33%)
   Cognitive demands—mixture                                        .781
   Teacher guidance—present                                         .779
   Cognitive demands—not evident                                   -.689
   Teacher questions—thinking included                              .675
   Teacher questions—task managerial/procedural                    -.424
   Cognitive demands—thinking/reasoning                             .326

2. Direct Instruction (20.82%)
   Cognitive demands—basic facts & skills                           .871
   Teacher questions—correctness only                               .799
   Cognitive demands—not evident                                   -.484
   Cognitive demands—mixture                                       -.384
   Teacher questions—not evident                                   -.322

3. Review (14.96%)
   Teacher questions—task managerial/procedural & correctness       .836
   Teacher questions—not evident                                   -.832

4. Structured Problem-Solving (10.08%)
   Teacher questions—task managerial/procedural                     .715
   Cognitive demands—thinking/reasoning                             .628
   Teacher questions—not evident                                   -.311

N = 710 ten-minute observation intervals.


Instructional opportunity and student activity. Classroom observations of student activity variables (including percentage of students on task, level of productivity, and presence/type of question-asking) were each correlated with the instructional opportunity factors. There were no significant relationships between type of instructional opportunity and observations of student time on task or productivity. This was expected because observation data described most students as on task, productively, most of the time. Differences in student activity were evident in student question-asking behavior, however. Student question-asking about task managerial/procedural issues was associated with high ratios of the Structured Problem-Solving instructional opportunity (r = .47, p = .001), suggesting that this instructional opportunity engaged students in relatively more assistance-seeking, perhaps to improve their understanding or to avoid self-reliance, both potential indicators of strategic adaptive learning.


In sum, observation data suggest that students behave similarly (i.e., mostly on task and productive) in diverse instructional opportunities. However, students appear to seek more instructional support when their teachers engage in the more abstract Structured Problem-Solving approach to instruction.


Instructional opportunity and student performance on SAT9 subtests. We did not have access to individual student test scores; thus, we examined the relationship between observed type of teacher-centered instruction and student performance on the SAT9 using correlational procedures by grade level within school (N = 14). Direct Instruction was significantly related to each of the SAT9 subtests (language: r = .74, p = .003; math: r = .63, p = .015; reading: r = .66, p = .01). There were no other significant relationships between type of observed instructional opportunity and student performance; however, it is useful to note the pattern of correlations associated with each displayed in Table 2. No other instructional opportunity was positively correlated with SAT9 subtest performance except Review and language performance (r = .07). All remaining correlations were negative, notably those associated with Guided Elaboration.
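The grade-level correlational procedure can be sketched as follows; both the factor scores and the test scores below are invented for illustration only:

```python
# Sketch of the grade-level correlational analysis: each grade-within-
# school unit (N = 14) contributes one instructional-opportunity factor
# score and one mean SAT9 subtest score. All values are hypothetical.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical standardized Direct Instruction factor scores, one per unit.
direct_instruction = [0.5, -1.2, 0.8, 1.5, -0.3, 0.1, -0.9, 1.1,
                      0.4, -0.6, 1.9, -1.4, 0.2, 0.7]
# Hypothetical mean SAT9 math scaled scores for the same 14 units.
sat9_math = [640, 602, 655, 670, 618, 628, 605, 649,
             633, 611, 678, 598, 625, 642]
r = pearson_r(direct_instruction, sat9_math)
print(round(r, 2))
```

With only 14 units, each correlation rests on few degrees of freedom, which is why the authors emphasize the overall pattern across subtests rather than any single coefficient.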


In sum, instructional opportunities appear to differentially promote (or hinder) student performance on the SAT9 subtests. Direct Instruction best negotiates the press of these students’ potential (e.g., readiness, participation) and sociocultural demands for increasing their performance on mandated tests such as the SAT9. Thus far, we have presented differences in teacher-centered instructional opportunities in relation to students as a group or “community of learners.” We now turn our attention to students and ask, What does it mean to be a member of, to participate in, this community of learners? Do students differ in how they participate, that is, do students differ in their adaptation to classroom demands?


Table 2. Correlations of SAT9 Scores With Instructional Opportunity Factors

                            Instructional Opportunity Factors
SAT9 scores           Guided         Direct        Review   Structured
                      Elaboration    Instruction            Problem-Solving

SAT9 language   r        -.30           .74*         .07        -.16
                Sig.      .29           .01          .82         .60
SAT9 math       r        -.46           .63*        -.08        -.11
                Sig.      .10           .02          .78         .71
SAT9 reading    r        -.37           .66*        -.09        -.08
                Sig.      .19           .01          .76         .78

N = 14 grades within schools.


STUDENT SELF-MONITORING REPORTS


Students completed self-monitoring reports twice, at midyear and again in late spring. Table 3 contains the frequencies and percentages of students who underlined an item as descriptive of them in class prior to study participation at each time. A few points are noteworthy. First, students did not appear overly driven by social desirability in their responses. Only three items were endorsed by 80% or more of students: “I was working” (89% at Time 1, 90% at Time 2); “I was listening” (85%, 82%); and “I was ready” (80%, 82%). Each of these self-report item percentages is consistent with classroom observations of student activity in the larger study, in which 75% or more of the students were observed to be on task, productively, 86% of the time (McCaslin, Good, et al., 2006).


Second, the level of endorsement of somatic items is troubling. These are Grades 3–5 students, and many report being tired (47% at Time 1; 51% at Time 2) and hungry (43%, 41%). Reports suggesting illness and/or anxiety—dry mouth, stomach ache, headache, shaky hands, and flushed face—involved 28%, 23%, 20%, 11%, and 8% of the students, respectively. Reported frequencies and percentages were similar in the spring. Third, it is interesting to consider what a teacher or classmate can potentially “see” about another’s personal experience and participation and what that might mean for the learning community. One half of the items represent students’ “own” experiences that are readily visible to others and as such are likely to play a role in onlookers’ experiences as well. This seems an important consideration in understanding potential dynamics within learning communities.


Table 3. Self-Monitoring Report Frequencies (and Percentages) for Times 1 and 2


Self-Monitoring Items          Time 1*        Time 2**
I was working.                 389 (89.0%)    316 (90.0%)
I was listening.               371 (84.9%)    288 (82.1%)
I was ready.                   351 (80.3%)    289 (82.3%)
I was doing my part.           308 (70.5%)    262 (74.6%)
I was smiling.                 295 (67.5%)    220 (62.7%)
I was sitting in my group.     282 (64.5%)    273 (77.8%)
I was watching.                267 (61.1%)    230 (65.5%)
I was into it.                 237 (54.2%)    197 (56.1%)
I was helping.                 217 (49.7%)    176 (50.1%)
I was looking around.          208 (47.6%)    174 (49.6%)
I was tired.                   207 (47.4%)    181 (51.6%)
I was talking.                 206 (47.1%)    183 (52.1%)
I was hungry.                  188 (43.0%)    145 (41.3%)
I was getting help.            129 (29.5%)    102 (29.1%)
My mouth was dry.              121 (27.7%)     93 (26.5%)
My stomach felt funny.         102 (23.3%)     79 (22.5%)
My head hurt.                   86 (19.7%)     69 (19.7%)
I kept getting stuck.           81 (18.5%)     63 (17.9%)
My hands were shaky.            49 (11.2%)     29 (8.3%)
My face was red.                34 (7.8%)      17 (4.8%)


* Time 1 N = 437 students with 2 nonresponses (percentages are of valid responses).

** Time 2 N = 351 students with 2 nonresponses (percentages are of valid responses).


Individual differences in self-monitoring reports. We were interested in how self-monitoring reports might differ among students. Student self-monitoring reports completed at midyear (N = 439) were subjected to exploratory factor analysis using the same procedures previously described. The first five factors were retained and together accounted for 47% of the variance in student reports (see Table 4). In order of magnitude, the factors were: Anxious and Withdrawn; Good Worker; Engaged Learner; Disengaged and Distracting; and Struggling and Persistent. The three example items for each factor in Table 4 are those with the highest loadings. These factor analyses provided individual student scores for each of the retained factors. The factors, like those for the classroom-level analyses, are standardized variables used in subsequent analyses. Differences in student self-monitoring factors suggest that there may be important differences in the quality and salience of student adaptation to instructional opportunities. These differences also likely enhance (e.g., Good Worker, Engaged Learner, Struggling and Persistent) or detract from (e.g., Anxious and Withdrawn, Disengaged and Distracting) the learning community.
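The extraction procedure described above can be sketched in simplified form. The code below is a toy stand-in, not the authors' actual analysis (their EFA software and rotation method are not specified here): it extracts the leading factors from the item correlation matrix by eigendecomposition and returns standardized per-student factor scores of the kind used in the subsequent analyses. All names are illustrative.

```python
import numpy as np

def extract_factors(responses, n_factors=5):
    """Toy exploratory factor extraction via eigendecomposition of the
    item correlation matrix (a simplified stand-in for the article's
    unspecified EFA procedure). `responses` is students x items (0/1
    item endorsements).

    Returns (loadings, pct_variance, scores): unrotated item loadings
    on each retained factor, percent of total variance each factor
    accounts for, and standardized per-student factor scores.
    """
    X = np.asarray(responses, dtype=float)
    R = np.corrcoef(X, rowvar=False)           # item correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)       # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_factors]
    vals, vecs = eigvals[order], eigvecs[:, order]
    loadings = vecs * np.sqrt(vals)            # unrotated loadings
    pct_variance = 100.0 * vals / R.shape[0]   # % of total item variance
    # Crude factor scores: project standardized responses on the
    # loadings, then re-standardize (the "standardized variables"
    # carried into later analyses).
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    raw = Z @ loadings
    scores = (raw - raw.mean(axis=0)) / raw.std(axis=0)
    return loadings, pct_variance, scores
```

A proper replication would instead use a dedicated EFA routine with rotation; this sketch only illustrates the retain-top-factors, standardized-scores logic the text describes.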


Table 4. Individual Differences in Students’ Approach to Self-Monitoring at Time 1: Factor Labels, Sample Items, and Loadings


Self-Monitoring Time 1 Factors (% variance)   Items                    Loadings
1. Anxious and Withdrawn (16.34%)             My stomach felt funny.     .676
                                              My mouth was dry.          .597
                                              My head hurt.              .544
2. Good Worker (12.61%)                       I was listening.           .661
                                              I was working.             .602
                                              I was ready.               .580
3. Engaged Learner (6.79%)                    I was smiling.             .636
                                              I was helping.             .635
                                              I was getting help.        .496
4. Disengaged and Distracting (5.61%)         I was talking.             .812
                                              I was looking around.      .627
                                              I was tired.               .459
5. Struggling and Persistent (5.37%)          I kept getting stuck.      .756
                                              I was getting help.        .512
                                              My hands were shaky.       .431


N = 439 students.


Replication of self-monitoring factors. We were able to replicate self-monitoring reports with students (N = 353) in four of the five participating schools. Student reports obtained in late spring (Time 2) were subjected to the same exploratory factor analysis procedures as the Time 1 reports. Five factors emerged that together accounted for 50% of the variance in student reports (see Table 5). In rank order, the factors were: Good Student and Worker2; Anxious and Withdrawn2; Disengaged and Distracting2; Engaged Learner2; and Struggling and Persistent2. The original factors replicated with minor exceptions. First, the Good Worker factor included elaborations, suggesting a Good Student and Worker2 factor. Second, the factors differ in rank order, such that the Good Student and Worker2 factor accounted for a higher percentage of the variance in student reports at Time 2 than the Anxious and Withdrawn2 factor. There also were some item loading differences between Times 1 and 2; we note these differences by adding a 2 to the Time 2 factor labels.


Table 5. Individual Differences in Students’ Approach to Self-Monitoring at Time 2: Factor Labels, Sample Items, and Loadings


Self-Monitoring Time 2 Factors (% variance)   Items                    Loadings
1. Good Student & Worker2 (18.32%)            I was listening.           .755
                                              I was doing my part.       .736
                                              I was working.             .640
2. Anxious & Withdrawn2 (13.11%)              My mouth was dry.          .683
                                              My stomach felt funny.     .661
                                              My head hurt.              .613
3. Disengaged & Distracting2 (6.68%)          I was looking around.      .726
                                              I was talking.             .641
                                              I was tired.               .446
4. Engaged Learner2 (6.31%)                   I was smiling.             .670
                                              I was watching.            .462
                                              I was into it.             .415
5. Struggling & Persistent2 (5.20%)           I was getting help.        .781
                                              I kept getting stuck.      .585
                                              I was helping.             .416

N = 353 students.


Stability of self-monitoring factors. The exploratory factor analyses of midyear and late spring self-monitoring reports provide a snapshot of how students, as a group, appear to differ in their adaptation to instructional opportunities. We now ask, are these differences individually stable? The self-monitoring factor scores of students who participated at both Times 1 and 2 (N = 309) were subjected to correlational procedures. Results displayed in Table 6 suggest some school-year stability in student self-monitoring reports, especially among the less optimal Anxious and Withdrawn (r = .46, p = .01) and Disengaged and Distracting (r = .46, p = .01) students.
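The stability estimates in Table 6 are, in essence, Pearson correlations pairing each student's Time 1 factor score with the corresponding Time 2 score, restricted to students who completed both reports. A minimal sketch under that assumption (data and layout are hypothetical; the significance tests the table reports would come from a routine such as scipy.stats.pearsonr):

```python
import numpy as np

def stability(time1_scores, time2_scores):
    """Pearson r pairing each Time 1 factor with its Time 2 counterpart,
    for students with both reports (listwise deletion of missing rows).
    Inputs are students x factors arrays in matched factor order.
    """
    t1 = np.asarray(time1_scores, dtype=float)
    t2 = np.asarray(time2_scores, dtype=float)
    keep = ~np.isnan(t1).any(axis=1) & ~np.isnan(t2).any(axis=1)
    t1, t2 = t1[keep], t2[keep]
    # Standardize each factor column, then average the cross-products:
    # this is exactly the Pearson correlation for each factor pair.
    z1 = (t1 - t1.mean(axis=0)) / t1.std(axis=0)
    z2 = (t2 - t2.mean(axis=0)) / t2.std(axis=0)
    return (z1 * z2).mean(axis=0)
```

Off-diagonal cells of Table 6 (e.g., Time 1 Good Worker with Time 2 Engaged Learner2) would be computed the same way, just pairing different columns.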


Remaining stability indicators also are noteworthy. First, the correlations among Good Worker and Engaged Learner factors suggest a cluster of adaptations that mesh well with an achievement press, although we suspect that the differences between experiencing instructional opportunities as work related and learning related are not small. Second, there is some indication that students who struggle and persist and receive help at midyear continue to struggle and receive, and now give, help at year’s end. This seems an especially healthy adaptation to the frustration of learning difficulty and merits continued study (see also Rohrkemper & Corno, 1988).


In sum, our data suggest that students differentially participate in—adapt to—classroom opportunities and relationships. Stability analyses suggest that these adaptations may play a part in students’ resolution of the “industry v inferiority” crisis that informs a basic optimism about one’s competence (Erikson, 1968). We wanted to explore these dynamics further. Specifically, we wondered about the part that “readiness” might play in student adaptations to classroom demands; in Erikson’s terms, how might competence inform the industry v inferiority crisis?


Table 6. Correlations Between Self-Monitoring Factors at Times 1 and 2


  

                                            Self-Monitoring Time 1
Self-Monitoring             Anxious &   Good     Engaged   Disengaged &   Struggling &
Time 2                      Withdrawn   Worker   Learner   Distracting    Persistent
Anxious & Withdrawn2     r     .47*       .02      -.01       -.04            .05
                      Sig.     .00        .72       .93        .45            .40
Good Student & Worker2   r    -.02        .36*      .18*      -.22*           .05
                      Sig.     .77        .00       .00        .00            .42
Engaged Learner2         r    -.16*       .13*      .16*      -.04            .06
                      Sig.     .01        .02       .01        .50            .27
Disengaged &             r     .08        .03      -.08        .46*          -.03
Distracting2          Sig.     .16        .57       .15        .00            .56
Struggling &             r    -.01       -.05       .14*       .01            .29*
Persistent2           Sig.     .85        .41       .03        .83            .00


N = 306 students.


SELF-MONITORING AND STUDENT PERFORMANCE


Self-monitoring factors suggest differential student adaptation to the demands of classroom learning; however, student self-monitoring reports may also represent the press of differential readiness to profit from classroom learning opportunities. We examined how student self-monitoring might inform the press between the demands of classroom learning and student readiness in two ways. First, we attempted to represent one type of student performance that teachers routinely assess as part of their classroom practices. We reasoned that teachers’ feedback on classroom assignments may be a salient component of their co-regulation of student self-awareness, learning, and motivation. We designed a writing assignment, story writing, similar to those used by teachers as part of their instruction in language arts.


In late spring, after standardized testing, a subgroup of participating students (N = 330) wrote stories (N = 753) in response to each of (up to) three pictures of routine classroom events (students in a small group, a student working alone, a student with a teacher). We were able to score these stories for story development, vocabulary use, and basic spelling and grammar skills. Our goal was to capture a continuum of student performance that distinguished thinking/elaboration (story development), vocabulary acquisition and use (word count), and basic facts and skills (grammar, spelling). (See Bozack & Hummel, 2005, for detailed presentation of coding system development, training, and reliability procedures.) We then averaged each student’s scores for development, word count, number of basic skills errors, and basic skills errors/word count ratio across stories. Each averaged score was correlated with student self-monitoring report, which was obtained after completion of the writing assignment. Students again were instructed to describe themselves in their class that day prior to study participation.
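The per-student averaging step described above can be sketched as follows. The record layout and field names are illustrative assumptions, not the authors' coding system (see Bozack & Hummel, 2005, for that): each story contributes one record, and scores are averaged across a student's (up to three) stories.

```python
from collections import defaultdict
from statistics import mean

def average_story_scores(stories):
    """Average each student's story scores across their stories, as
    described for development, word count, basic skills errors, and the
    errors/word-count ratio. `stories` is a list of dicts, one per
    story, each with a 'student' id plus numeric score fields (names
    here are illustrative, not the authors' actual variable names).
    """
    by_student = defaultdict(list)
    for story in stories:
        by_student[story["student"]].append(story)
    fields = ("development", "word_count", "errors", "error_ratio")
    return {
        sid: {f: mean(s[f] for s in group) for f in fields}
        for sid, group in by_student.items()
    }
```

The averaged scores would then be correlated with the same student's self-monitoring factor scores, as in Table 7.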


Second, we linked student self-monitoring at midyear (N = 439), aggregated at the grade level (N = 14), with performance on the SAT9 in the spring. We reasoned that student self-monitoring reports at midyear were more representative of typical student classroom participation than reports obtained in late spring, after standardized testing. We used grade-level within-school analyses because we did not have access to SAT9 results at the individual student level. Thus, we cannot directly answer the question of the co-regulation of instructional opportunity, personal readiness, and student adaptation to classroom (i.e., teacher assignments) and sociocultural (i.e., performance on mandated tests) demands with our data. However, we can explore these possibilities in ways that may inform future efforts to link them.


Self-monitoring (Time 2) and writing assessments. Table 7 displays consistent, albeit quite modest, significant relationships between students’ self-monitoring and quality of writing assessment. Namely, Anxious and Withdrawn was associated with basic skills errors, and Good Student and Worker with higher story development, higher word counts (at p = .08), and lower basic skills errors/word count ratio. Engaged Learner was associated with higher story development and higher basic skills errors/word count ratio, and Disengaged and Distracting with lower story development. Finally, Struggling and Persistent was associated with lower story development yet higher word counts. In sum, student self-monitoring reports modestly yet coherently inform their performance on classroom-like tasks.


Table 7. Correlations of Story Codes With Self-Monitoring Factors at Time 2


  

                                            Self-Monitoring Time 2
                            Anxious &    Good Student   Engaged    Disengaged &   Struggling &
Story Variables             Withdrawn2   & Worker2      Learner2   Distracting2   Persistent2
Development—overall      r     -.08          .13*          .14*       -.13*          -.13*
                      Sig.      .15          .03           .02         .03            .03
Word count—overall       r     -.04          .10          -.07         .05            .19*
                      Sig.      .52          .08           .22         .35            .00
Error count—overall      r      .20*        -.09           .05        -.05           -.07
                      Sig.      .00          .12           .35         .35            .22
Ratio of errors/         r      .09         -.13*          .15*       -.07            .04
words—overall         Sig.      .12          .02           .01         .25            .51


N = 306 students.


Self-monitoring and performance on SAT9 subtests. Student performance (by grade level) on SAT9 subtests was significantly related to three student self-monitoring factors (see Table 8). The Anxious and Withdrawn factor was associated with lower student performance on each SAT9 subtest (r = -.55 language, r = -.73 math, r = -.68 reading). In contrast, the Struggling and Persistent factor was associated with higher student performance on each of the SAT9 subtests (r = .56 language, r = .66 math, r = .68 reading). Disengaged and Distracting was associated with higher performance on the SAT9 math (r = .61) and reading (r = .56) subtests. These data are aggregated at grade level; however, they do suggest that it is important to maintain a conception of personal readiness in our understanding of student adaptation to the press of classroom demands and opportunities.


It is reasonable, for example, to ask to what extent student adaptations are linked to challenge/skill ratios. Most notably, is it possible that anxious and withdrawn students are being asked to function beyond their level of readiness? Are disengaged and distracting students presented with too little challenge? What might we learn from struggling and persistent students whose approach to the frustration of learning difficulty appears to pay off in the long run but not necessarily in the short term? Taken together, student self-monitoring reports and task and test performance relationships suggest that student self-monitoring reports may usefully inform teachers’ instructional decisions and practices.


Table 8. Correlations of Standardized Achievement Scores With Self-Monitoring Factors at Time 1


  

Self-Monitoring Time 1             SAT9 language   SAT9 math   SAT9 reading
Anxious & Withdrawn          r         -.55*          -.73*        -.68*
                          Sig.          .04            .00          .01
Good Worker                  r         -.29           -.14         -.01
                          Sig.          .31            .64          .96
Engaged Learner              r          .12           -.14         -.22
                          Sig.          .69            .64          .45
Disengaged & Distracting     r          .25            .61*         .56*
                          Sig.          .39            .02          .04
Struggling & Persistent      r          .56*           .66*         .68*
                          Sig.          .04            .01          .01


N = 14 grades within schools.


SELF-MONITORING AND INSTRUCTIONAL OPPORTUNITY


We consider student self-monitoring reports reasonable representations of student adaptation to classroom demands. We wondered how student adaptation related to instructional opportunity. Student midyear self-monitoring factors were correlated with instructional opportunity factors. Results are displayed in Table 9. Direct Instruction was significantly related to Good Worker (r = .36, p = .041) and was the only instructional opportunity factor significantly related to individual differences in student self-monitoring. Remaining patterns of instructional opportunity and self-monitoring relationships, however, suggest considerable complexity within a community of learners. For example, Review and Direct Instruction opportunities appear to be a good fit for Anxious and Withdrawn and Disengaged and Distracting students, yet the greater task ambiguity in the cognitive demands of Structured Problem-Solving present some difficulty for both, and potentially, in turn, for their teachers.


Struggling and Persistent students are of particular interest. The discrepancy between their performance on classroom-like tasks and the SAT9 subtests suggests a certain delay of gratification in their adaptation to difficult learning. The patterns associated with these students’ adaptations to instructional opportunity provide some indication of how their learning and persistence may be co-regulated. Struggling and Persistent students appear to resonate to both the cognitive challenges and teacher supports that comprise Guided Elaboration and the learning reinforcements of Review.


We have framed instructional opportunity as an array of teacher-centered instruction in which teachers align the cognitive demands of lessons and tasks with instructional strategies. An alternative conception might be instructional opportunities as an array of teacher adaptations to students to maintain their efforts to learn (Corno, 2008; T. Good, personal communication, October 2006). For example, teachers may well devote relatively more time to Direct Instruction and Review in part because two of the more “visible” differences in student adaptation difficulty, Anxious and Withdrawn and Disengaged and Distracting, appear to profit from them. An alternative perspective on the relationships between instructional opportunity and the Struggling and Persistent student, for example, might explore the extent to which teachers diagnostically alternate Guided Elaboration with Review to keep effortful yet struggling students on track in demanding learning. On track in their learning and on track in their motivation: Review clarifies and reinforces prior learning and prior learning attempts. Review reassures even as it provides a respite from the frustration of difficulty. How, when, and why teachers decide to alternate instructional approaches within teacher-centered instruction and how those changing opportunities are mediated by students (e.g., teacher is disappointed in us; this way takes too long to figure it out) seems an especially important area for research on co-regulation of classroom learning.


Table 9. Correlations of Instructional Opportunity Factors With Self-Monitoring Factors at Time 1


                                          Instructional Opportunity Factors
Self-Monitoring               Guided        Direct                   Structured
Time 1                        Elaboration   Instruction   Review     Problem-Solving
Anxious & Withdrawn         r     .17          -.18         -.23        .12
                         Sig.     .33           .31          .20        .51
Good Worker                 r    -.10           .36*        -.17       -.04
                         Sig.     .59           .04          .34        .82
Engaged Learner             r     .03          -.13         -.07        .06
                         Sig.     .87           .48          .69        .74
Disengaged & Distracting    r     .06          -.10         -.28        .18
                         Sig.     .73           .57          .12        .33
Struggling & Persistent     r     .21          -.07          .28       -.11
                         Sig.     .25           .68          .12        .56


N = 33 classrooms.


CLOSING COMMENTS


We have reflected on our findings throughout the results section; thus, we do not review or summarize their implications for research on or enactment of co-regulation of classroom practices here. Instead, we attend to the utility of a co-regulation perspective for understanding these data and framing their implications.


First, the model of co-regulated learning (McCaslin, 2009) posits that learners (1) are social by nature (biological adaptations) and by nurture (socialization) (Baumeister & Leary, 1995; Geary, 2002; Olson & McCaslin, 2008), (2) have basic needs for participation and validation that can inform student dispositions toward school (McCaslin & Burross, 2008), and (3) differ in how and in what they participate—in adaptation. Second, in the model of co-regulated learning, individual differences among students that are expressed in classrooms are a function of the relationships and multiple presses among cultural, social, and personal sources of influence. Briefly, personal sources are framed as one’s potential, social sources are practicable opportunities and relationships that can and do influence daily experiences, and cultural sources are norms and challenges that inform probable outcomes. In this study, students were challenged by family poverty and mobility; cultural beliefs about them are evident in the Title I definition of poverty students as “low ability” learners. Students attended participating schools that were engaged in school reform more and less successfully, as evaluated by the state. Thus, cultural expectations for these students and the sociocultural regulation of their schools cohere in the message: (You do not) measure up (McCaslin & Good, 2008).


In a co-regulation perspective, student participation and validation in classrooms inform their adaptive learning and their identity. Thus, the sociocultural context of the classroom matters (see Lopez, 2008). In this study, however, we focused particular attention on the relationships among the social and personal sources of influences expressed in classrooms. We asked, within a community of learners in which students are observed to be similarly on task most of the time, do students differ in how they participate in what opportunities? We found that student participation is informed by consideration of individual differences in student adaptation to different classroom demands (which we represented with an array of teacher-centered instructional opportunities, classroom-like tasks, and annual standardized test performance). Differences among students are important to recognize and respond to—deliberately co-regulate—if the opportunities and relationships in classrooms—the practicable—that can support and promote student adaptive learning potential are to be realized.


The practicable opportunities in our study that most aligned with cultural demands for improving these students’ performance on mandated tests was a basic form of direct instruction. Direct Instruction was positively associated with most student self-monitoring factors; it was significantly related to Good Worker. This student adaptation was manifested positively in classroom-like tasks; however, Good Worker was not positively associated with student performance on the SAT9 subtests. This suggests that Direct Instruction casts a wide safety net, including students who are and who are not yet ready to profit from this mode of instruction as expressed by mandated test performance. Students who are not ready for culturally mandated performances are nonetheless acquiring desirable adaptations to learning challenges that are co-regulated by Direct Instruction opportunities.


Unfortunately, these students and their peers remain largely invisible to sociocultural policy makers who portray them in negative, monolithic ways as uninvested in, if not resistant to, school learning (e.g., National Commission on Excellence in Education, 1983; NCLB, 2002; Tomlinson, 1993). Our study indicates that, at least in Grades 3–5, this portrayal is not accurate. Individual differences in student self-monitoring reports reveal ways in which student adaptations make school personally meaningful. Our findings inform three questions that highlight the relationships and press among personal, social, and cultural influences on classrooms that we think are important to study. First, how long can and will students continue to positively participate in and adapt to classroom demands without sociocultural validation of that participation? Second, how might teachers adapt instructional opportunities to strategically co-regulate and validate the adaptive and curricular learning of students who differ in their adaptation to classroom demands? Third, how might citizens—for example, students, parents, educators, and employers—organize to participate in cultural definitions of the learners these students are and wish to become? Two sets of cultural beliefs and regulations seem especially important to examine. First are those that equate student performance on mandated tests with meaningful learning, a prepared future citizenry, and the effectiveness of the public school. Second are practices that ignore, if not deny, the role that resources, and thus funding policies, play in the equality of educational opportunity to learn.


Finally, the co-regulation model of emergent identity has considerable intellectual debts to acknowledge. First, sociocultural theory that stresses the social origins of higher psychological processes, most notably that espoused by Vygotsky (1962, 1978) and elaborated by Wertsch (1985), has had a major impact on the development of the co-regulation model and the formation of individual differences within communities of learners. Second, the influences of the Carroll Model of School Learning and equality of educational opportunity to learn (Carroll, 1963) and the findings of those who have conducted research on teacher-centered instruction in the elementary grades (e.g., Berliner, Fisher, Filby, & Marliave, 1978; Brophy & Good, 1986; Gage, 1978) are evident in our representation of the practicable in classroom opportunities. Third, we hope that our perspective, which attends as well to the multiple presses on and co-regulation of student adaptations, has contributed to understanding and appreciation of opportunity and its essential role in the promotion of adaptive learning and the realization of student potential.


Acknowledgments


This research was supported by the U.S. Office of Educational Research and Improvement (OERI) Grant No. R306S000033. The authors take full responsibility for the work, and no endorsement from OERI should be assumed. We acknowledge and thank the many individuals who supported this effort, in particular, Tom Good, Amanda Bozack (p.k.a. Amanda Rabidue), Caroline R. H. Wiley (p.k.a. Caroline Hummel), Jizhi Zhang, and Rena Cuizan-Garcia. Thanks also to Barak Rosenshine, Paul Schutz, and two anonymous reviewers for their valuable critiques of earlier versions of this manuscript. We are indebted to the principals, teachers, and students who participated in this study; we hope our efforts prove worthy of their support.


References


Arnold, L. S. (2005). Enhancing student academic regulatory processes: A study of metacognitive knowledge monitoring, strategic enactment, and achievement. Unpublished doctoral dissertation, University of Sydney, Sydney, Australia.


Aspelmeier, J. E., & Pierce, T. W. (2009).  SPSS: A user-friendly approach. New York: Worth.


Ausubel, D., & Fitzgerald, D. (1961). Meaningful learning and retention: Interpersonal cognitive variables. Review of Educational Research, 31, 500–510.


Baumeister, R. F., & Leary, M. R. (1995). The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychological Bulletin, 117, 497–529.


Berliner, D., Fisher, C., Filby, N., & Marliave, R. (1978). Executive summary of Beginning Teacher Evaluation Study. San Francisco: Far West Laboratory.


Bozack, A. R., & Hummel, C. R. (2005). CSR student motivation story writing: Technical coding manual. Unpublished manuscript, University of Arizona, Tucson.


Brophy, J. E., & Good, T. L. (1986). Teacher behavior and student achievement. In M. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 328–375). New York: Macmillan.


Carroll, J. B. (1963). A model of school learning. Teachers College Record, 64, 723–733.


Corno, L. (2008). On teaching adaptively. Educational Psychologist, 43, 161–173.


Erikson, E. (1968). Identity: Youth and crisis. New York: Norton.


Gage, N. L. (1978). The scientific basis of the art of teaching. New York: Teachers College Press.


Gagne, R. M. (1985). The conditions of learning (4th ed.). New York: Holt, Rinehart and Winston.


Geary, D. C. (2002). Principles of evolutionary educational psychology. Learning and Individual Differences, 12, 317–345.


Lopez, F. (2008). Educational policy and scholastic competence among English language learners. Unpublished doctoral dissertation, University of Arizona, Tucson.


McCaslin, M. (2009). Co-regulation of student motivation and emergent identity. Educational Psychologist, 44, 137–146.


McCaslin, M., Bozack, A. R., Napoleon, L., Thomas, A., Vasquez, V., Wayman, V., et al. (2006). Self-regulated learning and classroom management: Theory, research, and considerations for classroom practice. In C. Evertson & C. Weinstein (Eds.), Handbook of classroom management: Research, practice and contemporary issues (pp. 223–252). Mahwah, NJ: Erlbaum.


McCaslin, M., & Burross, H. L. (2008). Student motivational dynamics. Teachers College Record, 110, 2452–2463.


McCaslin, M., & Good, T. L. (2005). Theoretical analysis and implementation: A study of Comprehensive School Reform programs in Arizona 2000–2005. Vol. II: Classroom practices and student motivational dynamics [Final report]. Unpublished manuscript, University of Arizona, Tucson.


McCaslin, M., & Good, T. L. (Eds.). (2008) School reform matters [Special issue]. Teachers College Record, 110(11).


McCaslin, M., Good, T. L., Nichols, S., Zhang, J., Wiley, C. R. H., Bozack, A. R., et al. (2006). Comprehensive school reform: An observational study of teaching in Grades 3 to 5. Elementary School Journal, 106, 313–331.


McCaslin, M., & Murdock, T. (1991). The emergent interaction of home and school in the development of students’ adaptive learning. In M. Maehr & P. Pintrich (Eds.), Advances in motivation and achievement (Vol. 7, pp. 213–259). Greenwich, CT: JAI Press.


McCaslin, M., Tuck, D., Wiard, A., Brown, B., LaPage, J., & Pyle, J. (1994). Gender composition and small-group learning in fourth-grade mathematics. Elementary School Journal, 94, 467–482.


National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Government Printing Office.


Neisser, U. (1982). Memory observed: Remembering in natural contexts. San Francisco: W. H. Freeman.


Nichols, S., McCaslin, M., & Good, T. L. (2003). The Comprehensive School Reform Classroom Observation System (CSRCOS). Unpublished manuscript, University of Arizona, Tucson.


No Child Left Behind Act of 2001, Pub. L. No. 107-110, 115 Stat. 1425 (2002).


Olson, A. M., & McCaslin, M. (2008). Students as social beings. In T. Good (Ed.), 21st century education: A reference handbook (Vol. 2, pp. 87–96). Thousand Oaks, CA: Sage.


Pianta, R. C., Belsky, J., Houts, R., & Morrison, F. (2007, March). Teaching: Opportunities to learn in America’s elementary classrooms. Science, 315(5820), 1795–1796.


Rohrkemper, M., & Corno, L. (1988). Success and failure on classroom tasks: Adaptive learning and classroom teaching. Elementary School Journal, 88, 299–312.


Rohrkemper, M., Slavin, R., & McCauley, K. (1983, April). Investigating students' perceptions of cognitive strategies as learning tools. Paper presented as part of the Students' Cognitive Processes in Learning from Teaching symposium at the annual meetings of the American Educational Research Association, Montreal, Quebec, Canada.


Rosenshine, B., & Stevens, R. (1986). Teaching functions. In M. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 376–391). New York: Macmillan.


Tobias, S., & Everson, H. T. (1996). Assessing metacognitive knowledge monitoring. New York: College Board Publications.


Tomlinson, T. (Ed.). (1993). Hard work and higher expectations (National Society for the Study of Education Contemporary Educational Issues series). San Francisco: McCutchan.


Vygotsky, L. S. (1962). Thought and language. Cambridge, MA: MIT Press.


Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.


Wertsch, J. (Ed.). (1985). Culture, communication, and cognition: Vygotskian perspectives. New York: Cambridge University Press.


Wertsch, J., & Stone, C. (1985). The concept of internalization in Vygotsky’s account of the genesis of higher mental functions. In J. Wertsch (Ed.), Culture, communication, and cognition: Vygotskian perspectives (pp. 162–182). New York: Cambridge University Press.


Winne, P. H. (2001). Self-regulated learning viewed from models of information processing. In B. J. Zimmerman & D. H. Schunk (Eds.), Self-regulated learning and academic achievement: Theoretical perspectives (2nd ed., pp. 153–190). Mahwah, NJ: Erlbaum.


Zimmerman, B. J., & Schunk, D. H. (Eds.). (1989). Self-regulated learning and academic achievement: Theoretical perspectives. New York: Springer-Verlag.


Zimmerman, B. J., & Schunk, D. H. (Eds.). (2001). Self-regulated learning and academic achievement: Theoretical perspectives (2nd ed.). Mahwah, NJ: Erlbaum.




Cite This Article as: Teachers College Record Volume 113 Number 2, 2011, p. 325-349
https://www.tcrecord.org ID Number: 15979, Date Accessed: 12/4/2021 8:40:15 PM
