
What Teachers Know and Do About Assessing Students’ Self-Regulated Learning

by Tova Michalsky - 2017

To foster their students’ self-regulated learning (SRL), teachers’ lesson goals should include not only SRL teaching but also assessment of students’ SRL behaviors/processes to verify whether teachers achieved their goals. This article presents a study that sought to examine teachers’ knowledge about SRL assessment, their actual assessment of SRL in their classrooms, and the predictive value of teachers’ knowledge for their behavior. High school teachers in Israel (N = 236) completed self-report questionnaires on knowledge and behavior for implementing SRL assessment in their classrooms. Results indicated that teachers knew more about offline assessment of SRL aptitude than about online or offline assessments of SRL events. Not all SRL assessment methods were well known to all teachers. Furthermore, teachers’ knowledge differed from their self-reported behavior in assessing their students’ SRL. Findings indicated how teacher education could support teachers in learning to assess SRL effectively, in line with clear recommendations in the empirical literature regarding the value of assessment alongside curriculum and pedagogy.

Research on the promotion of self-regulated learning (SRL) has revealed that students can learn how to self-regulate their learning, but that teachers produce weaker SRL training effects than do researchers, for students in primary school (e.g., Dignath & Büttner, 2008) and in secondary school (e.g., Michalsky, 2013, 2014). Studies of teachers’ attempts to foster students’ SRL have shown that, on the one hand, teachers do tend to arrange learning environments that promote students’ freedom to self-regulate, and they do teach SRL strategies (mostly implicitly) during their lessons; on the other hand, in most cases, teachers do not assess how students actually use the offered SRL strategies or how students handle those strategies in different learning environments (see Bolhuis & Voeten, 2001; Dignath & Büttner, 2008). Both the instruction of SRL and the assessment of student SRL were found to predict students’ development of self-regulation (Hoskins & Fredriksson, 2008; Organisation for Economic Co-operation and Development [OECD], 2014). Researchers (e.g., Black & Wiliam, 2009; Taras, 2002) have emphasized the interdependence of assessment, curriculum, and pedagogy, asserting that evaluation and analysis of the frequency and quality of students’ learning behaviors (i.e., ongoing assessments) can help to inform future teaching, increase instruction’s effectiveness, and create more valuable learning experiences (Antoniou & Kyriakides, 2013; Bennett, 2011). Thus, assessment helps guide teachers in their instruction so that they can adjust their planning to better identify what children need to learn (James & Pollard, 2011).

The apparent gap between teachers’ efforts to engage in SRL-promoting instruction and their infrequent implementation of SRL assessments (e.g., Bolhuis & Voeten, 2001; Dignath & Büttner, 2008) calls for further empirical scrutiny. Do teachers lack sufficient knowledge about how to measure their students’ self-regulation, or do they simply choose not to perform such assessments, perhaps because such activity is not deemed important? The current article presents a research study propelled by these questions, with the goal of investigating the relations between teachers’ knowledge about SRL assessment and teachers’ actual experiences in assessing self-regulation among their students.


Assessment is defined as “the process of gathering, recording, interpreting, using and reporting information about a student’s progress and achievements in developing knowledge, skills and attitudes” (National Council for Curriculum and Assessment [NCCA], 2007, p. 7). Student assessment has become central to efforts seeking to impact and improve students’ learning in schools (Crossouard, 2011), specifically in high school contexts (Noonan & Duncan, 2005). Assessment is recognized as a key part of the teaching and learning cycle in education contexts (Lonka, Olkinuora, & Mäkinen, 2004; Shepard, 2000; Wiliam, Lee, Harrison, & Black, 2004; Wright & van der Mars, 2004), where it can be used as a measure of accountability to inform children, parents, colleagues, and other stakeholders as to the appropriateness and effectiveness of an education program or work unit (Guskey, 2015).

Involving a variety of practices, assessments range from formative to summative techniques and may include considerations of assessment for learning and of learning (Black, 2005; Black, Harrison, Lee, Marshall, & Wiliam, 2003; Cousins, Bruttriss, Callander, & Rouse, 2004). Assessment for learning can occur at all stages of the learning process, where a teacher uses evidence on an ongoing basis to support teaching and learning. Assessment of learning is often separate from the teaching and learning process and falls within a measurement paradigm that focuses on more formal external examinations (Jones, 2010; Shepard, 2000). Formative assessment is defined as “frequent, interactive assessments of student progress and understanding to identify learning needs and adjust teaching appropriately” (Centre for Educational Research and Innovation [CERI], 2005, p. 21). At the high school level, a combination of formative and summative assessments is recommended, using a wide range of assessment strategies (NCCA, 2009), with specific guidelines provided on the use of teacher observation, teacher-designed tasks, and curriculum profiles in high school education (OECD, 2014). However, reviews of overall high school curriculum implementation (Jones, 2010; NCCA, 2009) have highlighted that teachers have difficulty finding time to conduct assessments in an overloaded curriculum and have particular difficulty in assessing practical skill areas like SRL.


To support students’ efforts to become lifelong self-regulated learners, valid instruments are needed, first to assess the self-regulation strategies that students use while learning (Boekaerts & Cascallar, 2006; OECD, 2014; Shepard, 2000), and second to assess teachers’ knowledge about such SRL instruments and how to use them (OECD, 2003, 2014). To offer support to teachers, the OECD (2014) published detailed guidelines on assessment, which highlighted the importance of using a range of assessment strategies. The extent to which high school teachers are currently using these SRL assessment guidelines is unclear at present. Although these guidelines were distributed to schools, there has been no formal national follow-up initiative to support the continuing professional development of teachers to use these guidelines in their teaching (NCCA, 2009; OECD, 2014). This seems to reflect Hall and Kavanagh’s (2002) criticism that the current curriculum assessment policy has been too reliant on both teacher knowledge of, and teacher willingness to use, good assessment procedures without the explicit, systematic teacher development program required to sustain it.

Following Pintrich (2000), the research study on assessment that is presented in the current article defined SRL as “an active, constructive process whereby learners set goals for their learning and attempt to monitor, regulate and control their cognition, motivation, and behaviour, guided and constrained by their goals and contextual features in the environment” (p. 453). There is considerable agreement about the importance of SRL, but there has been disagreement about how it can be operationalized and measured in a scientifically useful way (Alexander, 2008; Boekaerts & Corno, 2005; Zimmerman, 2000). As shown by Boekaerts and Corno’s (2005) review, the concept of SRL was initially viewed as a stable individual characteristic, resulting in decontextualized, traitlike measurements. In reaction to this static view of SRL as an aptitude, scholars began to develop new conceptualizations of SRL using a situated learning approach, in which SRL is viewed as a set of dynamic, context-dependent activities. Following this situated learning approach, additional qualitative and ecologically valid instruments have been developed to measure SRL in real time (Boekaerts & Cascallar, 2006; Boekaerts & Corno, 2005; Butler, Beckingham, & Lauscher, 2005; Cascallar, Boekaerts, & Costigan, 2006; Perry, 2002).

In current instruments, these two different operationalizations of the concept of SRL can still be recognized. Winne and Perry (2000) made a distinction between instruments that measure SRL as an aptitude (the decontextualized operationalization) and instruments that measure SRL as an event (the situated operationalization). An SRL event instrument describes the learner’s regulation activities during a specific task. When SRL is measured as an aptitude, a single measurement is used to identify a relatively enduring attribute of a person.

In addition to the distinction among instruments based on the operationalization of SRL, Van Hout-Wolters (2000) showed how instruments can also be divided into online and offline methods. This distinction concerns the moment at which SRL is measured: online methods measure SRL during the learning task, whereas offline methods measure SRL independently from, or directly after, a learning task.

Table 1 presents the classifications of instrument types mentioned in several meta-analyses (Boekaerts & Corno, 2005; Van Hout-Wolters, 2000; Van Hout-Wolters, Simons, & Volet, 2000; Winne & Perry, 2000), according to these distinctions. According to Boekaerts and Cascallar (2006), a combination of online and offline as well as aptitude and event instruments is essential to tap the various aspects of students’ developing skills in self-regulating their learning and motivation processes. A combination of different SRL assessment tools allows researchers and teachers to more fully capture what students think, feel, and undertake in order to steer and direct their learning and motivation in any given domain. Repeated administration of multiple tools over time also provides insight into how students’ attempts at self-regulation change as a function of (a) their own perceptions of progress in skill development, (b) their changing beliefs about learning and self-regulation in a given domain, and (c) their changing psychological needs.

Table 1. Classification of Different Instrument Types for Assessing SRL

Offline assessment of SRL as an aptitude:
General self-report questionnaires
General oral interviews
General teacher observation

Offline assessment of SRL as an event:
Stimulated recall interviews
Portfolios and diaries/logs
Task-based questionnaire or interview
Hypothetical task interview

Online assessment of SRL as an event:
Think-aloud methods
Eye-movement registration
Observation and video-registration of behavior
Performance assessment through concrete study tasks, situational manipulations, or error detection tasks
Computer trace log analysis

Note. Based on Boekaerts & Corno, 2005; Van Hout-Wolters, 2000; Van Hout-Wolters, Simons, & Volet, 2000; Winne & Perry, 2000.


Keeping in mind the paucity of prior research, this study aimed to shed light on high school teachers’ knowledge about SRL assessment, the teachers’ self-reported actual practices in assessing students’ SRL in their classrooms, and the associations between teacher knowledge and teacher behavior. Namely, this study undertook three aims: (1) Drawing on the results of prior SRL assessment research that investigated online assessments of SRL as an event and offline assessments of SRL as an event or an aptitude (e.g., Bolhuis & Voeten, 2001; Perry, Hutchinson, & Thauberger, 2008), the current study used mixed open-ended and closed instruments to explore teachers’ knowledge and attributions of importance to online/offline event/aptitude assessments. Thus, this study examined whether teachers would assign more importance to the online assessment of SRL as an event, to the offline assessment of SRL as an event, or to the offline assessment of SRL as an aptitude, when asked about their knowledge of effective SRL assessments. (2) This study also explored teachers’ self-reported actual implementation of those online/offline event/aptitude SRL assessment components in their classrooms (teacher behavior). (3) Finally, this study investigated whether teachers’ attributions of effectiveness to online/offline event/aptitude assessment components (teacher knowledge) would predict the teachers’ self-reported actual implementation of those SRL assessment components in their classrooms (teacher behavior).



Questionnaires were distributed to a convenience sample of 372 high schools in Israel and were completed by 236 teachers of Grades 10–12. The sample included 182 female participants and 54 male participants, yielding a slightly higher percentage of male teachers (22.9%) than the national percentage (18.5%). Teachers’ ages ranged from 28 to 57 years (M = 42.5), covering all possible age groups of high school teachers. Teachers’ work experience as high school teachers ranged from 0.5 to 40 years (M = 15.0). Teachers were told that they were participating in a study on SRL assessment knowledge and practice. The sample might therefore represent a group of teachers with a particular interest in the topic of SRL, although no information is available on whether all the teachers in the sample were highly motivated regarding SRL in general.


Teachers’ Self-Reported Knowledge About SRL Assessment

This questionnaire comprised two open-ended items followed by eight closed items. To avoid influencing teachers’ responses, teacher knowledge was first measured qualitatively using two open-ended questions without any given answer categories. The first question tapping teachers’ knowledge about SRL assessment, “What is the best way to assess the learning to learn behaviors of students? Why?” (Lonka et al., 2004), used the “learning to learn” term because it encompasses the concept of SRL (e.g., Hofer & Yu, 2003; Winne, 1997) but is more familiar to practitioners than the SRL term. The second question, “How do you define the phrase: assessment of learning to learn?” aimed to capture teachers’ knowledge (conceptions and misconceptions) about SRL assessment. Teachers responded in writing to these two open-ended items, and all responses were transcribed and coded for data analysis. For the first item on the optimal assessment methods, we developed a coding scheme based on Boekaerts and Cascallar’s (2006) online/offline aptitude/event model of SRL assessment. Thus, responses could be coded as an online event assessment, an offline event assessment, or an offline aptitude assessment. For the second item on the definition of SRL assessments, we coded teachers’ responses into two categories: assessments that focused on students’ SRL strategies, or assessments that focused on students’ domain-related learning achievements.

Teachers’ knowledge about SRL assessments was next measured quantitatively using an eight-item scale based on Boekaerts and Cascallar’s (2006) online/offline aptitude/event instruments for SRL assessment. Each item (e.g., “When learning to learn is assessed, this is done during the learning task by asking the students interview questions about their learning strategy”) was rated on a 5-point Likert scale ranging from “Do not agree at all” (1) to “Agree very much” (5). The reliability for these eight items on SRL assessment knowledge was α = .76.
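The reliability coefficient reported here (α = .76) is a Cronbach’s alpha, which relates the variance of the summed scale to the variances of its individual items. The following minimal sketch shows how such a coefficient can be computed; the rating matrix is hypothetical illustration data, not the study’s data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 30 teachers x 8 Likert items (1-5), sharing a common
# underlying level plus noise, so the items correlate positively.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(30, 1)).astype(float)
scores = np.clip(base + rng.normal(0, 0.7, size=(30, 8)), 1, 5)
print(round(cronbach_alpha(scores), 2))
```

Items that all measure the same construct drive the total-score variance above the sum of item variances, pushing alpha toward 1; a value of .76 is conventionally regarded as acceptable internal consistency for a research scale.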

Teachers’ Self-Reported Actual Assessment of Students’ SRL

As above, to avoid influencing teachers’ responses, teacher behavior was measured qualitatively using an open-ended question without any given answer categories. This open-ended item, “What do you do in order to assess your students’ learning to learn in your classrooms?” was coded using the above coding scheme. Thus, responses could be coded as an online event assessment, an offline event assessment, or an offline aptitude assessment. For example, the following teacher response was coded as an offline assessment of SRL as an aptitude: “At the beginning of a learning task, I ask my students if and how they plan to work on the task.”



The current study’s first research question pertained to teachers’ attributions of importance to the online or offline, and event or aptitude, assessments of SRL. Regarding the qualitative findings on teacher knowledge, 204 teachers responded to the open-ended item on the best way to assess SRL (32 teachers had missing data). Of these, 153 teachers (75.0%) highlighted aspects of SRL aptitude assessments, such as using SRL questionnaires or interviewing students, whereas 34 teachers (16.7%) highlighted characteristics of SRL event assessments, such as recording and analyzing students’ work in groups or teacher observations. In addition, 17 teachers (8.3%) highlighted characteristics of both SRL aptitude and SRL event assessments. With regard to the online versus offline differentiation, 127 teachers (62.0%) highlighted only aspects of offline SRL assessments, such as using portfolios and diaries/logs, whereas 24 teachers (12.7%) highlighted only characteristics of online SRL assessments, such as think-aloud methods. In addition, 12 teachers (5.8%) highlighted characteristics of both offline and online SRL assessments.

Regarding the second open-ended item asking teachers to define assessment of learning to learn, 148 teachers (72.5%) focused on assessing students’ SRL strategies (e.g., “To measure how students plan to begin their work on a task: Do they have a very detailed plan?” or “. . . whether students regulate their time while learning”), whereas 56 teachers (27.4%) focused on assessing domain-related learning achievements (e.g., “measuring how students succeed in the test” or asking students “to write down several strategies that could lead to high grades”).

The quantitative findings for the eight-item teacher knowledge scale pertaining to the effective assessment of SRL revealed teachers’ emphasis on characteristics of offline SRL aptitude assessments, as shown by the mean of 3.42 (SD = .36) on the 5-point rating scale for these four items, based on Boekaerts and Cascallar (2006). Descriptive statistics for the eight-item teacher knowledge scale can be found in Table 2.

Table 2. Teachers’ Knowledge Based on Boekaerts and Cascallar’s (2006) Online/Offline Event/Aptitude Assessment Model

Teachers’ knowledge subscale: M (SD)
Offline assessment of SRL as an aptitude: 3.42 (.36)
Offline assessment of SRL as an event: 2.12 (.41)
Online assessment of SRL as an event: 1.65 (.41)
The current study’s second research question pertained to teachers’ self-reported incorporation of assessment behaviors into their teaching. The qualitative findings revealed that 59 teachers (25.0%) reported that they did not assess SRL at all in their classrooms. Of the remaining 177 teachers who reported that they did assess SRL, 112 teachers (63.3%) mentioned only their incorporation of offline SRL aptitude assessments (e.g., self-report questionnaires, oral interviews), whereas 50 teachers (28.2%) reported also incorporating offline SRL event assessments (e.g., “I’m asking my students to organize portfolios and diaries/logs”). Only 15 teachers (8.5%) reported also incorporating online SRL event assessments in their classrooms.


The current study’s third research question investigated the predictive value of teacher knowledge for teacher behavior. Ordered logistic regressions were computed to analyze the predictive impact of knowledge about SRL assessments on teachers’ self-reported incorporation of SRL assessments into their classrooms. Ordered logistic regression was selected (Hardin & Hilbe, 2007) because the dependent variable (teacher behavior) was coded according to four response categories that reproduced ordered levels (no SRL assessment incorporated / offline SRL aptitude assessment only / offline SRL event assessment as well / online SRL event assessment as well). The regression analysis for each teacher knowledge predictor controlled for the other two knowledge predictors (e.g., the analysis for actual incorporation of offline SRL aptitude assessment in the classroom controlled for teachers’ knowledge about offline and online SRL event assessment). As seen in Table 3, these regression analyses revealed that only teachers’ knowledge about offline assessments of SRL as an aptitude predicted teachers’ assessment behaviors (B = 2.54, p < .05). Teachers’ knowledge about online and offline SRL event assessments did not predict teacher behavior significantly.
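An ordered logistic (proportional-odds) model of the kind described above treats the four behavior categories as ordered outcomes whose cumulative probabilities are logistic in the predictors. The sketch below is purely illustrative: it fits such a model by maximum likelihood on simulated data, with one invented “knowledge” predictor and a four-level outcome; it is not the study’s data, and the study’s analysis would have used a standard statistics package rather than this hand-rolled estimator.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic CDF

def fit_ordered_logit(X, y, n_cats):
    """Fit a proportional-odds model: P(y <= j | x) = expit(cut_j - x @ beta).

    X: (n, p) predictors; y: integer outcomes coded 0..n_cats-1.
    Returns (beta, cutpoints); cutpoints stay ordered by construction.
    """
    n, p = X.shape

    def unpack(params):
        beta = params[:p]
        raw = params[p:]
        # first cutpoint is free; later ones are forced upward via exp increments
        cuts = np.cumsum(np.concatenate([[raw[0]], np.exp(raw[1:])]))
        return beta, cuts

    def nll(params):
        beta, cuts = unpack(params)
        eta = X @ beta
        cum = expit(cuts[None, :] - eta[:, None])          # P(y <= j) for inner j
        cum = np.hstack([np.zeros((n, 1)), cum, np.ones((n, 1))])
        probs = cum[np.arange(n), y + 1] - cum[np.arange(n), y]
        return -np.sum(np.log(np.clip(probs, 1e-12, None)))

    res = minimize(nll, np.zeros(p + n_cats - 1), method="BFGS")
    return unpack(res.x)

# Simulated example: one "knowledge" score predicting a four-level ordered
# "assessment behavior" outcome (0 = none ... 3 = online event as well).
rng = np.random.default_rng(7)
X = rng.normal(size=(400, 1))
latent = 1.5 * X[:, 0] + rng.logistic(size=400)
y = np.digitize(latent, [-1.0, 0.0, 1.0])
beta, cuts = fit_ordered_logit(X, y, n_cats=4)
```

Under these simulation assumptions the fitted slope should land near the generating value of 1.5 and the cutpoints near the thresholds used to bin the latent variable; a positive slope corresponds to a positive B coefficient of the kind reported in Table 3.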

Table 3. Multiple-Ordered Logistic Regression of Teacher Knowledge as Predicting Teachers’ Actual Assessment Behavior

1. Teacher knowledge about offline SRL aptitude assessment (B = 2.54, p < .05)
2. Teacher knowledge about offline SRL event assessment (not significant)
3. Teacher knowledge about online SRL event assessment (not significant)
In line with the international call for valid instruments to assess students’ self-regulation strategies while learning (Boekaerts & Cascallar, 2006; OECD, 2014) and to assess teachers’ knowledge about such SRL instruments and how to use them (OECD, 2003, 2014), the study presented here examined high school teachers’ knowledge about SRL assessments, teachers’ self-reported incorporation of such assessments in their classrooms, and the predictive value of such knowledge for teachers’ actual implementation of assessments. The current qualitative and quantitative findings on teachers’ assessment knowledge demonstrated that teachers reported knowing more about offline assessments of SRL aptitude, which measure students’ traitlike abilities, than about contextualized assessments of SRL events, whether measured during tasks or separately. This finding coincides with prior research on teachers’ knowledge about SRL assessment, in which teachers suggested that students’ SRL should be assessed mainly as offline aptitude skills (e.g., Boekaerts & Cascallar, 2006; Bolhuis & Voeten, 2001).

The current results regarding teachers’ knowledge about assessment indicate that Israeli high school teachers have already accepted the idea of assessing SRL as a means to foster students’ self-regulation, thereby incorporating SRL assessment into their conceptions of teaching. However, most of the studied teachers reported knowledge about offline SRL aptitude assessments like general interviews or self-reports, whereas fewer teachers revealed knowledge about situated assessments of SRL events like offline stimulated recall or task-based interviews, or online think-aloud or eye-movement methods. These results are not surprising, considering that a number of teacher education frameworks address offline assessment (McNamara & Brown, 2009), but it is difficult to find more generic frameworks that can sufficiently guide teachers’ development of online, context-dependent assessments of SRL strategies.

With regard to teachers’ SRL assessment behaviors in actual classrooms, the current results indicated that teachers reported using offline SRL aptitude assessments more frequently than offline or online SRL event assessments. These self-reports substantiated previous classroom observation studies on SRL promotion. Prior research demonstrated that teachers do create learning environments that allow students to self-regulate; however, teachers mostly do not provide students with the necessary learning strategies (Bolhuis & Voeten, 2001; Dignath & Büttner, 2008).

Concerning the predictive value of teacher knowledge, the current regression analyses showed that only teachers’ knowledge about offline SRL aptitude assessments (while controlling for the other two predictors) served to predict teachers’ actual assessment behavior in the classroom. This finding is consistent with Lombaerts, Engels, and van Braak (2009), who also found that only teachers’ knowledge of offline assessment of SRL aptitude predicted teachers’ assessment practices.

An interesting point that requires further investigation is the gap found between the number of teachers who knew about SRL assessments and the number of teachers who reported actually implementing such SRL assessment practices (for aptitude: n = 153 vs. 112; for events: n = 53 vs. 36, respectively). Namely, although most teachers (75%) knew about aptitude assessments of SRL and viewed them as important, the majority of those who had such knowledge (53%) did not report integrating assessments of SRL aptitude into their actual teaching. The reasons for this incongruence require further empirical exploration, especially considering that greater knowledge did positively predict more implementation in practice. Perhaps teachers held misconceptions about how to administer such tools effectively to assess SRL. Limited or improper theoretical understanding among teachers could lead to frustration among students as well as among teachers about assessment of SRL (see Kramarski & Revach, 2009).


The presented study is subject to some limitations. The small sample size does not allow generalization to high school teachers at large. Moreover, the teachers in this sample who volunteered to participate were probably already motivated and interested in the topic, possibly representing teachers with greater knowledge about self-regulated and constructivist learning. Thus, assuming that the teachers in this sample may have been more motivated and interested than the average teacher, the current results indicating teachers’ inadequate SRL assessment knowledge suggest that the general population of high school teachers might reveal even greater knowledge deficiencies about offline SRL aptitude assessments, and especially about offline and online SRL event assessments.

Another study limitation is that teachers’ knowledge and behavior were measured by means of self-reports. Teachers might be subject to social desirability bias, wishing to convey more positive knowledge and behaviors than in reality. The current methodology attempted to reduce social desirability by asking open questions to assess teachers’ knowledge about assessing SRL; yet these qualitative items cannot provide ratings. Classroom observations could serve to obtain a more representative picture of teachers’ behaviors with regard to assessment practices for students’ self-regulation in the classroom.


This article has proposed that self-regulation in learning can be thought of as a key aspect of productive formative and summative assessments, particularly in relation to formative strategies for frequently and interactively clarifying, sharing, and understanding learning intentions and criteria for success, and for activating students as owners of their own learning. It may be suggested that the most productive way forward to promote the relations between formative assessment and SRL is to build on the strengths of each (the practical, grounded nature of formative assessment and the theoretical perspectives afforded by SRL) to generate productive conversations between practitioners and researchers.

The results of this study furnish some indication as to possible causes underlying teachers’ currently low rates of formative SRL assessment practices that use different methods. As such, the current findings offer a starting point for researchers and teacher educators in promoting teachers’ ability to assess SRL effectively. When looking at teachers’ conceptions of the assessment of self-regulation among their students, it becomes clear that the area of formative assessment of SRL has somehow escaped teachers’ minds (or never existed) as a parallel main lesson goal alongside the well-acknowledged goal of developing students’ autonomy to regulate their own learning. The results illustrate the need to explicitly inform teachers about the value of SRL assessment through theoretical and practical teacher training.

Concerning high school teachers, the current findings suggest that the problem lies not in teachers’ willingness to assess students’ self-regulation, but rather in their lack of knowledge about how to assess it effectively. This claim is supported by the fact that the one area where teachers revealed knowledge about SRL assessments (offline measures of SRL aptitude) played a crucial role in predicting teachers’ actual assessment of SRL in their classrooms. Helping teachers develop effective ways to integrate assessments of SRL events, too, into their teaching should start by creating awareness of the existence of both aptitude and event methods and of both online and offline measures for assessing self-regulation.

The current study generally revealed a positive picture of teachers’ reported knowledge and behavior regarding students’ SRL assessments. The question remains as to whether these reports genuinely reflect teachers’ knowledge, or may reflect teachers’ socially desirable notions about how they should think, thereby calling for further scrutiny using direct observations and other methodologies. In addition, when trying to change teachers’ practices for assessing SRL to improve the high school teaching and learning cycle, additional variables may affect teacher behavior beyond mere knowledge about promoting it (Dignath & Büttner, 2008). For example, teachers’ self-efficacy should be a focus of future research, inasmuch as teachers may appreciate the idea of assessing SRL, but if they do not feel capable of performing it, they will not integrate it into their teaching, no matter how well known the SRL assessment method. As Kramarski and Revach (2009) concluded, based on the teacher training that they conducted to help teachers integrate SRL into their classrooms, teachers’ self-efficacy might relate to their professional development, causing teachers to refrain from implementing what they learn during training. Future research should account for teachers’ self-efficacy and perceived behavioral control when further investigating the associations among teacher knowledge, beliefs, and behaviors with regard to fostering assessment of students’ self-regulation.

Furthermore, with regard to teacher training, one should keep in mind that teacher beliefs also influence the perception of new information (Schechter & Michalsky, 2014). Therefore, it is crucial to be aware of teachers’ knowledge (conceptions and misconceptions) when designing training programs. Moreover, training should take place as early as possible, for two reasons. First, preservice teachers at the beginning of their career start using traditional teaching methods that they know from their own schooling experiences (Askell-Williams, Lawson, & Skrzypiec, 2012), which likely did not incorporate systematic SRL promotion or assessment. Second, teachers at some point develop their own, potentially incorrect, beliefs regarding the assessment of self-regulation based on their teaching experiences. These beliefs may then start influencing their perceptions of new information. It is important to change teacher beliefs before this happens.


Alexander, P. A. (2008). Why this and why now? Introduction to the special issue on metacognition, self-regulation, and self-regulated learning. Educational Psychology Review, 20, 369372.

Antoniou, P., & Kyriakides, L. (2013). A dynamic integrated approach to teacher professional development: Impact and sustainability of the effects on improving teacher behavior and student outcomes. Teaching and Teacher Education, 29(1), 112.

Askell-Williams, H., Lawson, M. J., & Skrzypiec, G. (2012). Scaffolding cognitive and metacognitive strategy instruction in regular class lessons. Instructional Science, 40, 413443.

Azevedo, R., & Cromley, J. G. (2004). Does training on self-regulated learning facilitate students' learning with hypermedia? Journal of Educational Psychology, 96(3), 523535.

Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy and Practice, 18(1), 5-25.

Black, P. (2005). Assessment: Friend or foe? In A. Pollard & J. Collins (Eds.), Reflective teaching (pp. 287292). London, England: Continuum.

Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2003). Assessment for learning: Putting it into practice. Maidenhead, England: Open University Press.

Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5–31.

Boekaerts, M., & Cascallar, M. (2006). How far have we moved toward the integration of theory and practice in self-regulation? Educational Psychology Review, 18(3), 199–210.

Boekaerts, M., & Corno, L. (2005). Self-regulation in the classroom: A perspective on assessment and intervention. Applied Psychology, 54, 199–231.

Bolhuis, S., & Voeten, M. J. M. (2001). Toward self-directed learning in secondary schools: What do teachers do? Teaching and Teacher Education, 17(7), 837–855.

Butler, D. L., Beckingham, B., & Lauscher, H. J. N. (2005). Promoting strategic learning by eighth-grade students struggling in mathematics: A report of three case studies. Learning Disabilities Research & Practice, 20(3), 156–174.

Cascallar, E., Boekaerts, M., & Costigan, T. (2006). Assessment in the evaluation of self-regulation as a process. Educational Psychology, 18, 297–306.

Centre for Educational Research and Innovation. (2005). Formative assessment: Improving learning in secondary classrooms. Paris, France: OECD.

Cousins, L., Bruttriss, J., Callander, A., & Rouse, C. (2004). The assessment handbook. London, England: PFP Publishing.

Crossouard, B. (2011). Using formative assessment to support complex learning in conditions of social adversity. Assessment in Education: Principles, Policy & Practice, 18(1), 59–72.

Dignath, C., & Büttner, G. (2008). Components of fostering self-regulated learning among students: A meta-analysis on intervention studies at primary and secondary school level. Metacognition & Learning, 3, 231–264.

Guskey, T. R. (2015). On your mark: Challenging the conventions of grading and reporting. Bloomington, IN: Solution Tree.

Hall, K., & Kavanagh, V. (2002). Primary assessment in the Republic of Ireland. Educational Management Administration and Leadership, 30(3), 261–274.

Hardin, J., & Hilbe, J. (2007). Generalized linear models and extensions. College Station, TX: Stata Press.

Hofer, B. K., & Yu, S. L. (2003). Teaching self-regulated learning through a learning to learn course. Teaching of Psychology, 30(1), 30–33.

Hoskins, B., & Fredriksson, U. (2008). Learning to learn: What is it and can it be measured? European Commission Joint Research Centre (JRC), Institute for the Protection and Security of the Citizen, Centre for Research on Lifelong Learning (CRELL). Luxembourg: Office for Official Publications of the European Communities.

James, M., & Pollard, A. (2011). TLRP's ten principles for effective pedagogy: Rationale, development, evidence, argument and impact. Research Papers in Education, 26(3), 275–328.

Jones, J. (2010). The role of assessment for learning in the management of primary to secondary transition: Implications for language teachers. Language Learning Journal, 38(2), 175–191.

Kramarski, B., & Revach, T. (2009). The challenge of self-regulated learning in mathematics teachers' professional training. Educational Studies in Mathematics, 72, 379–399.

Lombaerts, K., Engels, N., & van Braak, J. (2009). Determinants of teachers' recognitions of self-regulated learning practices in elementary education. Journal of Educational Research, 102(3), 163–174.

Lonka, K., Olkinuora, E., & Mäkinen, J. (2004). Aspects and prospects of measuring studying and learning in higher education. Educational Psychology Review, 16, 301–323.

McNamara, J., & Brown, C. (2009). Assessment of online discussion in work-integrated learning. Campus-Wide Information Systems, 26(5), 413–423.

Michalsky, T. (2013). Integrating skills and wills instruction in self-regulated science text reading for secondary students. International Journal of Science Education, 13(11), 1846–1873.

Michalsky, T. (2014). Developing the SRL-PV assessment scheme: Preservice teachers' professional vision for teaching self-regulated learning. Studies in Educational Evaluation, 43, 214–229.

Michalsky, T. (in press). Promoting pre-service teachers' capacity to teach self-regulated learning through analysis of both teachers' and students' behavior. Computers and Education.

National Council for the Accreditation of Teacher Education. (2002). Professional standards for the accreditation of schools, colleges, and departments of education. Washington, DC: Author.

National Council for Curriculum and Assessment. (2007). Assessment in the primary school curriculum: Guidelines for schools. Dublin, Ireland: National Council for Curriculum and Assessment.

National Council for Curriculum and Assessment. (2009). Senior cycle towards learning: Listening to schools. Dublin, Ireland: National Council for Curriculum and Assessment.

Noonan, B., & Duncan, C. R. (2005). Peer and self-assessment in high schools. Practical Assessment Research & Evaluation, 10, 1–8.

Organisation for Economic Co-operation and Development. (2003). PISA literacy skills for the world of tomorrow: Further results from PISA 2000. Paris, France: Author.

Organisation for Economic Co-operation and Development. (2014). Education at a glance 2014: OECD indicators. Paris, France: Author. http://dx.doi.org/10.1787/eag-2014-en

Perry, N. E. (2002). Introduction: Using qualitative methods to enrich understandings of self-regulated learning. Educational Psychologist, 37(1), 1–3.

Perry, N. E., Hutchinson, L., & Thauberger, C. (2008). Talking about teaching self-regulated learning: Scaffolding student teachers' development and use of practices that promote self-regulated learning. International Journal of Educational Research, 47(2), 97–108.

Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 451–502). San Diego, CA: Academic.

Randi, J. (2004). Teachers as self-regulated learners. Teachers College Record, 106(9), 1825–1853.

Schechter, C., & Michalsky, T. (2014). Juggling our mindsets: Learning from success as a complementary instructional framework in teacher education. Teachers College Record, 116(2), 1–48.

Schraw, G. (2010). Measuring self-regulation in computer-based learning environments. Educational Psychologist, 45(4), 258–266.

Shepard, L. A. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4–14.

Taras, M. (2002). Using assessment for learning and learning from assessment. Assessment and Evaluation in Higher Education, 27(6), 501–510.

Van Hout Wolters, B. (2000). Assessing active self-directed learning. In P. R. J. Simons, J. L. v. d. Linden, & T. Duffy (Eds.), New learning (pp. 83–101). Dordrecht, The Netherlands: Kluwer Academic.

Van Hout Wolters, B., Simons, P. R. J., & Volet, S. (2000). Active learning: Self-directed learning and independent work. In P. R. J. Simons, J. L. v. d. Linden, & T. Duffy (Eds.), New learning (pp. 21–37). Dordrecht, The Netherlands: Kluwer Academic.

Veenman, M. V. J. (2005). The assessment of metacognitive skills: What can be learned from multi-method designs? In C. Artelt & B. Moschner (Eds.), Lernstrategien und Metakognition: Implikationen für Forschung und Praxis (pp. 77–99). Münster, Germany: Waxmann.

Wiliam, D., Lee, C., Harrison, C., & Black, P. (2004). Teachers developing assessment for learning: Impact on student achievement. Assessment in Education: Principles, Policy & Practice, 11(1), 49–65.

Winne, P. H. (1997). Experimenting to bootstrap self-regulated learning. Journal of Educational Psychology, 89(3), 397–410.

Winne, P. H., & Perry, N. E. (2000). Measuring self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 531–566). San Diego, CA: Academic Press.

Wright, M. T., & van der Mars, H. (2004). Blending assessment into instruction: Practical applications and meaningful results. Journal of Physical Education, Recreation, and Dance, 75, 29–34.

Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview. Educational Psychologist, 25, 3–17.

Zimmerman, B. J. (2000). Self-efficacy: An essential motive to learn. Contemporary Educational Psychology, 25, 82–91.

Cite This Article as: Teachers College Record, Volume 119, Number 13, 2017, pp. 1–16. https://www.tcrecord.org ID Number: 21914

About the Author
  • Tova Michalsky
    Bar-Ilan University
    E-mail Author
    TOVA MICHALSKY, PhD, is a senior lecturer serving as head of the Learning and Teaching Sciences Department in the School of Education at Bar-Ilan University in Israel. Her work combines several interrelated academic fields, drawing on her expertise as an educational researcher and her extensive professional experience in teacher education. She developed the innovative PfS method, a pedagogical framework for developing preservice teachers’ professional vision (PV) for self-regulated learning (SRL) during a preservice academic course entitled “SRL Teaching and Learning Methods.” During the course, preservice teachers receive explicit PV scaffolding—pop-up prompts clearly indicating both when and what to notice and reason about—while analyzing SRL-teaching events from video cases of expert teachers. Vignettes depict both direct SRL teaching (explicit and implicit strategy instruction) and indirect SRL teaching (powerful learning environments). Many websites have been developed in Israel based on the PfS method for teacher education. She has also collaborated with colleagues in various countries in designing PfS-based teacher education curricula and was awarded second prize in the 2015 USA Competition for Entrepreneurship and Innovation in Teacher Education. Relevant publications: Michalsky, T. (in press). Promoting pre-service teachers’ capacity to teach self-regulated learning through analysis of both teachers’ and students’ behavior. Computers and Education; and Michalsky, T. (2014). Developing the SRL-PV assessment scheme: Preservice teachers’ professional vision for teaching self-regulated learning. Studies in Educational Evaluation, 43, 214–229.